Audio Quest possibly caught in scam

But in practice, that is not the case. My job is to run uncompressed HD video signals around a show. If a cable is bad, it is not all or nothing. Very often we'll get the digital video "sparkles". Any good video tech knows what they look like and knows the cable is on the edge of failing. I suppose it's a high error rate, but I don't know for sure.

I used to keep a skinny, cheap little Toslink cable for demo purposes. It sounded awful! It was fun to show that "yes, even digital cables can make a difference." A pathologically bad cable.

Please don't infer from the above that I think a digital cable will change EQ or amplitude, such as in the Audio Quest fraud. It's just that a cable can cause problems before failing completely.
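
To get a feel for how few bit errors it takes to make visible sparkles, here's a rough sketch - the frame size and bit error rate are assumed numbers, purely for illustration:

```python
# Rough sketch: how many pixels a given bit error rate corrupts in one
# uncompressed 1080p frame. The BER value is an assumption for illustration.
import numpy as np

HEIGHT, WIDTH, CHANNELS = 1080, 1920, 3          # 8 bits per channel
BITS_PER_FRAME = HEIGHT * WIDTH * CHANNELS * 8
BER = 1e-6                                       # assumed bit error rate

rng = np.random.default_rng(0)
frame = np.zeros((HEIGHT, WIDTH, CHANNELS), dtype=np.uint8)

# Draw how many bit errors land in this frame, then flip random bits.
n_errors = rng.binomial(BITS_PER_FRAME, BER)
flat = frame.reshape(-1)
byte_idx = rng.integers(0, flat.size, size=n_errors)
bit_idx = rng.integers(0, 8, size=n_errors)
flat[byte_idx] ^= (1 << bit_idx).astype(np.uint8)

bad_pixels = np.count_nonzero(frame.any(axis=2))
print(f"{n_errors} bit errors -> roughly {bad_pixels} 'sparkle' pixels in a single frame")
```

Even at that "small" error rate you get dozens of corrupted pixels in every frame, 60 times a second.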
 
But in practice, that is not the case. My job is to run uncompressed HD video signals around a show. If a cable is bad, it is not all or nothing. Very often we'll get the digital video "sparkles".

Been there, done that. When this happens I can usually clear the picture up by reducing the screen format's resolution - fewer lines and fewer pixels per line.
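
Putting rough numbers on why the lower resolution helps - the pixel clocks below are the standard CEA-861 values, and the point is just the relative scale: lower resolution means a lower TMDS bit rate per lane, so a marginal cable gets more margin.

```python
# Back-of-envelope: approximate TMDS bit rate per lane for a few HDMI formats.
# Pixel clocks are the standard CEA-861 values; TMDS sends 10 bits per 8-bit value.
formats = {
    "1080p60": 148.5e6,   # pixel clock, Hz
    "1080p30": 74.25e6,
    "720p60":  74.25e6,
    "480p60":  27.0e6,
}

for name, pixel_clock in formats.items():
    bit_rate_per_lane = pixel_clock * 10
    print(f"{name:8s} pixel clock {pixel_clock/1e6:6.2f} MHz "
          f"-> {bit_rate_per_lane/1e9:5.3f} Gb/s per TMDS lane")
```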

Clearly visible "sparkles" would be one of the several forms of "not working".

The digital data is obviously pretty severely corrupted. If you measured or scoped the signal, its error rate would very likely be relatively huge.

A pathologically bad cable.

Yes.

The equipment at the receiving end, if properly engineered, could detect the digital errors and react appropriately. I suspect that its designers justify leaving out the error-detection hardware on the grounds that the performance is so obviously wrong when inspected by eye.
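
The principle is simple enough - here's a minimal sketch of the kind of check a receiver could apply, using a generic CRC-32 purely as an illustration rather than any particular video link's actual error handling:

```python
# Illustration of link-level error detection: append a CRC to each payload,
# flip one bit in transit, and show that the receiver can tell.
import zlib

def send(payload: bytes) -> bytes:
    crc = zlib.crc32(payload).to_bytes(4, "big")
    return payload + crc

def receive(packet: bytes) -> bool:
    payload, crc = packet[:-4], packet[-4:]
    return zlib.crc32(payload).to_bytes(4, "big") == crc

packet = send(b"one line of uncompressed video data")
corrupted = bytearray(packet)
corrupted[5] ^= 0x01                  # single bit error somewhere in the cable
print(receive(packet))                # True  - clean link
print(receive(bytes(corrupted)))      # False - error detected, receiver could flag or conceal it
```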

The usual claims that I've been referring to are that sound quality - spectral balance, or lack of noise and/or distortion - is greatly audibly improved, or that picture quality has improved brightness, contrast, or color quality.

What's the sound like when the picture is corrupted this way? I don't happen to know, because when I've seen this fault it was on PC video outputs and with PC video monitors that don't have sound.
 
Good question - we normally don't run audio embedded in the video so I can't say. Might be worth testing. HD-SDI can carry a lot of channels of audio along with the video.
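
For scale, assuming the usual 16 embedded channels at 48 kHz / 24-bit (the common SMPTE 299M case), the audio is a tiny slice of the 1.485 Gb/s HD-SDI payload:

```python
# Rough arithmetic: embedded audio vs. the HD-SDI link rate.
# Channel count and sample format are the common SMPTE 299M case; treat as an assumption.
channels, sample_rate, bits = 16, 48_000, 24
audio_bits_per_s = channels * sample_rate * bits
hd_sdi_rate = 1.485e9                      # SMPTE 292M nominal bit rate

print(f"audio: {audio_bits_per_s/1e6:.1f} Mb/s "
      f"({100 * audio_bits_per_s / hd_sdi_rate:.1f}% of the 1.485 Gb/s link)")
```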

I do see a lot of "No Signal" with bad digital cables. That will depend on the signal bandwidth, the cable and a lot on the receiver. It's easy to demonstrate that a cable working with one device will not work with another.
 
If you think a digital cable can change dynamics or the freq. response or any other analog parameter of audio, try running the digital signal through analog processors (filters, compressors, delays, etc.) and see what you get. It won't be "a tighter low end" or "a more detailed midrange".
 
I don't always see that, Scott. With marginal cables or connections we can see digital noise in the picture, sometimes subtle, sometimes not. It does not look like analog noise. There is picture (it works) but there is distinct digital noise and dropout.

I consider this obvious not working/data loss. A cable swap looking like a 10 degree turn on an old-fashioned tint knob, no.
 
Although the knee on losing the link completely is very steep and degradation is anything but graceful, under certain circumstances you can get right on the edge. But as you say, subtle it ain't.

Of course screening could affect how a cable responds in a hostile environment. And construction could affect how it handles being bent, twisted, or trodden on. But we all know that, and home audio is hardly a hostile environment.
 
I consider this obvious not working/data loss. A cable swap looking like a 10 degree turn on an old-fashioned tint knob, no.
OK, I was wondering if that's what you meant. For me that is not an obvious "not working" situation. For me that's "working with noise and errors." There does seem to be a steep cliff with digital. It can go from some sparkle errors to no signal lock at all abruptly. Not much room in between the two, as there is with analog.
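
That cliff shows up in the textbook error-rate curve too. A minimal sketch with an idealized binary-link-in-Gaussian-noise model - nothing specific to HDMI or SDI - shows how a few dB of lost margin takes a link from effectively error-free to unusable:

```python
# Illustration of the digital "cliff": bit error rate vs. signal-to-noise margin
# for an idealized binary link in Gaussian noise (not any specific video standard).
import math

def ber(eb_n0_db: float) -> float:
    eb_n0 = 10 ** (eb_n0_db / 10)
    return 0.5 * math.erfc(math.sqrt(eb_n0))

for snr_db in (12, 10, 8, 6, 4):
    print(f"Eb/N0 = {snr_db:2d} dB  ->  BER ~ {ber(snr_db):.1e}")
```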

Agree that phase, color, and luminance are not affected by digital errors; it just looks like a new type of noise. I looove digital video transmission. Makes my job so much easier.
 
Is modern video completely digital now? No D/A conversion? How does the process work, where [01011 digital code 01011] ends up as "turn pixel row 586 column 123 to R 255 G 255 B 255"? Surely at the end of the day there's an analog voltage applied to the liquid crystal in each pixel?
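
Roughly, yes: the timing controller hands each 8-bit code to a column-driver DAC, which picks a drive voltage off a gamma reference ladder. Here's a minimal sketch of that last step - the voltage range and gamma value are made-up illustrative numbers, not any real panel's spec:

```python
# Sketch of the final D/A step in an LCD column driver: 8-bit subpixel code ->
# analog drive voltage via an assumed gamma curve and voltage range.
# All numbers are illustrative assumptions, not a real panel's electrical spec.
V_MIN, V_MAX = 0.5, 4.5      # assumed drive-voltage range, volts
GAMMA = 2.2                  # assumed shape of the gamma reference ladder

def code_to_voltage(code: int) -> float:
    """Map an 8-bit subpixel code to an analog drive voltage."""
    level = (code / 255) ** (1 / GAMMA)          # assumed gamma shaping
    return V_MIN + level * (V_MAX - V_MIN)

for code in (0, 64, 128, 255):
    print(f"code {code:3d} -> {code_to_voltage(code):.2f} V on the subpixel")
```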

If we are agreed - and I hope we are agreed on this in 2016 - that jitter is a real and unwanted phenomenon in audio, then can noise-induced timing errors possibly affect the brightness output of a pixel on a screen?

I don't know, but I'll need a more convincing argument than "it's a digital signal so it can't make any difference" before I dismiss the possibility. I've heard that cry before with CD transports in the 1990's and even with CD players (!) in the 1980's.
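
On the jitter question, the worst case is easy to bound: a sine of amplitude A and frequency f slews at most at 2*pi*f*A, so a timing error t_j can displace a sample by at most 2*pi*f*A*t_j. A quick sketch with assumed numbers - a full-scale 20 kHz tone and a range of jitter values:

```python
# Worst-case sample error caused by clock jitter on a full-scale sine.
# Error <= slew rate * timing error = 2*pi*f*A * t_j. Jitter values are assumptions.
import math

A = 1.0          # full-scale amplitude
f = 20_000       # worst-case audio frequency, Hz

for t_j in (1e-12, 100e-12, 1e-9, 10e-9):
    err = 2 * math.pi * f * A * t_j
    err_db = 20 * math.log10(err / A)
    print(f"jitter {t_j*1e12:8.0f} ps -> max error {err_db:6.1f} dBFS")
```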

Jitter is nothing new, just overplayed in the Audio world, like a lot of scare mongering and other myths that appear.....
 
I currently have no way to test that, and doubt I'll be allowed to spend $7000 to buy a nice SDI analyzer. Would be mighty handy, tho. :up: Something like the Phabrix SxE generator and meter.

Seems like there may be other ways...

As I mentioned before, I've seen sparkles with some HDMI cables that I could correct by choosing a lower screen resolution.

It seems like an idea of the threshold could be obtained by measuring the amplitude of the HDMI signal when the resolution setting was showing sparkles, versus one where it was not. A voltmeter capable of responding at 500 MHz or higher would be required.
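
One way to size that instrument requirement, assuming the worst-case alternating bit pattern whose fundamental sits at half the per-lane bit rate:

```python
# Rough sizing of the measurement bandwidth: the fundamental of a 1010... TMDS
# pattern sits at half the per-lane bit rate (bit rate = 10x pixel clock).
for name, pixel_clock_mhz in (("720p60", 74.25), ("1080p60", 148.5)):
    bit_rate_mbps = pixel_clock_mhz * 10     # Mb/s per TMDS lane
    fundamental_mhz = bit_rate_mbps / 2
    print(f"{name}: ~{fundamental_mhz:.0f} MHz fundamental -> the meter needs at "
          f"least that much bandwidth")
```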
 