I came to the conclusion that a bad cable can influence a digital signal...
Of the handful of Toslink cables I keep around, my HTPC and DAC won't sync at 192/24 with a few. The pair works fine with the others. Cheap doesn't automatically correlate with good.
That's how digital is supposed to work - either it works perfectly or it doesn't work at all.
But in practice, that is not the case. My job is to run uncompressed HD video signals around a show. If a cable is bad, it is not all or nothing. Very often we'll get the digital video "sparkles". Any good video tech knows what they look like and knows the cable is on the edge of failing. I suppose it's a high error rate, but I don't know for sure.
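If you want to see why a high error rate reads as "sparkles" rather than analog snow, here's a minimal sketch (Python with NumPy, assuming uncompressed 8-bit RGB frames, nothing specific to any real link): each flipped bit corrupts a single pixel value, often in a high-order bit, so you get isolated bright or dark dots instead of gentle noise.

```python
import numpy as np

def add_bit_errors(frame: np.ndarray, ber: float, rng=None) -> np.ndarray:
    """Flip random bits in an 8-bit video frame at the given bit error rate."""
    rng = rng or np.random.default_rng()
    out = frame.copy()
    n_bits = out.size * 8
    n_errors = rng.binomial(n_bits, ber)       # how many bits actually flip
    idx = rng.integers(0, out.size, n_errors)  # which bytes are hit
    bit = rng.integers(0, 8, n_errors)         # which bit within each byte
    np.bitwise_xor.at(out.reshape(-1), idx, (1 << bit).astype(np.uint8))
    return out

# A flat grey 1080p frame; even a 1e-7 BER leaves a few visible "sparkles":
frame = np.full((1080, 1920, 3), 128, dtype=np.uint8)
hit = add_bit_errors(frame, ber=1e-7)
print("corrupted pixels:", np.count_nonzero((hit != frame).any(axis=2)))
# A flipped MSB swings a sample by 128 - a bright or dark dot, not analog hiss.
```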
I used to keep a skinny, cheap little Toslink cable for demo purposes. It sounded awful! It was fun to show that "yes, even digital cables can make a difference." A pathologically bad cable.
Please don't infer from the above that I think a digital cable will change EQ or amplitude, such as in the Audio Quest fraud. It's just that they can cause problems before failing completely.
If a cable is bad, it is not all or nothing.
We did an HDMI demo at a show where each of the four pairs used a different technology, some optical, some electrical. Sort of an equivalent to the ADSL-over-razor-wire demo that Alcatel did, or the mud-as-wires one, for that matter. Yes, it's either all or nothing.
But in practice, that is not the case. My job is to run uncompressed HD video signals around a show. If a cable is bad, it is not all or nothing. Very often we'll get the digital video "sparkles".
Been there, done that. When this happens I can usually clear the picture up by reducing the resolution of the screen format in terms of lines and pixels per line.
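The arithmetic backs that up: on HDMI each TMDS data lane runs at ten times the pixel clock (8b/10b coding), so a lower-resolution format directly lowers the bit rate the cable has to pass. A quick back-of-the-envelope check (Python; the pixel clocks are the standard CEA-861 values):

```python
# TMDS serial rate per data lane = pixel clock x 10 (8b/10b encoding).
modes = {                # standard pixel clocks in MHz
    "1080p60": 148.5,
    "1080i60/720p60": 74.25,
    "480p60": 27.0,
}
for name, clk_mhz in modes.items():
    print(f"{name:>16}: {clk_mhz * 10:.1f} Mbit/s per lane")
# Halving the pixel clock halves the bit rate, which is why a marginal
# cable that sparkles at 1080p can look clean at 720p.
```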
Clearly visible "sparkles" would be one of the several forms of "not working".
The digital data is obviously pretty severely corrupted. If you measured or scoped the signal, its error rate would very likely be relatively huge.
A pathologically bad cable.
Yes.
The equipment at the receiving end, if properly engineered, could detect the digital errors and react appropriately. I suspect that its designers justify not adding the hardware for detecting the digital errors on the grounds that the performance is so obviously wrong when inspected by eye.
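For what it's worth, the detection itself is cheap; HD-SDI already carries a per-line CRC for exactly this purpose. A rough sketch of the idea (Python; a generic CRC-16-CCITT, not the actual SMPTE polynomial):

```python
def crc16(data: bytes, poly: int = 0x1021, crc: int = 0xFFFF) -> int:
    """Bitwise CRC-16-CCITT - illustrative, not the SMPTE 292 polynomial."""
    for byte in data:
        crc ^= byte << 8
        for _ in range(8):
            crc = ((crc << 1) ^ poly) & 0xFFFF if crc & 0x8000 else (crc << 1) & 0xFFFF
    return crc

line = bytes(range(256))                 # one "video line" of sample data
sent = line + crc16(line).to_bytes(2, "big")

received = bytearray(sent)
received[10] ^= 0x80                     # a single flipped bit in transit

payload, tag = bytes(received[:-2]), bytes(received[-2:])
ok = crc16(payload).to_bytes(2, "big") == tag
print("line OK" if ok else "CRC mismatch - count it, conceal it, or flag it")
```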
The usual claims that I've been referring to are that sound quality (spectral balance, or lack of noise and/or distortion) is greatly and audibly improved, or that picture quality has improved brightness, contrast, or color.
What's the sound like when the picture is corrupted this way? I don't happen to know, because when I've seen this fault it was on PC video outputs, with PC monitors that don't have sound.
Good question - we normally don't run audio embedded in the video so I can't say. Might be worth testing. HD-SDI can carry a lot of channels of audio along with the video.
I do see a lot of "No Signal" with bad digital cables. That will depend on the signal bandwidth, the cable and a lot on the receiver. It's easy to demonstrate that a cable working with one device will not work with another.
I don't always see that, Scott. With marginal cables or connections we can see digital noise in the picture, sometimes subtle, sometimes not. It does not look like analog noise. There is picture (it works) but there is distinct digital noise and dropout.

Yes, it's either all or nothing.
If you think a digital cable can change dynamics or the frequency response or any other analog parameter of audio, try running the digital signal through analog processors (filters, compressors, delays, etc.) and see what you get. It won't be "a tighter low end" or "a more detailed midrange".
I don't always see that, Scott. With marginal cables or connections we can see digital noise in the picture, sometimes subtle, sometimes not. It does not look like analog noise. There is picture (it works) but there is distinct digital noise and dropout.
I consider this obvious not-working/data loss. A cable swap looking like a 10-degree turn on an old-fashioned tint knob, no.
Although the knee on losing the link completely is very steep and degradation is anything but graceful, under certain circumstances you can get right on the edge. But as you say, subtle it ain't.
Of course screening could affect how a cable responds in a hostile environment. And construction could affect how it handles being bent, twisted, or trodden on. But we all know that, and home audio is hardly hostile.
OK, I was wondering if that's what you meant. For me that is not an obvious "not working" situation. For me that's "working with noise and errors." There does seem to be a steep cliff with digital. It can go abruptly from some sparkle errors to no signal lock at all. Not much room in between the two, as there is with analog.
Agree that phase, color and luminance are not affected with digital errors, it just looks like a new type of noise. I looove digital video transmission. Makes my job so much easier.
You probably have 1dB between sparkles and a black screen.
You mean the Shannon SNR/eye pattern?
No, just C/I performance of the decoder. S/N becomes less useful in the digital domain, at least in the stuff I used to play with.
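To put numbers on that cliff: for an idealized binary link in additive Gaussian noise, BER = 0.5·erfc(√(Eb/N0)). A toy calculation (Python), nothing specific to SDI:

```python
import math

def ber(ebn0_db: float) -> float:
    """BER for ideal binary antipodal signalling in Gaussian noise."""
    ebn0 = 10 ** (ebn0_db / 10)
    return 0.5 * math.erfc(math.sqrt(ebn0))

for db in (8, 9, 10, 11, 12):
    print(f"Eb/N0 = {db:2d} dB -> BER = {ber(db):.1e}")
# Each extra dB buys roughly an order of magnitude once you're on the slope:
# sparkles at one level, an apparently perfect picture 1-2 dB higher.
```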
I currently have no way to test that, and doubt I'll be allowed to spend $7000 to buy a nice SDI analyzer. Would be mighty handy, tho.

I guess I am one of the "duped" consumers. I replaced generic HDMI cables with AQ Coffee-level HDMI cables. I noticed an immediate improvement in detail and color saturation. Didn't do any audio comparison. I don't want my money back.
I used a cheap 5 m HDMI cable for 4K from PC to a 43" monitor... works perfectly.
Is modern video completely digital now? No D/A conversion? How does the process work, where [01011 digital code 01011] ends up as "turn pixel row 586 column 123 to R 255 G 255 B 255"? Surely at the end of the day there's an analog voltage applied to the liquid crystal in each pixel?
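Something like this is what I imagine for that last step (purely illustrative Python; the voltage range and gamma handling are made-up numbers, not any real panel's column driver):

```python
def code_to_drive_voltage(code: int, v_min=0.5, v_max=4.5, gamma=2.2) -> float:
    """Hypothetical column-driver DAC: 8-bit code -> subpixel drive voltage.
    Range and gamma handling are illustrative only, not a real panel's."""
    target_luma = (code / 255) ** gamma           # display gamma: code -> relative luminance
    return v_min + target_luma * (v_max - v_min)  # pretend transmittance is linear in voltage

# "Turn pixel (586, 123) to R=255 G=255 B=255" ends up as three analog voltages:
print([round(code_to_drive_voltage(c), 2) for c in (255, 255, 255)])
```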
If we are agreed - and I hope we are agreed on this in 2016 - that jitter is a real and unwanted phenomenon in audio, then can noise-induced timing errors possibly affect the brightness output of a pixel on a screen?
I don't know, but I'll need a more convincing argument than "it's a digital signal so it can't make any difference" before I dismiss the possibility. I've heard that cry before with CD transports in the 1990s and even with CD players (!) in the 1980s.
Jitter is nothing new, just overplayed in the audio world, like a lot of the scaremongering and other myths that appear...
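The magnitudes are easy to check: for a full-scale sine at frequency f sampled with RMS clock jitter t_j, the jitter-limited SNR is -20·log10(2π·f·t_j). A quick look (Python):

```python
import math

def jitter_snr_db(freq_hz: float, jitter_s: float) -> float:
    """Jitter-limited SNR for a full-scale sine: -20*log10(2*pi*f*tj)."""
    return -20 * math.log10(2 * math.pi * freq_hz * jitter_s)

for tj in (1e-9, 100e-12, 10e-12):
    print(f"tj = {tj * 1e12:6.0f} ps -> SNR = {jitter_snr_db(20_000, tj):5.1f} dB at 20 kHz")
# Real but small: even 1 ns of jitter still allows ~78 dB at 20 kHz, and a
# reclocking receiver attenuates incoming jitter before it reaches the DAC.
```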
Digital signals can only travel so far before they degrade; most interfaces will state the distance over which they can be transmitted reliably. Eye diagrams clearly show up cable issues and excessive cable length...
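For anyone who hasn't seen one: an eye diagram is just the serial waveform cut into unit intervals and overlaid, and a long or lossy cable visibly closes the "eye". A rough simulation (Python with NumPy/Matplotlib, modelling the cable as a simple one-pole low-pass, a crude stand-in for real cable loss):

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
spb = 32                                   # samples per bit (unit interval)
bits = rng.integers(0, 2, 2000)
wave = np.repeat(bits * 2.0 - 1.0, spb)    # NRZ waveform, +/-1 V

# Model the cable as a one-pole low-pass: heavier filtering = longer/worse cable.
def lowpass(x, a):
    y = np.empty_like(x)
    acc = x[0]
    for i, v in enumerate(x):
        acc += a * (v - acc)
        y[i] = acc
    return y

rx = lowpass(wave, a=0.15)                 # try a=0.05 to watch the eye close

# Overlay two-bit slices to draw the eye.
seg = 2 * spb
n = len(rx) // seg
for s in rx[: n * seg].reshape(n, seg):
    plt.plot(s, color="tab:blue", alpha=0.05)
plt.title("Simulated eye diagram (one-pole cable model)")
plt.xlabel("samples (2 unit intervals)")
plt.ylabel("volts")
plt.show()
```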
I currently have no way to test that, and doubt I'll be allowed to spend $7000 to buy a nice SDI analyzer. Would be mighty handy, tho.

Something like the Phabrix SxE generator and meter.
Seems like there may be other ways...
As I mentioned before, I've seen sparkles with some HDMI cables that I could correct by choosing a lower screen resolution.
It seems like an idea of the threshold could be obtained by measuring the amplitude of the HDMI signal at a resolution setting that shows sparkles versus one that does not. A voltmeter capable of responding at 500 MHz or higher would be required.