Interconnect cables! Lies and myths!

Sure, but if I look at 5 displays, each fed by a different cable, and they all look identical to me, doesn't that mean by definition that the cable doesn't matter?

For THESE displays, sure. They were supposed to be high-end displays; I don't remember the type or brand.

Did they by any chance say whether HDCP was being employed? That's a sure-fire way to see if the cable's corrupting any bits - it won't authenticate if there are bit errors.
 
Then certainly you've seen the effect of bad cables and bad connectors.
Yes. How good the cables and connectors need to be depends on the application and on equipment performance. But I'm sure you know that already. Things also get interesting when you have to use multiple displays and sync them for a complete scene. Digital transmission and processing make problem diagnosis more straightforward.
 
There's always a non-zero probability of a bit error. In fact I'd argue that the link is over-engineered if this probability is too low - error correction needs to be built into a competently designed (aka optimised) system.
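To put that non-zero probability in perspective, here's a back-of-envelope Python sketch. The numbers are my own assumptions, not from this thread: 1080p60, a 148.5 MHz pixel clock, and three TMDS data lanes each running at 10x the pixel clock.

```python
# Rough arithmetic, not a measurement: how often a given raw bit-error
# rate (BER) would corrupt a bit on an assumed 1080p60 HDMI link.
# Assumptions: 148.5 MHz pixel clock, three TMDS data lanes at 10x each.
TMDS_BITRATE = 3 * 10 * 148.5e6   # ~4.455 Gbit/s total across the lanes
FRAME_RATE = 60                   # frames per second

for ber in (1e-9, 1e-12, 1e-15):
    errors_per_second = TMDS_BITRATE * ber
    print(f"BER {ber:g}: {errors_per_second:.3g} errors/s, "
          f"one roughly every {FRAME_RATE / errors_per_second:.3g} frames")
```

Even at a BER of 1e-9 that's a wrong bit only every dozen or so frames; at 1e-12 they're minutes apart - which is how a link can be imperfect and still look flawless.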

So assuming we've got no bit errors, does anyone have a hypothesis (any hypothesis?) as to how the image can change with different cables?
 
In this case - HDMI cables - the termination is important. Yes, one reason for bit errors is reflections. Another is dispersion - frequency-dependent time of arrival differences. A third is frequency-dependent loss. These all have the tendency to 'narrow down' the eye pattern and hence increase the chance of misreading.
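If anyone wants to see the eye-closing effect numerically, here's a toy Python model. It's my own simplification: a one-pole low-pass standing in for frequency-dependent loss, ignoring reflections and dispersion entirely.

```python
# Toy model: pass a random NRZ bit stream through a one-pole low-pass
# "cable" and measure the worst-case eye opening at the sampling instant.
import numpy as np

rng = np.random.default_rng(0)
OVERSAMPLE = 32                                # samples per bit period
bits = rng.integers(0, 2, 2000)
tx = np.repeat(bits * 2.0 - 1.0, OVERSAMPLE)   # ideal NRZ waveform, +/-1 V

def eye_opening(bw_ratio):
    """Worst-case vertical eye opening after a one-pole low-pass whose
    -3 dB point is bw_ratio times the bit rate (crude 'lossy cable')."""
    a = 1 - np.exp(-2 * np.pi * bw_ratio / OVERSAMPLE)  # one-pole IIR coeff
    rx = np.empty_like(tx)
    acc = 0.0
    for i, x in enumerate(tx):
        acc += a * (x - acc)                   # y[n] = y[n-1] + a*(x - y[n-1])
        rx[i] = acc
    centres = rx[OVERSAMPLE // 2::OVERSAMPLE]  # sample each bit mid-cell
    ones, zeros = centres[bits == 1], centres[bits == 0]
    return ones.min() - zeros.max()            # negative = eye fully closed

for br in (2.0, 1.0, 0.5, 0.2):
    print(f"channel BW = {br:.1f} x bit rate: "
          f"eye opening = {eye_opening(br):+.2f} V")
```

As the channel bandwidth drops relative to the bit rate, the opening shrinks and eventually goes negative - at which point no threshold setting at the nominal sampling instant can recover the data error-free.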
 
I am showing my non-digital credentials.

Could the poor termination cause errors that go way beyond "jitter" and become "bit errors"?

Jitter at the input should be correctable within the receiver. i.e. a good receiver will hide the errors created by bad cables.

A badly terminated receiver would require an equally badly terminated cable to reduce the timing errors, e.g. a 98 ohm cable and connector arriving at a 110 ohm receiver will give a poorer signal than a 98 ohm cable and 97 ohm connector arriving at a 97 ohm receiver.
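Putting numbers on that example with the standard transmission-line reflection coefficient, Gamma = (Z2 - Z1) / (Z2 + Z1) - textbook stuff, nothing HDMI-specific:

```python
# Fraction of incident voltage reflected at each impedance step
# in the example above: Gamma = (Z2 - Z1) / (Z2 + Z1).
def reflection(z1, z2):
    return (z2 - z1) / (z2 + z1)

g1 = reflection(98, 110)   # 98 ohm cable straight into a 110 ohm receiver
g2 = reflection(98, 97)    # 98 ohm cable into a 97 ohm connector/receiver
print(f"98 -> 110 ohm: Gamma = {g1:+.4f}  ({abs(g1):.1%} reflected)")
print(f"98 ->  97 ohm: Gamma = {g2:+.4f}  ({abs(g2):.2%} reflected)")
```

The well-matched 97/98 ohm chain reflects roughly an order of magnitude less voltage than the 98-into-110 step, which is exactly the point of the example.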
 
Could the poor termination cause errors that go way beyond "jitter" and become "bit errors"?

Jitter's not an issue for HDMI carrying video seeing as there's no temporal reconstruction going on at the receiving end as there is for audio. The pixels stay in the same place even if there's jitter on the bits.

Jitter at the input should be correctable within the receiver. i.e. a good receiver will hide the errors created by bad cables.

Yes. The data-dependent component of that jitter is called ISI - inter-symbol interference: energy from earlier bits lingers in the channel and shifts the edge timing of the bits that follow.
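To make that concrete, here's the same toy one-pole channel idea from earlier in the thread, used to show that an identical 0->1 edge crosses the decision threshold at different times depending on the bits that preceded it - which is data-dependent (ISI) jitter:

```python
# Toy demo of ISI as data-dependent jitter: time the 0 V crossing of the
# same 0->1 edge after different bit histories, through a one-pole channel.
import numpy as np

OVERSAMPLE = 64    # samples per bit period
BW_RATIO = 0.3     # assumed channel -3 dB point as a fraction of bit rate
a = 1 - np.exp(-2 * np.pi * BW_RATIO / OVERSAMPLE)

def crossing_time(history):
    """Time (in unit intervals) after a 0->1 edge at which the filtered
    waveform first crosses 0 V. `history` is the bit pattern sent just
    before the final 1 bit."""
    bits = list(history) + [1]
    wave = np.repeat(np.array(bits) * 2.0 - 1.0, OVERSAMPLE)
    acc, rx = -1.0, []
    for x in wave:
        acc += a * (x - acc)
        rx.append(acc)
    edge = (len(bits) - 1) * OVERSAMPLE        # first sample of the 1 bit
    tail = np.array(rx[edge:])
    return (np.argmax(tail > 0.0) + 1) / OVERSAMPLE

# The final 0->1 edge is identical in all three cases; only history differs.
for history in ([0, 0, 0, 0], [1, 0, 1, 0], [1, 1, 1, 0]):
    print(f"history {history}: crossing at {crossing_time(history):.3f} UI")
```

With these assumed numbers the crossing moves by roughly a tenth of a unit interval purely because of the preceding data - and an equaliser in a good receiver can undo much of that, which is the "good receiver hides it" point above.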
 