Is jitter an issue with usb signals ?

Status
Not open for further replies.
Do you know what happens when you drive a fast-changing differential signal down a transmission-line cable? Are you aware of USB transceiver PCB track width / spacing standards, and how would they affect the sound? And for that reason, how would different USB cables, with different characteristics, affect the sound?

I am well aware of the requirements for the physical aspects of the USB bus, having done hundreds of the damn things; they are on everything these days...

How would these affect the sound? The data is digital: it will either get through, or it won't and you will have dropouts. You will not have a slight change in sound, because the information is encoded digitally...
The USB characteristics are well defined and USB products comply with the specs; they are not arduous to follow and allow a wide latitude of ±10% on the characteristic impedance of the data lines... To be honest, even a greater drift from ideal is not going to upset things too much at the lower end of USB (including USB 2 full speed), so I wouldn't worry: your USB signal won't be able to tell the difference between a cheap printer cable at 90R differential impedance and a £2000 audiophile BS USB cable at 90R differential impedance.
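For a sense of how easy that ±10% window is to hit, the textbook parallel-wire approximation Z0 = (120/√εr)·cosh⁻¹(D/d) gives a ballpark differential impedance. The dimensions and dielectric constant below are illustrative assumptions, not taken from any particular cable:

```python
import math

def diff_impedance_ohms(spacing_m: float, diameter_m: float, er: float) -> float:
    """Textbook parallel-wire approximation for a twisted pair's
    differential impedance: Z0 = (120 / sqrt(er)) * acosh(D / d)."""
    return 120.0 / math.sqrt(er) * math.acosh(spacing_m / diameter_m)

# Assumed ballpark figures for a cheap USB 2.0 data pair:
# ~0.32 mm (28 AWG) conductors, ~0.53 mm centre-to-centre, PE-ish dielectric.
z = diff_impedance_ohms(spacing_m=0.53e-3, diameter_m=0.32e-3, er=2.1)
in_spec = abs(z - 90.0) <= 9.0  # within the 90R +/-10% window
print(f"Zdiff ~ {z:.0f} ohm, within spec window: {in_spec}")
```

Because the result moves only logarithmically with the D/d ratio, even sloppy dimensional control tends to land ordinary mass-produced cable inside the window.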
 
Have you ever used a differential probe, and matched that probe to an appropriate oscilloscope, in your life? Would you even know what to look for, to explain to your brain why cable b) sounds so superior to cable a)?

I even model probes so that we can correctly load the signal when doing simulations: the simulated waveforms then match what the engineer sees when he probes the actual physical signals with a scope. Once the simulated and measured results agree, we can remove the scope probe loading from the simulation and look at the true waveform.
 
Extreme_Boky said:
Judging from a few other threads, you in fact do not have the most basic understanding of electronics.
Thank you for your kind and considerate thoughts.

Do you know what happens when you drive a fast-changing differential signal down a transmission-line cable?
Yes. I suspect you do not.

Are you aware of USB transceiver PCB track width / spacing standards, and how would they affect the sound?
No, but I am aware that a transmission line needs to be continued onto the PCB tracks. I assume that PCB designers know how to do this. Unless very badly done, these would not affect the sound, as the PCB track makes up only a small proportion of the transmission line length.

And for that reason, how would different USB cables, with different characteristics, affect the sound?
A cable is either a USB cable or it is not. If not, it may not work - although I guess USB is fairly robust. If it is a USB cable, then it will deliver the data - which is all it is required to do. If some inferior method is used which is sensitive to timing, then you should not be using a PC, as they don't do accurate timing.

Have you ever used a differential probe, and matched that probe to an appropriate oscilloscope, in your life?
No, I don't think I have. I'm not sure where this is going: are you going to claim that (misunderstood) experience somehow trumps (correct) knowledge?

Would you even know what to look for, to explain to your brain why cable b) sounds so superior to cable a)?
I might read a good book on psychoacoustics, and experimental design, to discover why so many people believe that cable b) sounds superior to cable a) - especially when b) may sometimes be clearly electrically inferior to a).
 
Mass-produced, USB-specified cable is available by the mile cheaply because it's used everywhere: cheap, cheerful and to spec...
Audiophile cables: lots of BS, lots of profit margin, and probably in some cases not to spec, especially some of the silly DIY jobs we see on here.....

There is a big difference in "quality" among the bulk cables that are available. Two dominant differences stand out: the gauge used for the +5 V and return (power) wires, and the impedance of the twisted pair(s). For USB 3.0 you also need good shielding.

I have seen bulk cable that does not meet the requirements for differential impedance and insertion loss. There are so many suppliers in China making bulk cable of all types that you need to be able to confirm the vendor is capable of providing what you need. Many lack the equipment to fully test their cable.

If you have a device like a hard drive that draws a lot of current, you want cable with larger-gauge power wires. In general this is the biggest problem with cheap USB cables.
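As a rough illustration of why gauge matters (the load current and cable length are assumptions; the per-metre resistances are standard annealed-copper figures):

```python
# Round-trip DC drop across the +5 V and return wires of a USB cable.
# Copper resistance: 28 AWG ~0.213 ohm/m, 24 AWG ~0.0842 ohm/m.
LENGTH_M = 1.8   # assumed cable length
LOAD_A = 0.5     # assumed: a bus-powered drive pulling 500 mA

for awg, ohm_per_m in [(28, 0.213), (24, 0.0842)]:
    drop = LOAD_A * ohm_per_m * LENGTH_M * 2  # x2: out on +5 V, back on GND
    print(f"{awg} AWG: {drop * 1000:.0f} mV drop at {LOAD_A} A")
```

The thin-gauge cable loses several times more of the 5 V rail, which is exactly the marginal-power symptom described above.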

USB 2.0 is more forgiving of cable deficiencies as long as DC power loss is not an issue.

If you need good USB 3.0 performance you want all requirements met. If the connectors are crap, or care is not taken to maintain good assembly quality, then you could have issues reaching full speed.
 
busaboy said:
I have seen bulk cable that does not meet the requirements for differential impedance and insertion loss.
Then it isn't USB cable, whatever the label says.

The claim being made is that different USB-compliant cables can sound different, and that some expensive ones can sound better than some cheap ones. As this is an extraordinary claim it needs extraordinary evidence. This evidence must come from those making the claim; those doubting this claim do not need to prove anything as we have facts on our side.

Someone may tell me that wearing yellow socks while I cook will improve the flavour of the food. I don't regard this as a claim which I need to test, however much someone else believes it.
 
And the chances of a cheap cable having better impedance control than an audiophile one?

Chances aren't too bad. It's not as if maintaining twisted-pair impedance is particularly difficult. And the audiophile cable is likely to be engineered by someone who thinks cryo treatment or the like is what makes the magic, rather than wire diameter, insulation thickness and turns per inch.
 
Then it isn't USB cable, whatever the label says.

It's whatever the manufacturer wants to call it. Since the vast majority of cheap USB cables use the cheapest bulk cable available, made with the least copper, I'll bet most of the cables you own don't meet all the requirements. Add to that the fact that the vast majority of USB cables sold today use micro-USB or Lightning connectors. You can be assured they use the smallest gauge wire possible; IL and DC losses go up. When you buy a USB 3.0 drive they give you super-short cables so the drive will work. Needless to say, the cables inside the computer feeding the front-panel USB 3.0 connectors burn up some of the margin.

As far as all the crazy talk conflating USB performance with audio performance goes, I guess that is what pseudo-audiophiles like to do. Unless the error rate of the USB channel is crap there will be no impact on audio. Now for an SSD or other device needing a lot of bandwidth, the errors will impact throughput.

Now any digital crap leaking into the analog side or ground loop from the PC will impact audio quality.
 
Now any digital crap leaking into the analog side or ground loop from the PC will impact audio quality.

Contrary to popular belief, the designers of PC hardware are pretty smart and competent people. You can bet your bottom dollar that a lot of thought and design experience has gone into making sure that all high-frequency currents and return paths are routed and decoupled to keep any kind of radiated or conducted EMI to an absolute minimum.

Keeping the individual functioning blocks within a PC tightly bound to their dedicated PCB area is absolutely necessary to ensure that the PC actually functions. This is important in an arena where who knows what hardware is going to be combined with what else so that it can be guaranteed to work.

Adding to this is the fact that external hardware, such as USB devices etc, are connected with differential pairs, once again chosen to help maintain signal integrity and improve immunity to noise.

There is also this idea that computer PSUs are noisy beasts that contribute significant woes to any audio equipment that might be added to the PC. On the whole this is not true. Modern PSUs tend to use LLC converters for the main power stage because of their high efficiency; this also comes with the bonus of soft switching, which actually reduces EMI versus other switching topologies. LLCs are sometimes found in audiophile amplifiers without detriment to the quality of said products.

It also goes without saying (and this has been mentioned in this thread already) that there are many soundcards out there with exemplary measured performance. Signal to noise ratios of ~120dB and distortion in the 0.000x realm.
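To put those figures in perspective, the standard SINAD-to-effective-bits rule of thumb converts a ~120 dB ratio into resolution:

```python
def effective_bits(sinad_db: float) -> float:
    """Standard ENOB rule of thumb: ENOB = (SINAD - 1.76) / 6.02."""
    return (sinad_db - 1.76) / 6.02

print(f"120 dB -> ~{effective_bits(120):.1f} effective bits")
```

That works out to roughly 19.6 effective bits, i.e. already beyond 16-bit CD resolution.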
 
Good. Please prove it. We await with interest. I posted the jitter spectrum of the non-async, USB-powered ODAC for your comments as well.

Thinking more and writing less may just help you get your point across clearly; at the moment I am not sure what it is, especially when you start talking about ADC jitter.

There are a number of USB DACs with an integrated bit-perfect test, like the Naim V1 for example.

We offer an XTOS module (USB to Toslink converter) that has an integrated bit-perfect test. It can be used as a test tool to check bit-perfect playback on digital audio systems.

Here is how it works:

EC designs - XTOS Info

Click “Bitperfect test explained”

The bit-perfect test CD images can be downloaded here:

EC designs - XTOS Info


When we have a recording and want to digitize it (ADC), we need a masterclock, and it will contain phase noise just like the masterclock used in a DAC.

Suppose the analogue, bandlimited input signal is just rising when we take a sample (at the wrong moment, so -not- at the theoretically correct moment of sampling).

When the sample is taken slightly too early, the sample value will be slightly too low, resulting in a lower sample value (number) being written to the audio file.

When the sample is taken slightly too late, the sample value will be slightly too high, resulting in a higher sample value (number) being written to the audio file.

With a zero-jitter D/A masterclock we will now experience a ripple voltage on the audio signal that represents the A/D conversion clock jitter.

When adding a specified amount of random D/A conversion clock jitter we can mask this effect to some extent. This has been verified with practical listening tests. What you hear is a significant improvement in perceived sound quality with older CD recordings, which are likely to contain more ADC jitter.
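The early/late sampling argument above can be sketched numerically. To first order the error equals the signal's slew rate times the timing error, e(t) ≈ A·2πf·Δt·cos(2πft); the sketch below (the tone frequency and the 1 ns RMS jitter figure are arbitrary illustrative choices) estimates the resulting noise floor:

```python
import math, random

FS = 48_000          # sample rate, Hz
F = 10_000           # unit-amplitude test tone, Hz
JITTER_RMS = 1e-9    # assumed: 1 ns RMS random sampling jitter
N = 48_000           # one second of samples

random.seed(0)
err_sq = 0.0
for n in range(N):
    t = n / FS
    dt = random.gauss(0.0, JITTER_RMS)      # timing error for this sample
    err = math.sin(2 * math.pi * F * (t + dt)) - math.sin(2 * math.pi * F * t)
    err_sq += err * err

noise_rms = math.sqrt(err_sq / N)
snr_db = 20 * math.log10((1 / math.sqrt(2)) / noise_rms)
print(f"jitter-induced SNR ~ {snr_db:.0f} dB")
```

For a unit sine the theoretical jitter-limited SNR is -20·log10(2πf·σ), about 84 dB for these numbers, and the simulation lands on the same figure.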
 
As far as all the crazy talk conflating USB performance with audio performance goes, I guess that is what pseudo-audiophiles like to do. Unless the error rate of the USB channel is crap there will be no impact on audio.

USB isochronous packets have a CRC check. Bit errors result in dropped packets, i.e. audio dropouts. USB 2 UAC2 sends a packet every 125 µs, so the dropout will be 125 µs long. You can't have a single bit error resulting in corrupted samples once in a while: either the packet is OK or it isn't.
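The arithmetic behind that claim (the 48 kHz stream rate is an assumption for illustration; the 125 µs microframe comes from USB 2.0 high speed):

```python
# One UAC2 isochronous packet per 125 us high-speed microframe.
MICROFRAME_S = 125e-6
SAMPLE_RATE = 48_000  # assumed stream rate

samples_per_packet = SAMPLE_RATE * MICROFRAME_S  # sample frames per packet
packets_per_second = 1.0 / MICROFRAME_S
print(f"{samples_per_packet:.0f} sample frames per packet, "
      f"{packets_per_second:.0f} packets per second")
# A failed CRC discards the whole packet: a 125 us, six-frame gap,
# never a lone corrupted sample.
```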

In practice, it doesn't really happen... every time an audiophile blames bit errors, look somewhere else 😀

Now, a $0.20 cable with bad shielding, connected at the extremities with skimpy crimping and/or a drain wire, making dubious contact inside a well-worn micro-USB port (those things are flimsy as hell)...

Hell. Still no bit errors (yeah, I checked). However, this last case might show some perverse EMI problems (conversion of differential to common mode, noise radiation, etc.); after all, we're talking about rather high frequencies...
 
There is a big difference in "quality" among the bulk cables that are available. Two dominant differences stand out: the gauge used for the +5 V and return (power) wires, and the impedance of the twisted pair(s). For USB 3.0 you also need good shielding.

I have seen bulk cable that does not meet the requirements for differential impedance and insertion loss. There are so many suppliers in China making bulk cable of all types that you need to be able to confirm the vendor is capable of providing what you need. Many lack the equipment to fully test their cable.

If you have a device like a hard drive that draws a lot of current, you want cable with larger-gauge power wires. In general this is the biggest problem with cheap USB cables.

USB 2.0 is more forgiving of cable deficiencies as long as DC power loss is not an issue.

If you need good USB 3.0 performance you want all requirements met. If the connectors are crap, or care is not taken to maintain good assembly quality, then you could have issues reaching full speed.

If the impedance does not match the USB spec then, as DF96 has stated, it is not a USB cable. The same goes for power: the max current available is in the spec, as are the requirements for the cable to deliver this current with minimal voltage drop...
If it is not correctly spec'd then it's not a USB cable...
God forbid they ever come up with a USB 3 DAC; at least with USB 2 some of the silly DIY cables stand a chance of working...
 
A clear distinction should be made between bit errors (i.e. the DAC doesn't convert every single data word the computer has spat out 1:1)
and jitter (i.e. the DAC doesn't accurately place the analogue output signal in the time domain).
The former will surely be extremely rare.
The latter is another matter, and should be considered carefully.
 
Ooh, it has a light that comes on when you play a test file. So the expectations are set. So does the light go out when you plug in a £2.99 eBay USB cable? By your reckoning the light is all you need for nirvana?

As for decorrelating ADC jitter with random DAC jitter (jitter being the boogeyman), good luck with that.
 
It's whatever the manufacturer wants to call it. Since the vast majority of cheap USB cables use the cheapest bulk cable available, made with the least copper, I'll bet most of the cables you own don't meet all the requirements. Add to that the fact that the vast majority of USB cables sold today use micro-USB or Lightning connectors. You can be assured they use the smallest gauge wire possible; IL and DC losses go up. When you buy a USB 3.0 drive they give you super-short cables so the drive will work. Needless to say, the cables inside the computer feeding the front-panel USB 3.0 connectors burn up some of the margin.

As far as all the crazy talk conflating USB performance with audio performance goes, I guess that is what pseudo-audiophiles like to do. Unless the error rate of the USB channel is crap there will be no impact on audio. Now for an SSD or other device needing a lot of bandwidth, the errors will impact throughput.

Now any digital crap leaking into the analog side or ground loop from the PC will impact audio quality.

How bad are the errors for a particular USB interface and cable? Not that bad, I would strongly suspect.

If you want to learn about analogue/digital layout, ground planes and how to control the noise, look up Henry Ott and Ralph Morrison; every chip manufacturer has numerous documents on the subject, and there are plenty of mixed analogue/digital designs done on the same board where quiet analogue performance is paramount... look at the performance figures of some sound cards connected directly to the motherboard ground plane.....
 
There are a number of USB DACs with an integrated bit-perfect test, like the Naim V1 for example.

We offer an XTOS module (USB to Toslink converter) that has an integrated bit-perfect test. It can be used as a test tool to check bit-perfect playback on digital audio systems.

If it wasn't bit-perfect there would be dropouts, so it would be obvious. More sales talk.....
 
I believe the bit-perfect test is more targeted towards detecting dodgy software and drivers... The OS and sound-driver stack always seem prone to doing stuff like resampling, mixing, and performing all kinds of operations behind your back. In this case, I'd say it's a very useful feature.

A software volume control with improper dithering, or a stray truncation to 16 bits somewhere in the path, or a dodgy resampling algorithm can really screw up the results...
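A toy illustration of the truncation case (the sample values are arbitrary): a single silent drop from 24-bit to 16-bit resolution anywhere in the path makes the stream fail an exact comparison.

```python
# Arbitrary 24-bit sample words.
samples_24bit = [0x123456, 0x00FF01, 0x7FFFFF, 0x000080]

# A stray truncation in the path: keep only the top 16 bits,
# then pad back out -- the low byte is gone for good.
truncated = [(s >> 8) << 8 for s in samples_24bit]

bit_perfect = truncated == samples_24bit
print(f"bit-perfect after a 16-bit truncation: {bit_perfect}")  # False
```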
 