USB cable quality

External hard drive

I haven't read all of the posts in this thread so I don't know if this has already been mentioned. I have a question for all of you computer audiophiles.

I am intending to put all of my digital music onto an external HDD. How important is the USB cable quality between it and my computer if I am storing and playing hi-res files?
 
How important is the USB cable quality between it and my computer if I am storing and playing hi-res files?


Why should you take others' advice about something that's so easy to test in your setup?

Since I started using USB music interfaces, I find it unwise to use the USB subsystem for data transmission as well. If it has to be local storage, I'd rather use eSATA.

As for the cable type being audible or not in principle... I think it has always been clear that data transmission errors are not an issue. For async DACs, neither is USB clock quality.

So, if USB cables are indeed audible, it is most likely because of radiated noise. And in that case it should make no difference whether it is the cable connecting the DAC or the hard drive cable. Mere speculation, of course.
 
As for the cable type being audible or not in principle... I think it has always been clear that data transmission errors are not an issue. For async DACs, neither is USB clock quality.

I disagree; I think that data transmission errors are the main source of differences in data cables used to stream audio.

BTW, with the cable between HDD and PC, IMHO you can hardly hear any difference if it is of reasonable quality (i.e. with a really low Bit Error Rate), thanks to error correction.
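
To illustrate why error correction makes the HDD link effectively transparent, here is a minimal Python sketch of the retransmit-on-error idea behind bulk-style transfers (the kind USB mass storage relies on). The frame layout, the toy "noisy cable", and the use of zlib.crc32 in place of the link's real CRC are all assumptions for illustration, not the actual USB protocol.

```python
import random
import struct
import zlib

def noisy_link(frame: bytes, bit_error_rate: float = 1e-6) -> bytes:
    """Toy 'cable': each bit may flip independently with probability bit_error_rate."""
    out = bytearray(frame)
    for i in range(len(out)):
        for bit in range(8):
            if random.random() < bit_error_rate:
                out[i] ^= 1 << bit
    return bytes(out)

def transfer_block(payload: bytes, max_retries: int = 32) -> bytes:
    """Send payload + CRC; the receiver rejects damaged frames and the sender retransmits."""
    frame = payload + struct.pack("<I", zlib.crc32(payload))   # data followed by a CRC32
    for attempt in range(1, max_retries + 1):
        rx = noisy_link(frame)
        rx_payload, rx_crc = rx[:-4], struct.unpack("<I", rx[-4:])[0]
        if zlib.crc32(rx_payload) == rx_crc:                   # frame arrived intact
            print(f"delivered intact after {attempt} attempt(s)")
            return rx_payload
    raise IOError("link too noisy: gave up")

if __name__ == "__main__":
    random.seed(1)
    block = bytes(range(256)) * 16            # 4 KiB of "file" data
    assert transfer_block(block) == block     # what the PC reads back is bit-identical
```

With a realistically low raw bit error rate, almost every frame passes on the first attempt, and what ends up on (or comes back from) the disk is bit-identical to the original.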
 
I disagree; I think that data transmission errors are the main source of differences in data cables used to stream audio.

A single incorrect sample value (with the MSB or MSB-1 bit flipped) will produce a loud click in your stream. Plus you can easily check your assumption: get a USB-to-SPDIF soundcard, record the digital output using another card with an SPDIF input, and compare. The complete, properly conducted test will take you just a few hours. You can compare results for various cables. My guess is there is a 99.9% chance you will get no difference on a properly conducted test - bit perfection preserved. I have done similar tests before.

Will you do it?
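
For anyone who wants to try it, here is a rough sketch of the offline comparison step, assuming the SPDIF output has been captured to 16-bit PCM WAV files through each cable under test. The alignment search, window sizes, and file names are assumptions for illustration, and NumPy is required.

```python
import wave
import numpy as np

def load_pcm16(path: str) -> np.ndarray:
    """Read a 16-bit PCM WAV capture into an int16 array (channels interleaved)."""
    with wave.open(path, "rb") as w:
        assert w.getsampwidth() == 2, "this sketch assumes 16-bit PCM"
        frames = w.readframes(w.getnframes())
    return np.frombuffer(frames, dtype=np.int16)

def compare_captures(ref_path: str, test_path: str, max_offset: int = 20000) -> None:
    """Line up two captures of the same material, then count mismatched samples."""
    ref, test = load_pcm16(ref_path), load_pcm16(test_path)
    probe = ref[:20000]                        # short window used only for alignment
    best_offset, best_errors = 0, len(probe) + 1
    for offset in range(max_offset):
        candidate = test[offset:offset + len(probe)]
        if len(candidate) < len(probe):
            break
        errors = int(np.count_nonzero(candidate != probe))
        if errors < best_errors:
            best_offset, best_errors = offset, errors
    n = min(len(ref), len(test) - best_offset)  # full comparison at the best alignment
    mismatches = int(np.count_nonzero(ref[:n] != test[best_offset:best_offset + n]))
    print(f"offset={best_offset}: {mismatches} mismatched samples out of {n}")
    print("bit perfect" if mismatches == 0 else "NOT bit perfect")

# Hypothetical captures made through two different USB cables:
# compare_captures("capture_cable_A.wav", "capture_cable_B.wav")
```

If the best alignment yields zero mismatched samples for every cable, the transfers were bit perfect and any perceived difference has to come from somewhere other than the data.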
 
I disagree; I think that data transmission errors are the main source of differences in data cables used to stream audio.



Data transmission errors are super easy to detect. Get a USB board with an SPDIF output, feed the output back into a soundcard, record a file to compare, and voila. Some alignment will probably be required, but other than that it's straightforward. If there were any data errors we would be able to choose USB cables based on that, but alas...

If you have actually performed this test and have found otherwise, please tell.
 
I don't think that's so... have you never heard of channel coding?

The example is flawed.

How does this work? There are a whole heap of methods they use and I won't even scratch the surface. But let me demonstrate one... very basically. Let's take, for example, a number we need to transmit over a channel. Let's say 17. In binary, 17 is 10001. Let's assume a one-bit error occurs. Because there are 5 digits, we could get 5 possible errors - 10000, 10011, 10101, 11001, 00001. Notice what these numbers equal: 16, 19, 21, 25, 1. A one-bit error could result in a slight distortion (16 instead of 17) or a huge distortion (1 instead of 17). In fact, when transmitting numbers in this way, the magnitude of the distortion varies with the number of digits transmitted. If we transmit a 256-bit number, we could get a massive error due to just one bit being incorrect.

A single erroneous bit will result in the whole USB packet, comprising many more bytes, being dropped. Just like that, poof, and you will hear silence or a pop, nothing else.
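
To make both halves of that concrete, here is a small sketch: it first reproduces the quoted arithmetic (every single-bit corruption of 17), then shows how a link-level CRC flags the corrupted packet so it is discarded rather than played back with a wrong sample value. The packet layout is invented and zlib.crc32 merely stands in for USB's actual CRC-16.

```python
import zlib

value = 17                                        # 10001 in binary
print([value ^ (1 << bit) for bit in range(5)])   # -> [16, 19, 21, 25, 1]

# A toy "packet": a few little-endian 16-bit samples followed by a CRC, the way a
# link layer protects its payload (USB data packets carry a CRC-16; zlib.crc32 is
# only a stand-in here).
samples = [17, 1000, -1000, 32000]
payload = b"".join(s.to_bytes(2, "little", signed=True) for s in samples)
packet = payload + zlib.crc32(payload).to_bytes(4, "little")

# One bit flips "on the cable": the first sample silently becomes 1 instead of 17...
damaged = bytearray(packet)
damaged[0] ^= 0b00010000

# ...but the receiver recomputes the CRC, sees the mismatch and discards the packet.
rx_payload, rx_crc = bytes(damaged[:-4]), int.from_bytes(damaged[-4:], "little")
if zlib.crc32(rx_payload) != rx_crc:
    print("CRC mismatch: packet dropped, you get a gap or a pop, not a wrong note")
else:
    print("packet accepted")
```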
 
I disagree; I think that data transmission errors are the main source of differences in data cables used to stream audio.

BTW, with the cable between HDD and PC, IMHO you can hardly hear any difference if it is of reasonable quality (i.e. with a really low Bit Error Rate), thanks to error correction.

Clave, where is your proof for the 'differences in data cables' that you refer to? Do you mean measurable differences in specifications, measurable differences in audible output, or measured differences in the accuracy of the data? I've never seen any test prove differences in audibility of digital cables beyond basic working/broken. I'd love to see the source to back up that supposition.

I only know one industry guy who designs and sells a USB DAC. He also designed DAC and receiver chip parts for Wolfson, so I guess he knows his stuff. He recommends you use a basic certified PC USB cable and any "studio"-grade SPDIF cable. That's good enough for me.
 
All this hocus pocus is very entertaining, keep it up. For those engineers with little actual experience who trust the internet: keep in mind that not everything you read is necessarily true. If data transmission over a conductive medium were so erroneous, you should immediately stop performing electronic banking. Imagine having a digit added to or dropped from your account balance.
 
This kind of stuff will actually lower your IQ. The writer is totally clueless, just parroting stuff collected from hifi comic books.

Maybe... but just one question: why does the industry (I'm not referring to the audiophile one) bother testing cables with BER testers? Why bother writing whitepapers about testing digital transmission lines?

If whatever cable you use, made in whatever way, makes no difference (I'm not talking about audio here), why do industrial cable manufacturers bother with different designs?

Some examples:

http://www.onsemi.com/pub_link/Collateral/AND9075-D.PDF
http://www.highfrequencyelectronics.com/Archives/Nov05/HFE1105_Tutorial.pdf
Eye Diagram Basics: Reading and applying eye diagrams | EDN

And one fact remains: most (time-critical) digital signals are transmitted without needing to be 100% bit perfect, thanks to smart (but not bit-perfect) transmission protocols.

In that context the (signal integrity) performance of the cable matters.

It would be interesting to measure the BER of cables that are claimed to sound different.
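
As a reference point, a BER figure is just the fraction of bits that arrive wrong: a real BER tester drives a known pattern (typically a PRBS) through the cable and counts disagreements at the far end. The sketch below simulates that with a software "cable", so the pattern source and the error injection are purely illustrative.

```python
import os
import random

def bit_error_rate(sent: bytes, received: bytes) -> float:
    """BER = differing bits / total bits compared."""
    assert len(sent) == len(received)
    errors = sum(bin(a ^ b).count("1") for a, b in zip(sent, received))
    return errors / (8 * len(sent))

if __name__ == "__main__":
    pattern = os.urandom(1_000_000)     # stand-in for the known PRBS test pattern
    rx = bytearray(pattern)             # simulated far end of the cable
    for _ in range(10):                 # inject ten random single-bit errors
        i = random.randrange(len(rx))
        rx[i] ^= 1 << random.randrange(8)
    print(f"measured BER ≈ {bit_error_rate(pattern, bytes(rx)):.2e}")
```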
 
You don't have to bother with jitter between the two ends of a USB link (host A and host B). This is not SPDIF.
First try to learn and understand how USB data transmission works, and data transmission in general.

USB cable jitter does not affect the stream passed between the two hosts in some mysterious way. A USB data packet is either passed or dropped, and only after that is the initial stream recomposed and sent to the DAC chip.
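
Here is a tiny sketch of that reassembly step, under the assumption that a lost packet is simply replaced with silence (the packet size and fill strategy are invented for illustration): the result of a bad cable is a dropout or click, not a subtle alteration of the surviving samples.

```python
from typing import List, Optional

PACKET_SAMPLES = 48     # e.g. 1 ms of mono audio at 48 kHz, chosen only for illustration

def reassemble(packets: List[Optional[List[int]]]) -> List[int]:
    """Rebuild the stream: good packets are copied verbatim, dropped ones become silence."""
    stream: List[int] = []
    for pkt in packets:
        if pkt is None:                            # CRC failed -> the packet was discarded
            stream.extend([0] * PACKET_SAMPLES)    # an audible gap/click, not a tonal change
        else:
            stream.extend(pkt)                     # bit-identical to what was sent
    return stream

# Three packets sent, the middle one lost on a bad cable:
sent = [[level] * PACKET_SAMPLES for level in (100, 200, 300)]
received = [sent[0], None, sent[2]]
out = reassemble(received)
print(out[:3], out[48:51], out[96:99])             # [100, ...] [0, ...] [300, ...]
```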
 
The data flowing back and forth inside your computer has to be "bit perfect" to allow us to see the words and the graphs and the pics.
All the visual outputs need that bit-perfect data flow. Yes, the PC does error checking and error correction, or a resend, to ensure the 100% accuracy that is required.

The same applies to audio bit streams. That is why the comparison tests mentioned in other posts will show either no errors or a complete failure.
Some of the CD rippers will check for 100% accuracy and report when a copy is not 100% accurate.

All that data passes along copper connections.
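
The same principle can be checked on any file copied to the external drive: hash the source and the copy and the digests must match exactly. The paths below are placeholders, and SHA-256 stands in for whatever checksum an AccurateRip-style database actually uses.

```python
import hashlib

def sha256_of(path: str) -> str:
    """Hash a file in chunks so large music files don't have to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical paths: the original rip vs. the copy on the external USB drive.
# if sha256_of("track01.flac") == sha256_of("/mnt/usb_hdd/track01.flac"):
#     print("copies are bit-identical")
```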
 
All this hocus pocus is very entertaining, keep it up. For those engineers with little actual experience who trust the internet: keep in mind that not everything you read is necessarily true. If data transmission over a conductive medium were so erroneous, you should immediately stop performing electronic banking. Imagine having a digit added to or dropped from your account balance.

Loads of fun ;) Never even thought this thread would last this long.
One thing people making these cables really understand:
1- electronic banking really works
2- on-line gambling even more
I bet their RJ45 (between the computer and the bank) is made of the same stuff ;)
 
The main reason there is so much information on transmission lines is that most of today's digital hardware operates at insane frequencies. If digital transmission were that error-prone, the internet would grind to a halt. On real-time operating systems, the type that help control CERN, you have to have accuracy; you can't sit there waiting for the data to struggle through. All that info exists so the interface can be engineered to work correctly with minimal or zero data dropout.
On a PC this is less of a problem, as Windows is not real-time and has no guaranteed response time, but many other systems do, and data latency due to errors is not something you want.
As to cables: you design or buy the cable that is correct for the job. Most of the modern world depends on them these days, so maybe they have worked out a lot of the problems; Heaviside solved a lot of today's transmission problems back in 1878 or so.
 