USB cable quality

Digital has come a long way. My first experience was in the early 1980s, working on even-then-antique (1950s?) teletype technology. I learned about mark/space, polar, neutral, and also not to touch the DC voltages. Technically no error correction, but when the circuit dropped, the printer "ran open" (ker-chunk but printeth naught). Also was fun to **** off the next station by holding down "bell" and "repeat": DINGETY DINGETY DINGETY DINGETY ETC. In some ways the old tech was better. Ah, for the smell of machine oil and cheap rolls of paper. 5-level Baudot, anyone?

More apropos to this discussion, in tech school we learned about the many possible types of distortion of a digital signal. In practice, we very rarely encountered them. For the rational, there are ones and zeros. For the audio-fool, there are $200 special cables to make the ones "oner" and the zeroes "zeroer." :)
 
Isochronous USB ... myths, realities...

Most USB audio devices of generations ... use isochronous transfer, where there is no provision for retransmission in the case of errors. Isochronous clocking depends on the host's clock, the data is transmitted willy-nilly, and the DAC must adapt its clock to accommodate the transmitted data rate, usually using a PLL. ...

Umm... not quite.

Isochronous transmission is still a packetized data transfer mode: whatever the native audio data rate is, the data gets buffered, then squirted out at USB 2 bit rates, in an isochronous transmission window whose opening is guaranteed to within a microsecond or so.

Further, digging a bit, I find that for audio it remains an uncorrected/unprotected data stream, implementing a derivative of the AES/ADAT bit-coding scheme for data transmission. Therefore, what I said regarding data drops, loud squawkings and so on remains true. If the audio you're hearing isn't grossly tarnished, then you're hearing error-free audio. And since the PCM/digitization data is completely encapsulated by the much faster bit-packet stream, by definition ALL playback devices must employ buffering and down-rate conversion.
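As a rough back-of-the-envelope illustration of that buffering point (a sketch assuming full-speed 1 ms framing, not any particular driver's code), here is how a 44.1 kHz stream has to be chopped into isochronous packets of uneven size, which the DAC must then buffer and re-clock:

```python
# Sketch only: packing 44.1 kHz PCM into 1 ms isochronous frames (full-speed
# USB framing assumed for simplicity). 44100 samples/s does not divide evenly
# into 1000 frames/s, so packet sizes alternate and the DAC has to buffer and
# re-clock on its own side.

SAMPLE_RATE = 44_100          # audio samples per second
FRAMES_PER_SECOND = 1_000     # one isochronous frame every millisecond

def frame_sizes(n_frames):
    """Yield the number of samples placed in each successive 1 ms frame."""
    sent = 0
    for frame in range(1, n_frames + 1):
        target = round(frame * SAMPLE_RATE / FRAMES_PER_SECOND)
        yield target - sent
        sent = target

print(list(frame_sizes(20)))    # mostly 44-sample packets, with an occasional 45
print(sum(frame_sizes(1000)))   # 44100: exactly one second of audio delivered
```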

But thanks... I get what you're talking about.

GoatGuy
 
GoatGuy, by the way, is your moniker a grown-up Goatboy (re: Bill Hicks, c. 1993)? :)
We're not all as dumb as you may think, even those of us who believe (know) that digital is 0s and 1s, and I consider myself a life member of that camp... why? Because I spend all day making sure digital signals get from A to B, including signal integrity simulation.
The digital waveform may (will) get distorted to some extent (look at DDR memory waveforms), but the end result is a signal that will be interpreted as either a 1 or a 0.
For the uninitiated I suggest getting "Signal Integrity Simplified" by Eric Bogatin, or Howard Johnson's books; some good reading.
The misunderstanding of digital signal transmission in some circles (audio) is quite amazing. A recent thread comes to mind, in which someone pointed out that moving files around your system can add low-level distortion to the digital data; the poster stated it as fact and refused to discuss the matter further (but did quote a couple of audiophile gurus who had also noticed this... CRC gives the same results, you can only hear this low-level distortion).
Of course, if the cable is naff, you will get lots of naff packets and data will have to be re-transmitted.
Digital signals are pretty immune to the distortions that they suffer; the worst one is non-monotonicity on the rising or falling edges.
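To put a toy number on that point (made-up voltages and noise levels, not a real channel model), here's a sketch of a receiver sampling a distorted, noisy waveform against a simple threshold and still recovering every bit:

```python
# Toy illustration with made-up numbers (not a real channel model): a
# distorted, noisy waveform still decodes to exactly the bits that were sent,
# as long as the receiver samples it against a threshold at the bit centres.
import random

random.seed(1)
bits = [random.randint(0, 1) for _ in range(10_000)]

V_HIGH, V_LOW, THRESHOLD = 3.3, 0.0, 1.65

def received_level(bit):
    """Ideal level plus ringing and noise, standing in for a lossy link."""
    ideal = V_HIGH if bit else V_LOW
    ringing = random.uniform(-0.4, 0.4)   # overshoot / non-monotonic wiggle
    noise = random.gauss(0, 0.1)          # additive noise
    return ideal + ringing + noise

decoded = [1 if received_level(b) > THRESHOLD else 0 for b in bits]
errors = sum(d != b for d, b in zip(decoded, bits))
print(f"bit errors: {errors} out of {len(bits)}")   # 0: the eye is still wide open
```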
Have fun all:)
 

We seem to agree. "GoatGuy" is from generally being a technical billy-goat - in that I like to take the opposite side of most arguments (unless it is clearly just wrong). Yes, I remember Goatboy.

Non-monotonicity in rising/falling edges in the digital stream is one of the most significant factors that introduces phantom bits into an NRZI-encoded bitstream. "Wise" decoder (demux) devices, though, don't trigger on the edge, but capture the prevailing J/K line state established by the PREVIOUS transition. This is far less susceptible to non-monotonicity (at least 2, and some say up to 4, orders of magnitude) than edge-triggered designs.
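A minimal sketch of the idea (my own illustration, not real USB PHY code): decode NRZI by comparing the settled line state of each bit cell to the previous one, rather than counting edges, so a non-monotonic wiggle on an edge cannot insert phantom bits as long as the line has settled by the sampling point.

```python
# Sketch of the idea, not real USB PHY code. On the wire, NRZI encodes a '0'
# as a change of line state (J<->K) and a '1' as "no change". A state-sampling
# decoder compares the settled state of each bit cell to the previous cell;
# it never counts raw edges.

def nrzi_encode(bits, initial_state="J"):
    state, line = initial_state, []
    for bit in bits:
        if bit == 0:                      # a zero toggles the line state
            state = "K" if state == "J" else "J"
        line.append(state)                # a one leaves it alone
    return line

def nrzi_decode(line, initial_state="J"):
    prev, bits = initial_state, []
    for state in line:                    # one settled sample per bit cell
        bits.append(0 if state != prev else 1)
        prev = state
    return bits

data = [1, 0, 1, 1, 0, 0, 0, 1]
line = nrzi_encode(data)
print(line)                               # ['J', 'K', 'K', 'K', 'J', 'K', 'J', 'J']
assert nrzi_decode(line) == data          # recovered exactly
```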

GoatGuy
 
Yep, but for some reason I believe more in cables.

And I too... nominally.

There is something particularly sexy though about having a wireless fabric over which to move the digital sound data.

One can move one's "iDevice" about, keep it in a pocket, and not even need to connect it when coming back from a long day trip. Sure, a cable works. But if the wireless approach actually WORKS (i.e. through encoding and other means achieves a near-zero bit error rate), and if one is committed to spending "in the hundreds" for a fine (or, I say, too fine) cable... then why not?

GoatGuy
 
:D One of, if not my, ultimate favorite comedy sketches! Still love the manic comedy of Mr Hicks and Mr Leary etc.; a lot of comedy is so tame these days.
Yes, from what I've seen of your posts so far, I would say we agree; I just word it differently. I would say I use quiet frustration in my replies.
A bit of boring SI (signal integrity), since we are on about digital. Most non-monotonic signals I see in SI work are on nets with multiple parallel receivers and a single driver (some clocks). This is solved by some often quite complex termination schemes, typically involving a series termination resistor to slow the initial wave into the single trace from the driver pin, with AC parallel termination at each receiver to match the lines and the overall impedance seen by the driver chip. Also, generally the driver is a bespoke driver chip, and these do seem to have an aggressive current drive in order to drive the multiple receivers; think JTL05 (JTAG idriver) and such like.
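For the uninitiated, a back-of-envelope sketch of how that series/AC termination gets sized (example numbers assumed for illustration, not from any real design):

```python
# Back-of-envelope only, with assumed example numbers (not a real design):
# classic series termination sizes the resistor so that driver output
# impedance plus resistor equals the trace impedance; the half-swing initial
# wave then doubles back to full swing at the open far end.

Z0 = 50.0         # trace characteristic impedance, ohms (assumed)
R_DRIVER = 18.0   # driver output impedance, ohms (assumed)

r_series = Z0 - R_DRIVER
print(f"series termination at the driver pin: {r_series:.0f} ohms")

# AC parallel termination at a receiver: the resistor matches Z0, and the
# series capacitor is sized (rule of thumb) so its RC time constant is
# several times the edge time, so it only loads the line during transitions.
t_edge = 1e-9                        # assumed 1 ns rise time
c_ac = 5 * t_edge / Z0               # RC = ~5x the edge time
print(f"AC termination per receiver: {Z0:.0f} ohms + {c_ac * 1e12:.0f} pF")
```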
Regarding signal transmission and cables, I have to admit they DO make a difference. We once had to create a digital interface that would operate a relay box up to a kilometre away! Normal bell wire would work up to 200 m; Belden shielded twisted-pair cable (£5000+ for a km) worked to 1 km, at 40x the cost of the relay box. So personally I wouldn't worry too much about a metre or so of USB.
And remember, it's just a ride ;)
Marc
 
This is a side of being an 'audiophile' that I absolutely detest these days. A bog-standard USB cable was/is perfectly fine, and capable of allowing any number of documents to be printed without a single letter out of place. Or transmitting high-definition video without any glitches. Or incredibly detailed photos, again without any visible glitches. However, as soon as it's audio-related, NO, it suddenly cannot be good enough unless it has been branded by one of the greedy audio manufacturers like Kimber or Audioquest! It's ridiculous - it's almost as if audio has the magical ability to suddenly ignore the laws of physics!!! No matter what accessory it is, it automatically cannot be good for audio use (despite being fine everywhere else) unless some audio company has their branding upon it! I get the impression many seriously believe this too, and thus a whole sub-culture of snake-oil accessories has sprung into being over the past couple of decades. No, you can't measure it folks, but if you have good hearing & your system is good enough you can ;) Therein lies the root of the problem - advertisers were cleverly able to exploit this 'audio snob paranoia', and when you dish out the same BS year after year after year and the magazine writers concur, it eventually gets accepted as truth without being questioned by the mainstream.

Perhaps people need to wake up to the fact that proper designers who KNOW their field have already done the design work/worrying for us! Belkin, for example. A Belkin USB cable will be as good as anything out there. Why? It's been designed by people who know their jobs. Stop worrying about it - move on! Not some audio company who rely on touting cryo freezing, thick impressive-looking connectors and fancy jackets instead...

I'd love for some audio magazine to grow a set of balls and denounce all this rubbish, but they won't as they have to rely on advertising of course and live in the pockets of sponsors and manufacturers. I'm embarrassed to say I believed it too for a few years, before I got into DIY and could actually see & hear the results by building my own equipment.

Always find it somewhat amusing that many 'audiophile' favourite albums (e.g. Mercury Living Presence, RCA, Blue Note etc.) would have been recorded without ANY wanky cables at all, but bog-standard tin-plated copper - The Horror! ;)

John
 
And [phofman], absolutely none of that "noise" matters one iota. Not one scintilla. Not one sesame seed in a bucket of hummus. Why?

Because it's digital... is why! Digital was designed to have TWO states, which we call "1" and "0", or true/false. As was so eloquently said by [johnm] above, millions of images, each with millions of bytes (i.e. terabytes in total) of information, can be, and are, transmitted over a bog-standard USB cable, flawlessly ... over the years of its service. Heck... in our rack of ultra-high-end database processing servers, ALL the disks are external, and all are hooked in via bog-standard USB cables. These transmit and receive terabytes per cable per day ... and MUST do so flawlessly. If they don't, if even one bit is flawed ... the error is detected, corrected in microseconds, and it gets logged in the server's data-exception logger. I've checked that logger religiously, once a week or so, for 4 years. Not one unintentional data bit lost.
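For anyone curious how the "detected and corrected" part works, here's a small sketch (illustrative only, obviously not the actual USB controller silicon) of the CRC16 that bulk and control packets carry; flip even one bit in transit and the check fails, the receiver withholds its ACK, and the host simply resends the packet:

```python
# Illustrative sketch only (not the USB controller silicon): bulk and control
# transfers append a CRC16 to each data packet. The parameters below are the
# CRC-16/USB ones: polynomial 0x8005 (bit-reversed 0xA001), initial value
# 0xFFFF, final XOR 0xFFFF.

def crc16_usb(data: bytes) -> int:
    crc = 0xFFFF
    for byte in data:
        crc ^= byte
        for _ in range(8):
            crc = (crc >> 1) ^ 0xA001 if crc & 1 else crc >> 1
    return crc ^ 0xFFFF

packet = bytes(range(64))                 # pretend 64-byte data payload
good_crc = crc16_usb(packet)

corrupted = bytearray(packet)
corrupted[10] ^= 0x04                     # a single bit flipped in the cable
bad_crc = crc16_usb(bytes(corrupted))

print(hex(good_crc), hex(bad_crc))
print("retransmission needed:", bad_crc != good_crc)   # True: no ACK sent, host resends
```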

To jigger the system, I actually took an old USB cable and pounded on it on an anvil with a hammer - enough to expose the wires and be "horrible". Then I dipped it in a little bucket of salt water, just to add more degradation. After trimming away the bits of wire that were shorting it out, I hooked it up to see if the logger would log anything. [I mean, if it reports nothing and always has, then it could just be deaf to the problem!] Sure enough, the logger immediately started to complain about lost bits and retransmitted packets. Yet 99% of the packets magically got through. And at the other end (the disk), when we finally disconnected it a few days later, hooked up a proper cable, and then READ BACK all the data that had been recorded... guess what. ZERO data errors. None. That absolutely shitty cable and its data atrocities ... didn't affect what had been recorded at the other end at all.

So... let's just go back to basics on the USB cable front. Buy decent quality, not bargain basement, and especially not "audiophile grade" cables. Belden makes great wire, if you're building your own, and Belkin makes great cables.

GoatGuy
 
Because it's digital... is why! Digital was designed to have TWO states, which we call "1" and "0", or true/false. As was so eloquently said by [johnm] above, millions of images, each with millions of bytes (i.e. terabytes in total) of information, can be, and are, transmitted over a bog-standard USB cable, flawlessly ... over the years of its service. Heck... in our rack of ultra-high-end database processing servers, ALL the disks are external, and all are hooked in via bog-standard USB cables. These transmit and receive terabytes per cable per day ... and MUST do so flawlessly. If they don't, if even one bit is flawed ... the error is detected, corrected in microseconds, and it gets logged in the server's data-exception logger. I've checked that logger religiously, once a week or so, for 4 years. Not one unintentional data bit lost.
GoatGuy

Did you notice I did not talk about noise induced into the cable, but about noise originating in the PC itself? That kind of means I was giving reasons why the cable itself is not that important :)

Those who know me from this and other forums can confirm I am very oriented toward technical and hard-science facts. If someone claims to hear something, I challenge him to confirm his assumption with a blind test (almost always with a very negative response, to tell the truth :) ).

Yet I would not say USB audio is only about zeros and ones. The noise coming over from the PC (even if the DAC has its own independent power supply) simply affects the analog circuits; there is no way it would not, unless great care is taken by the manufacturer. Is it audible? That certainly depends; a blind test would reveal the truth. It is perfectly possible many people can hear the difference, as there are sound technical reasons behind it. Will they take the test? Well, ... :)

But again, the above is not about cables :)
 
IMO most noise into USB comes from...

Dunno. I interpret those words as you saying "most noise into USB comes from...". You didn't say "into the sound card", or "into the output jack", or "into the USB DAC on the preamp/amplifier chain"... how was I supposed to know that you were referring to the more abstract "noise in the audio system(s) inside a PC"? And you know, I would say that "into USB" very particularly means "into the USB port, into the USB jack, into the USB cable", in combination.

But in any case, thank you for clarifying your position.

I appreciate (deeply) that you're the kind of bloke that insists not just on A/B tests, but "blind" A/B tests.

I hooked up a cool circuit a few years ago that made it utterly random... using two boxes. One was the supposed "a/b/c" switch, which would assign one of these combinations to the three positions at random: ABB BAB BBA AAB ABA BAA ... and would keep that random setting until the RESET button was pressed (3 times, to protect the results!). There were also 3 buttons to record "same", "worse than previous" and "better than previous".

A nice voting system, really.

You would choose the A, B or C hookup (not knowing whether any of them were really A or B), in any order, over and over, voting each time. (Required vote!) A little ARDUINO circuit kept tally of channel selects, votes, switch settings, lit up LEDs and so on. At the end, the "result" button would show the sum of "is better" minus "is worse" for the A and B test signals.
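A rough Python stand-in for that logic (my own reconstruction from the description above, not the actual Arduino firmware; the voting is simplified to a running score per hidden path rather than "better/worse than previous"):

```python
# Reconstruction of the box's blind-assignment and tally logic, simplified.
import random

ASSIGNMENTS = ["ABB", "BAB", "BBA", "AAB", "ABA", "BAA"]   # as listed above

class BlindBox:
    def __init__(self):
        self.reset()

    def reset(self):
        self.mapping = random.choice(ASSIGNMENTS)   # hidden until the test ends
        self.score = {"A": 0, "B": 0}

    def vote(self, button, verdict):
        """button is 0, 1 or 2; verdict is +1 (better), 0 (same) or -1 (worse)."""
        path = self.mapping[button]                 # the listener never sees this
        self.score[path] += verdict

    def result(self):
        return self.score                           # totals only; mapping stays hidden

box = BlindBox()
for _ in range(35):                                 # one listening session
    box.vote(random.randrange(3), random.choice([+1, 0, -1]))   # stand-in for a human verdict
print(box.result())
```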

At the remote end, a small project box had a set of jacks allowing for the A/B selection and pass-through. Used reed relays to engage things. It also allowed "10 things to be switched" (since many A/B tests require changing more than just between two signal paths.). 2 of the inputs also had high quality attenuators on them, to allow for setting normalized signal levels.

What we found: when there was a big difference, within about 10 votes or so the trend was clear. When the difference was small, it would take about 35 votes to get a clear trend in preference. When the difference was really, really subtle ... over 100 votes. And when there was intentionally no difference at all (if AAA or BBB was allowed!), the DIFFERENCE between preferred channels was so small at 100 votes that it suggested "a push" (in other words, the system worked as intended).
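For what it's worth, those trial counts are roughly what simple binomial arithmetic would suggest (my own quick check, not data from the box): with truly identical paths every vote is a coin flip, and the more subtle the difference, the more votes it takes before a lopsided tally stops being plausible luck.

```python
# Quick sanity check on those vote counts: if the two paths are identical,
# each vote is a coin flip, and the chance of a given one-sided split arising
# by accident follows the binomial distribution.
from math import comb

def p_by_chance(k, n):
    """Probability of k or more 'prefer A' outcomes out of n under pure chance."""
    return sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n

for n, k in [(10, 9), (35, 24), (100, 61)]:
    print(f"{k}/{n} split by chance alone: p = {p_by_chance(k, n):.3f}")
```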

By design, the box would never display WHICH of the buttons (A, B, C) was assigned to which channel setting (A, B). Therefore, there was no way to "cheat" - even for me, the inventor. No telltales, no tiny light glitches to use to cheat.

And you're right - it's not about cables.

GoatGuy
 