Does measuring the distortion of a cable make sense?

Status: Not open for further replies.
There are two ways to look at this:

1. Cables are passive devices and therefore linear devices, which means they can be categorized as LTI (Linear Time Invariant) systems and so cannot cause distortion. Basically, you put in a sine wave, you get out a sine wave, no more, no less. So most cables should have very low distortion, certainly below the human hearing threshold.

2. Or you can say cables are basically an LCR network. Inductors and capacitors are not linear if driven hard enough, and the dielectric materials are not linear in the strictest sense either. If cables are non-linear, then they can produce distortion just like other non-linear devices such as transistors.

But regardless, even if cables are not linear in the strictest sense, I would expect the measured distortion of most cables to be very low.
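The two views above can be illustrated with a quick sketch (all numbers are assumed for illustration, not measurements of any cable): a linear gain leaves a sine wave's spectrum at a single frequency, while even a weak nonlinearity in the transfer function, here an arbitrary cubic term standing in for dielectric nonlinearity, creates harmonics.

```python
import numpy as np

fs = 48_000                       # sample rate, Hz
t = np.arange(fs) / fs            # one second of time
x = np.sin(2 * np.pi * 1000 * t)  # 1 kHz test tone

linear = 0.5 * x             # LTI behaviour: pure gain, no new frequencies
nonlinear = x - 0.01 * x**3  # weak cubic nonlinearity (assumed strength)

def tone_bins(y, threshold_db=-100):
    """Return the frequencies (Hz) of spectral bins above a dB threshold."""
    spec = np.abs(np.fft.rfft(y)) / len(y)
    freqs = np.fft.rfftfreq(len(y), 1 / fs)
    return freqs[20 * np.log10(spec / spec.max() + 1e-30) > threshold_db]

print(tone_bins(linear))     # the 1 kHz fundamental only
print(tone_bins(nonlinear))  # 1 kHz plus a 3rd harmonic at 3 kHz
```

The cubic term puts the 3rd harmonic roughly 52 dB below the fundamental, already audible-range-relevant for electronics, yet far stronger than anything plausibly attributable to a cable.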
 
Last edited:
I think if someone believes there is an audible difference with certain cables, they could use a short Y-cable adaptor with, say, a voltage divider of a couple hundred thousand ohms, place it on the output of the next device in line, and record the results of the cable swaps upstream. The recordings could be compared immediately afterwards in a program like Adobe Audition, etc. I'm surprised cable companies don't do this to validate their claims.
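The comparison described above amounts to a null test. A minimal sketch, with synthetic "captures" standing in for real recordings (the 0.001 dB level difference between the two takes is an invented figure): time-aligned captures are subtracted and the residual is expressed relative to the signal.

```python
import numpy as np

fs = 48_000
t = np.arange(fs) / fs
music = np.sin(2 * np.pi * 440 * t)    # stand-in for programme material

take_a = music                         # capture via cable A
take_b = music * 10 ** (-0.001 / 20)   # cable B: assumed 0.001 dB level offset

residual = take_a - take_b             # what differs between the two captures
null_depth_db = 20 * np.log10(
    np.sqrt(np.mean(residual**2)) / np.sqrt(np.mean(take_a**2))
)
print(f"null depth: {null_depth_db:.1f} dB")  # about -79 dB for 0.001 dB offset
```

In practice the captures would need sample-accurate alignment and level matching first; any real audible difference would leave a residual far above the noise floor.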
 
Or alternatively that they knew full well they were peddling pseudoscientific snake oil and didn't care - that they sincerely believed objective science could be circumvented to suit the purple prose of their white papers and ad copy, or that they didn't have a freaking clue.
Not sure which should be more infuriating.
Full disclosure - I fell for some of this claptrap myself in the mid-to-late '80s and early '90s. Fortunately, the financial constraints of raising two kids on only a middle-class joint income at the time prevented me from becoming fully engulfed in the miasmic rabbit hole that is "high-end audio tweak porn".
 
2. Or you can say cables are basically an LCR network. Inductors and capacitors are not linear if driven hard enough, and the dielectric materials are not linear in the strictest sense either. If cables are non-linear, then they can produce distortion just like other non-linear devices such as transistors.

Mmmmmmmmmmmyes...
But if you've managed to get the cable overheated, you're doing it wrong ;) :D
Correct cable thickness is important to reduce resistive losses and to meet safety guidelines.

Things happen.
I once managed to evaporate part of a 1 m long, 2.5 mm thick, well-insulated solid-core copper inductor on a battery bank (all the insulation went *poof*) - it was stubborn enough to move to the wrong place while I was fidgeting with my tools. I managed to save the batteries.
A former colleague managed to melt a wrench, also on a battery bank, battery did not survive.
 
There are two ways to look at this:

1. Cables are passive devices and therefore linear devices...

2. Or you can say cables are basically an LCR network. Inductors and capacitors are not linear if driven hard enough...

This is awkward. First you seem to say that passive devices are linear. Then you seem to say that inductors and capacitors are not linear, although they are passive. Why do you think that passive devices must be linear?
 
Inductors and capacitors are not linear if driven hard enough.

What does "driven hard enough" mean? The parasitic capacitance of a cable sees at most the full voltage an amp is capable of developing. Which is what, 100 V? Coupling caps in tube amps see much higher voltages and have far more capacitance - do you expect them to distort like crazy?
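A back-of-envelope estimate of how hard that parasitic capacitance is actually "driven" (the 100 pF figure is an assumed, typical value for about a metre of cable, not from this thread):

```python
import math

C = 100e-12   # assumed parasitic capacitance, F (~100 pF for ~1 m of cable)
V = 100.0     # worst-case amplifier output voltage, V
f = 20_000.0  # top of the audio band, Hz

# Current through a capacitance at angular frequency 2*pi*f: I = 2*pi*f*C*V
I = 2 * math.pi * f * C * V
print(f"{I * 1000:.2f} mA")  # ≈ 1.26 mA
```

Roughly a milliamp at the very worst case - a drive level orders of magnitude short of stressing any dielectric.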

As for the parasitic inductance, what possible mechanism can produce distortion in an air core inductor?

There will be some modulation of the cable resistance at high currents, but that is all I see.
 
Some people have heard something about distortion and are concerned about cable distortion. But at the same time they are not so concerned about the distortion of the component at the end of that same cable, i.e. loudspeaker distortion, which is about one million times higher. (The same goes for interconnects.)
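To put that "one million times" in perspective (both figures are assumed, order-of-magnitude illustrations only: roughly 1% THD for a loudspeaker at moderate level, and a cable figure one millionth of that):

```python
import math

speaker_thd = 1e-2  # assumed: ~1 % THD, typical loudspeaker at moderate level
cable_thd = 1e-8    # assumed: one millionth of the speaker figure

ratio = speaker_thd / cable_thd
print(f"{ratio:.0e}x, or {20 * math.log10(ratio):.0f} dB apart")
```

A factor of a million is 120 dB - worrying about the cable while ignoring the speaker is optimizing the wrong end of the chain by twelve orders of magnitude in power.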
 
Harmonic distortion has very little to do with the way we perceive sound unless it is very significant. It is of course absurd to attribute wire sound to harmonic distortion.

The old DIN 45500 standard was wise to set the border between lo-fi and hi-fi at 1% THD. Tightening that limit by an order of magnitude today, to 0.1%, seems about apt.
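For context on figures like that 1% limit, THD is simply the RMS sum of the harmonics relative to the fundamental. A minimal sketch with a synthetic 1 kHz tone carrying a deliberate 1% 2nd harmonic:

```python
import numpy as np

fs = 48_000
t = np.arange(fs) / fs
# 1 kHz fundamental plus a 1 % 2nd harmonic (synthetic, for illustration)
y = np.sin(2 * np.pi * 1000 * t) + 0.01 * np.sin(2 * np.pi * 2000 * t)

spec = np.abs(np.fft.rfft(y)) / len(y)
fundamental = spec[1000]                    # 1 s window: bin index = frequency
harmonics = spec[[2000, 3000, 4000, 5000]]  # 2nd..5th harmonic bins
thd = np.sqrt(np.sum(harmonics**2)) / fundamental
print(f"THD = {thd * 100:.2f} %")           # → 1.00 %
```

Real analyzers sum more harmonics and window the signal, but the arithmetic is the same.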
 
If you spend enough, you can get a perfect cable.
cable.jpg
https://twitter.com/pickover/status/1403504838055432195
 