Does making a distortion measurement of a cable make sense?

Okay then let's please be more realistic in our modeling, at least if we are going to quibble about what is physically possible and what isn't.

A start is the realization that we are after distortion of a piece of cable. Normally you measure distortion by applying a pure signal to the DUT input and measuring at the output, in this case the far end of the cable.
With just a cable that's not realistic, because with the far end open there is no load current and hence no distortion.

So we need to get some current through the cable so that it develops an output voltage we can measure, which means we need a load. But if you want to measure the cable and not the load's contribution, that load needs to be as linear as possible, and certainly not a speaker!

The other thing you need is a pure input signal, so for that you need a very low distortion, high performance power amp.

As in all serious tests, you need to establish a baseline with the DUT absent, so that would be the power amp connected directly to the linear load and measured to get the baseline. Then you insert the cable between amp and load and measure again.
If the results differ, you know it's the cable you are measuring.
Not very hard.
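For anyone who wants to put numbers on that comparison, here is a rough Python sketch of the bookkeeping. The sample rate, test frequency and the two capture arrays are placeholders; swap in your own measured data.

Code:
import numpy as np

FS = 192_000   # sample rate, Hz (assumed)
F0 = 1_000     # test tone, Hz
N  = FS        # one second of data

def thd(signal, f0=F0, fs=FS, n_harmonics=9):
    # THD as the ratio of the RMS sum of harmonics to the fundamental
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    bin_of = lambda f: int(round(f * len(signal) / fs))
    fundamental = spectrum[bin_of(f0)]
    harmonics = [spectrum[bin_of(k * f0)] for k in range(2, 2 + n_harmonics)]
    return np.sqrt(sum(h**2 for h in harmonics)) / fundamental

# Stand-ins for the two recordings (replace with real captures):
t = np.arange(N) / FS
baseline_capture = np.sin(2*np.pi*F0*t) + 1e-5*np.sin(2*np.pi*2*F0*t)
cable_capture    = np.sin(2*np.pi*F0*t) + 1.2e-5*np.sin(2*np.pi*2*F0*t)

print(f"baseline THD:   {thd(baseline_capture)*100:.5f} %")
print(f"with cable THD: {thd(cable_capture)*100:.5f} %")
# Anything beyond the measurement repeatability is attributable to the
# cable plus its connections.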

Jan
 
A start is the realization that we are after distortion of a piece of cable.

You mean HD? Why bother? Don't we already know that it is usually a connector problem, if it occurs at all?

The question most will face, if they care to check on it, is how and why 8' of speaker cable results in a different sound compared to 6" of Romex, and why all the 8' speaker cable designs sound different when compared to each other.

We already have at least one speaker cable thread on the forum that looked into the issue scientifically (it started out looking at ITD, IIRC). Measured cable inductance at low audio frequencies is not accurately predicted by the standard cable-geometry formulas; IIRC jneutron explained why in great detail back in one of the Blowtorch threads. For speaker cable, compensating zip cord with lumped reactances can help a lot. There is more to it than just that, though, as some proprietary research has shown.
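For anyone who wants to compare, below is a quick sketch of what the textbook geometry formula predicts for plain zip cord. The conductor radius and spacing are example numbers, not any particular cable, and as noted above the measured low-frequency value can come out noticeably different.

Code:
import math

MU0 = 4e-7 * math.pi   # permeability of free space, H/m

def zip_cord_inductance_per_m(spacing_m, radius_m):
    # Two parallel round wires: external inductance plus the
    # low-frequency internal inductance (mu0/8pi per conductor).
    external = (MU0 / math.pi) * math.acosh(spacing_m / (2 * radius_m))
    internal = MU0 / (4 * math.pi)   # both conductors together
    return external + internal

# Example dimensions: ~1.6 mm conductor diameter, ~3.5 mm centre-to-centre spacing
L = zip_cord_inductance_per_m(spacing_m=3.5e-3, radius_m=0.8e-3)
print(f"{L*1e9:.0f} nH/m")   # several hundred nH/m for typical zip cord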
 
I use optical phono along with electrostatic speakers. The effects are plainly audible under the circumstances here; maybe not as audible to other people or under other circumstances. You are always welcome to visit and see for yourself if you ever get the chance.
The proper word is "perceivable". It requires verification to confirm the audibility, which you didn't do. Until then, it may all be in your head, or simply an effect of a change in listening position, which can cause an audible difference without changing cables or DACs.
 
With a "junk" cable, speaker distortion would be 1.006% THD, and with a high-end cable it would be 1.000% THD.

Basically, in the end, you cannot hear it.

But obviously people can hear the differences.

Anyway, for those who demand measurements, I have a simple question:
How do you measure soundstage width, height, depth?
 
This works? If not, tell us...
 

Attachment: cinta-metrica-bremen-5-mts.jpg (a 5 m tape measure)
Skin depth is relevant at high frequencies such as radio frequencies above 1 MHz. It is not relevant below 100 kHz unless you have a lot of EMI noise on the line, and if the noise is so bad it is getting through the single-ended cable shielding you should really be using a differential connection anyway. Your audio amp should not be wasting power amplifying anything outside the audio range, so this is a load of baloney.

If you're using the cable for DSD512 at 24 MHz then you can start to worry about skin effect, but even then the frequency is low enough not to cause the kind of problems you see at GHz.

I looked up the maths on this to further my understanding:
a) skin depth shrinks with the square root of frequency (it does not depend on the current)
b) a 1 kHz signal has roughly a 2 mm skin depth in a copper conductor (about 0.65 mm at 10 kHz)

Now that's ignoring harmonics. Given the maths, the upper harmonics will ride closer to the surface of the conductor, so the harmonic energy concentrates more of its current in the surface layers. Having a better conductor on the surface would therefore make a difference - IF - the current is high and in the upper registers, or with square waves, if you can distinguish that at all.
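In case it helps anyone else checking the maths, here is a small sketch of the standard skin-depth formula for copper; note that there is no current term in it at all.

Code:
import math

RHO_CU = 1.68e-8         # copper resistivity, ohm*m
MU0    = 4e-7 * math.pi  # permeability of free space, H/m

def skin_depth_m(freq_hz, rho=RHO_CU, mu=MU0):
    # delta = sqrt(rho / (pi * f * mu))
    return math.sqrt(rho / (math.pi * freq_hz * mu))

for f in (100, 1_000, 10_000, 20_000, 1_000_000):
    print(f"{f:>9} Hz : {skin_depth_m(f)*1000:.2f} mm")
# Roughly 6.5 mm at 100 Hz, 2.1 mm at 1 kHz, 0.65 mm at 10 kHz,
# 0.46 mm at 20 kHz, 0.07 mm at 1 MHz.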

Sorry to the experienced and educated individuals ;) I'm slowly learning.
 
With a "junk" cable, speaker distortion would be 1.006% THD, and with a high-end cable it would be 1.000% THD.

Basically, in the end, you cannot hear it.
The threshold for audibility of non-linear distortion is more than a factor of 100 below the example given; it would be extraordinary if it wasn't audible!
What example?
I'd say you're a bit too confident of your own ability.
But obviously people can hear the differences.
What made it obvious?
 
There are two ways to look at this:

1. Cables are passive devices and therefore linear devices, which means they can be categorized as "LTI", or Linear Time Invariant. Therefore they cannot cause distortion: put a sine wave in, you get a sine wave out, no more, no less. So most cables would have very low distortion, certainly below the human hearing threshold.

2. Or you can say cables are basically an LCR network. Inductors and capacitors are not linear if driven hard enough, and the dielectric materials are not linear in the strictest sense either; if driven hard enough, they can be non-linear. And if cables are non-linear, then they can produce distortion just like non-linear devices such as transistors.

But regardless, even if cables are not linear in the strictest sense, I would assume if you try to measure distortion of a cable, the level is very low for most cables.
A non-linear current flows through the speaker cable due to the fact that the load has non-linear resistance. Thus, even if the resistance of the cable is linear, the voltage drop across the cable will be non-linear, and this will add additional distortion at the speaker terminals. If the resistance of the cable is non-linear, such as due to skin effect, etc., the situation worsens even more.
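A tiny numerical sketch of that mechanism is below. The load model is completely made up just to produce some low-order distortion; the point is only that a perfectly linear series resistance carrying a distorted current still puts harmonics onto the terminal voltage.

Code:
import numpy as np

FS, F0, N = 96_000, 1_000, 96_000
t = np.arange(N) / FS

R_CABLE = 0.1                        # ohms, round-trip cable resistance (assumed)
v_amp = 2.83 * np.sin(2*np.pi*F0*t)  # ideal, distortion-free amp output

def load_current(v):
    # Hypothetical mildly non-linear load: nominally 8 ohms, with a
    # resistance that varies slightly with signal level.
    r_load = 8.0 * (1 + 0.05 * np.tanh(v / 4))
    return v / r_load

i = load_current(v_amp)              # cable drop ignored in the load current, for clarity
v_terminals = v_amp - R_CABLE * i    # voltage actually seen by the speaker

def harmonic_ratio(x, k):
    s = np.abs(np.fft.rfft(x * np.hanning(len(x))))
    b = lambda f: int(round(f * len(x) / FS))
    return s[b(k * F0)] / s[b(F0)]

for k in (2, 3):
    print(f"H{k}: amp output {harmonic_ratio(v_amp, k)*100:.5f} %, "
          f"speaker terminals {harmonic_ratio(v_terminals, k)*100:.5f} %")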
 
...
Assuming most of us listen to direct-radiator speaker types.
Speaker distortion is already at 1 to 3%.
This idea that a THD measurement correlates well with audible distortion is one of the worst misconceptions in audio.

3% THD can be inaudible if it's all 2nd to 4th harmonics, as it usually is in dynamic loudspeakers, or it can sound awful if it comes from a lot of high harmonics, as in a bad case of solid-state amplifier crossover distortion. The cassette tape hiss example is also a good illustration of how distortion can become audible. The reason low-order harmonics are hard to hear is the ear's masking effect: the fundamental masks the harmonics closest to it.
https://en.wikipedia.org/wiki/Auditory_masking

Check out the GedLee metric compared to THD.

I actually agree that different speaker cables with decent-sized conductors and solid connections have no significant sound differences, but any possible difference has nothing to do with THD measurements. A well-connected cable has a THD of basically zero.
 
A non-linear current flows through the speaker cable due to the fact that the load has non-linear resistance. Thus, even if the resistance of the cable is linear, the voltage drop across the cable will be non-linear, and this will add additional distortion at the speaker terminals.

Agreed, that is the reason you can only measure true cable distortion with a linear non-distorting load.

If the resistance of the cable is non-linear, such as due to skin effect, etc., the situation worsens even more.

I don't think skin effect can cause non-linear harmonic distortion. But it can cause linear frequency response/phase response distortion.
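To illustrate the distinction, here is a quick sketch using an invented series resistance that rises with the square root of frequency, skin-effect style: it alters the frequency and phase response, but it creates no new harmonics and the output scales exactly with the drive level.

Code:
import numpy as np

FS, N = 96_000, 96_000
freqs = np.fft.rfftfreq(N, 1/FS)     # 1 Hz bins

def through_cable(v, r_dc=0.1, r_ac_coeff=1e-3):
    # Frequency-dependent but amplitude-independent series resistance
    # into an 8-ohm resistive load, applied per frequency bin.
    r = r_dc + r_ac_coeff * np.sqrt(freqs)   # rises ~sqrt(f), like skin effect
    V = np.fft.rfft(v)
    return np.fft.irfft(V * 8.0 / (8.0 + r), n=N)

t = np.arange(N) / FS
v1 = np.sin(2*np.pi*1000*t)
out1 = through_cable(v1)
out2 = through_cable(2 * v1)

spectrum = np.abs(np.fft.rfft(out1))
print(f"2nd harmonic added: {spectrum[2000]/spectrum[1000]:.2e}")   # numerical noise only
print(f"doubling the input exactly doubles the output: {np.allclose(out2, 2*out1)}")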

Jan
 