Speaker cables don't influence harmonic distortion!

BesPav said:
The nonlinearity is caused by wire resistance.
No. You disagree with me (and physics).

Of course - no.
You now agree with me (and physics). So what do you think? You can't agree with something and its exact opposite.

Nonlinearity of speaker impedance doesn’t cause signal distortion. It can cause phase lag, it can cause huge current, but not distortion.
No. It is the nonlinearity of the speaker impedance which is being measured. Nonlinearity causes distortion.

Our measurement setup sees part of the voltage drop, caused by that current across the wire resistance (or divided off the load voltage), as distortion.
This is precisely what I have been trying to teach you.

Distortion was caused by bad damping.
No. Distortion is created by nonlinearity (in the speaker). This distortion is then rendered visible by resistance in the circuit (which also affects the damping). Poor damping does not cause nonlinearity and so cannot cause distortion.
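To put that in a minimal, made-up illustration (the notation is mine, not from any measurement): suppose the speaker's own nonlinearity makes it draw a current i(t) = i_fund(t) + i_h(t), where i_h is a small harmonic component. The voltage at the speaker terminals is v_spk = v_amp - (R_out + R_cable)·i(t), so the harmonic voltage you measure there is (R_out + R_cable)·i_h. The series resistance is perfectly linear, yet it sets how much of the speaker-generated harmonic current shows up as measurable voltage; as R_out + R_cable goes to zero the visible distortion vanishes, even though the speaker's nonlinearity has not changed.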

xrk971 said:
Pure conducting ideal wires are different than real cables which have intrinsic parasitic capacitance and inductance.
Cable capacitance and inductance are small (for speaker cables) and fairly linear, so not a source of distortion.

They also have metal to metal terminations of dissimilar metals and joints etc. I don’t think all of these system effects are linear.
Good joints are linear, otherwise most modern technology could not work.

You seem to want to believe that cables cause distortion. Why? Science says no. Technology says no. Careful measurements say no. Why choose to believe the opposite of the truth?
 
There are people who advocate going to the opposite extreme: current driving of speakers. Their logic is that it is current which moves the voice coil, so if the current is set by the amplifier (instead of the voltage) then the result will be more transducer linearity.
There is a good reason why, at high displacements of the voice coil, this can be true. Many tests have shown that some drivers distort less when current driven (with the drawback that electrical damping is lost and the frequency response needs compensation) and others distort less when voltage driven. Anyone considering current drive should first check the distortion in each mode.
I am not convinced that they are right. Speaker nonlinearity has various causes but there is no a priori reason to assume that current drive is better. For one cause of nonlinearity (voice coil displacement in a non-uniform magnetic field) I think voltage drive is better: as the voice coil moves out of the gap it will generate less force but also less back emf so with a conventional system the current will increase to partly compensate; a current-driven system cannot compensate. Suspension nonlinearity could act in the same way: if anything tends to restrict the motion this will increase the current in a conventional system, but not a current-driven system.
At large displacements, the B in Bl decreases. The back-EMF then falls and the impedance decreases. In voltage-driven mode there is more current in the voice coil, and the resulting force tends to push it further out of the gap. Although rarely discussed, this effect has long been known; it is called driver instability. Avoiding it is the main advantage of current drive.
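As a quick, purely illustrative sanity check of the two arguments above (the voltage-drive compensation and the loss of force under current drive), here is a minimal quasi-static sketch; all values (Re, Bl, drive level, velocity) are arbitrary assumptions, not data for any real driver:

Code:
# Toy quasi-static comparison: force from a voice coil whose Bl has dropped
# because the coil is leaving the gap, under voltage drive vs current drive.
# All numbers are illustrative assumptions, not measurements.

Re  = 6.0    # voice-coil resistance, ohms (assumed)
Bl0 = 7.0    # Bl at the rest position, T*m (assumed)
V   = 4.0    # instantaneous amplifier voltage, volts (assumed)
vel = 0.3    # instantaneous cone velocity, m/s (assumed)

# Current-drive level chosen so both modes give the same force at Bl = Bl0.
I_const = (V - Bl0 * vel) / Re

for frac in (1.0, 0.9, 0.7, 0.5):          # Bl reduced to this fraction of Bl0
    Bl = frac * Bl0
    i_v = (V - Bl * vel) / Re              # voltage drive: less back-EMF -> more current
    F_v = Bl * i_v                         # force partly recovers
    F_i = Bl * I_const                     # current drive: force falls directly with Bl
    print(f"Bl = {Bl:4.1f}  F(voltage drive) = {F_v:5.2f} N   F(current drive) = {F_i:5.2f} N")

The extra current that restores the force under voltage drive is, of course, the same extra current that can push the coil further out of the gap - the instability mechanism described above - so the sketch shows the trade-off rather than settling the argument.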
That is one of the beauties of audio: whatever the physics says, there are always people who take the opposite view.
You see above how physics can be subtle. Note that some great names have investigated driver behavior in current mode.
 
The results are rather interesting.
  • The distortion increase is significantly more noticeable at 1 kHz than at 20 kHz;
  • The harmonic that grows the most is H3 (H2 even goes down in some cases);
  • The effect is much less noticeable with a low-feedback amplifier than with a "normal" high-feedback amplifier. Distortion at 20 kHz even goes down with a longer cable when the feedback loop gain is low (18 dB in my case).

Hi Valeriy,

I was not courageous enough to read the whole thread but let me bring my 12 cents:

- I hope you are measuring with a differential input connected directly at the load, to avoid, in particular, the influence of the signal return wire,
- if the wires to the load have considerable impedance, they reduce the effective damping factor of the amplifier and the measured distortion rises, because the speaker's non-linearity, reflected as a non-linear speaker current, has a greater effect on the resulting distortion; please see
Current drive of speakers and speaker distortion

Cables themselves have no distortion; it is all about the conditions in the complete circuit.

A similar thing happens if you connect headphones to a headphone amplifier with a higher output impedance, say 50 ohms, versus an amp with something like 1 ohm output impedance. You will measure higher distortion in the 50-ohm case, because the headphone driver's non-linear current results in higher voltage distortion across the higher impedance. With the lower output impedance the damping is better and the measured voltage distortion is lower.
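Here is a minimal numeric sketch of that divider effect. The driver model is a deliberately crude assumption (a nominally 32-ohm load drawing a slightly cubic current); the only thing varied is the amplifier's output impedance, which stays perfectly linear:

Code:
# Toy model: a "headphone driver" drawing a slightly nonlinear current, fed from
# two different (perfectly linear) output impedances. The load model and all
# numbers are assumptions for illustration only.
import numpy as np

fs, f0, n = 48000, 1000, 48000
t = np.arange(n) / fs
v_src = np.sin(2 * np.pi * f0 * t)                 # ideal 1 V peak source

RL, k = 32.0, 0.002                                # nominal load resistance, cubic term (assumed)

def driver_voltage(v_src, Rout, iters=30):
    # Solve v_d = v_src - Rout * i with i = v_d/RL + k*v_d**3 by fixed-point iteration.
    v_d = v_src / (1 + Rout / RL)                  # linear-divider starting guess
    for _ in range(iters):
        v_d = (v_src - Rout * k * v_d**3) / (1 + Rout / RL)
    return v_d

def thd_percent(v, nharm=5):
    spec = np.abs(np.fft.rfft(v))
    k0 = round(f0 * len(v) / fs)                   # fundamental lands exactly on a bin
    harm = np.sqrt(sum(spec[h * k0] ** 2 for h in range(2, nharm + 1)))
    return 100.0 * harm / spec[k0]

for Rout in (1.0, 50.0):                           # the two output impedances in question
    v_d = driver_voltage(v_src, Rout)
    print(f"Rout = {Rout:4.0f} ohm -> THD at the driver terminals ~ {thd_percent(v_d):.3f} %")

The load's nonlinearity never changes; only the linear source impedance does, yet the measured voltage THD at the driver terminals rises with the higher output impedance - exactly the "conditions in the complete circuit" point.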
 
I was not courageous enough to read the whole thread

Hi Pavel,

Those were the "strange" results we started with. Later on, I identified the problem with the analyzer's input connection - described here:
Correct measurements at the far end of the cable

Finally, I ran a series of tests with a real speaker at low frequencies, demonstrating the non-linearity of the speaker impedance and the fact that a cable with higher Z allows more noticeable distortion at the speaker end (which is practically a divider exercise):
Real speaker spectrums measurement
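Just to illustrate the divider with made-up round numbers (not the actual figures from those measurements): if the speaker's nonlinearity generates, say, 10 mA of third-harmonic current, that current develops about 1 mV across a total series resistance (amp output + cable) of 0.1 ohm, but 5 mV across 0.5 ohm. Against a 2.83 V fundamental at the speaker terminals that is roughly -69 dB versus -55 dB - the same speaker nonlinearity, simply made more visible by the larger linear series resistance.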

These measurements are actually in line with the ones presented in your article.

Cheers,
Valery
 
I would like to propose that, if it were possible, someone would measure a system of this type:

6moons audioreviews: Mr. Fussball’s budget system

The amplifier has no feedback, so there is no influence there, and the speakers are full-range without crossovers, so there is no influence there either.

But the secondary of the output transformer, connected directly to the speaker coil, forms a balanced system, and that's the interesting part. Does this reduce the noise and distortion compared with a 'normal' connection (feedback + crossovers), or does it have no influence? If it produces any improvement, this would be a very interesting type of system.

Thanks in advance
 
All amps have feedback. And all speakers are driven balanced. If you grounded one side of the output transformer it would still be balanced. You would need to ground the negative speaker terminal to a 3rd pin power ground to unbalance it.
 
BesPav said:
Read carefully. The observed distortion was caused by the interaction between speaker nonlinearity and cable resistance, while the latter also makes it visible.
No. You still seem to be asserting that the cable, which you admit is linear, has a part to play in causing the distortion. You are arguing against yourself.

forr said:
You see above how physics can be subtle. Note that some great names have investigated driver behavior in current mode.
Thanks for your explanations.
 
the cable, which you admit is linear, has a part to play in causing the distortion.
Yes, you’re clearly right now.
The cable is linear, and it is the single reason for the distortion. Its resistance prevents the secondary current from the driver's coil from being shorted/dumped into the low output resistance of the amplifier.

When the speaker is connected to the amp without a long/resistive cable, there is no distortion.

Open your eyes: the cable resistance, while being linear, nevertheless simultaneously causes the distortion and makes it visible.

Boring now, thank you for our conversation.
 