John Curl's Blowtorch preamplifier part III

Status
Not open for further replies.
If I were to come across a friend (well, ok, I'm stretching it 😀) who found that his stereo imaging was unstable, and he was using zip cord, I would do several things:
1. Connect both speakers to one channel. Does the image now remain central and fixed? If so, the speakers are ok.
2. Return to both channels and use a mono signal. Is the image still central and fixed? If so, the amplifiers and speakers are both ok and work well together.
3. Flip back to stereo, confirm the imaging problem returns, and determine whether it is volume dependent.

If a speaker is, by design, acting in a non-linear and/or time-variant way, the next thing to try is modifying the cables.

I would double up the zips, halving the characteristic impedance of the cable driving the load. For anybody who doesn't like that terminology: instead of "halving the cable impedance," think of it as doubling the capacitance and halving the inductance by paralleling a second zip across each cable.

If that makes a change, double up again and listen.

In my world, you are now running the loads off a cable whose characteristic impedance is quite close to the middle of the load's range, 37 ohms give or take. For those who do not like the terminology, think of it as having quadrupled the cable capacitance and quartered the inductance.

While both are equivalent, and the results the same, those who do not understand T-lines well will be happy with the C*4, L/4 explanation.

Both will model the same. What you have done is reduce the settling-time variance (or LCR time-constant variance) caused by the load and its time-dependent impedance variation.
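The paralleling arithmetic above can be sanity-checked with the lossless-line formula Z0 = sqrt(L/C). A minimal sketch, assuming round per-metre figures for ordinary zip cord (0.6 µH/m and 60 pF/m, giving Z0 near 100 ohms; real cords vary):

```python
import math

def z0(L_per_m, C_per_m):
    """Lossless characteristic impedance: Z0 = sqrt(L/C)."""
    return math.sqrt(L_per_m / C_per_m)

# Assumed round numbers for ordinary zip cord (not from the post):
# 0.6 uH/m and 60 pF/m, i.e. Z0 near 100 ohms.
L_zip, C_zip = 0.6e-6, 60e-12

for n in (1, 2, 4):
    # n paralleled runs: inductance divides by n, capacitance multiplies by n
    print(f"{n} zip(s): Z0 = {z0(L_zip / n, C_zip * n):.0f} ohm")
```

With these assumed numbers a single zip is 100 ohms, two paralleled runs give 50, and four give 25, in the same ballpark as the 37-ohm figure above.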

jn
 
So, is this a fair summary in the Pavel / JN discourse, for want of a better name... (BTW - it's a great discussion - real technical aspects being discussed, not the usual Bybee / Goop / audiogonzo stuff, by good engineers):

- There is a measurable factor, which is that there can be a variation in the signal delay along a reasonable cable length, which can be simulated when the line is considered as a transmission line, and can be measured.

- The key factor is, can this be heard? That seems to be the point of potential disagreement?

And, what cable length reduces this to irrelevance?
My amp is currently set between the speakers, cables both approximately 1.2 meters.
Once the triamp setup is built (don't ask how long that's been "nearly ready") that will be irrelevant of course... 🙂
 
Since the only way it can be heard is if the delay is different on each channel, simply ensure it isn't.
I believe JN is arguing that a loudspeaker load is sufficiently non-linear to be level- and diaphragm-position-dependent, so that the different signals to stereo speakers are potentially enough to cause something audible.

All the discussion about transmission lines, etc., is, in this view, only because the cables are the "top" of the voltage divider, so mindful choices could potentially minimize the timing issues.

All good fortune,
Chris
 
Twisted-pair cables generally have something like a 100R characteristic impedance. I find a 100R termination resistor at each end of the speaker cables makes for a cleaner sound, same with 75R for coax used as speaker cable.

Just sayin'.
Dan.
Assuming this is "valid for all frequencies [of interest]", I have to wonder what the effective resistance is when paralleled with the amplifier output (maybe 0.001 to 500 ohms, DC to 10 MHz) at one end and the speaker (perhaps 2 to 500 ohms over the same frequency range) at the other, and how that affects the 100 ohms that "terminates" the cable.

Limiting the frequencies of interest to the audio range, the impedance already connected on each end is well below 100 ohms, so all the resistor does is dissipate a small amount of power.

But if you hear an improvement, it must make for perfect termination.
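The point about the terminator being swamped is easy to put numbers on. A rough sketch with assumed round figures (0.01-ohm amp output impedance, 8-ohm nominal speaker; neither value comes from the posts above):

```python
def parallel(*rs):
    """Equivalent resistance of resistors in parallel."""
    return 1.0 / sum(1.0 / r for r in rs)

# Assumed round figures: amp output ~0.01 ohm at audio, speaker ~8 ohm
# nominal, proposed terminating resistor 100 ohm.
amp_end = parallel(0.01, 100.0)   # what the cable sees at the amp end
spk_end = parallel(8.0, 100.0)    # what the cable sees at the speaker end
print(f"amp end: {amp_end:.4f} ohm, speaker end: {spk_end:.2f} ohm")
```

At the amp end the 100-ohm resistor changes the impedance the cable sees by about 0.01%; at the speaker end it only pulls 8 ohms down to about 7.4, so neither end is anywhere near a 100-ohm termination.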
 
- There is a measurable factor, which is that there can be a variation in the signal delay caused by a reasonable cable length, which can be simulated when the line is considered as a transmission line, and can be measured.

- The key factor is, can this be heard? That seems to be the point of potential disagreement? While it is a relevant question, the discussion has yet to proceed to the point where that question is being raised.
And, what cable length reduces this to irrelevance?
For me, any length or cable type that reduces the system settling time to below 1 µs over the full load-impedance variation is sufficient. Also, this matters only within the band of audio frequencies where we use ITD for localization; if we exceed 1 µs of ITD at 20 Hz, who cares, we can't detect that anyway.
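One way to put numbers on a settling-time criterion like this is a crude bounce-diagram estimate: each round trip on the line is attenuated by the product of the source and load reflection coefficients, so settling time depends on how badly the load mismatches the cable. This is a sketch only, with every figure (cable Z0, load extremes, propagation velocity, tolerance) assumed rather than taken from JN's model:

```python
import math

def settle_time(length_m, z0, z_load, z_src=0.0, v=2e8, tol=0.01):
    """Time for successive end-to-end reflections to decay below tol of the
    initial step (lossless line, purely resistive terminations assumed)."""
    g_load = (z_load - z0) / (z_load + z0)   # load reflection coefficient
    g_src = (z_src - z0) / (z_src + z0)      # source reflection coefficient
    g = abs(g_load * g_src)                  # decay factor per round trip
    one_way = length_m / v                   # one-way propagation delay
    if g == 0.0:
        return one_way                       # matched: settles on first arrival
    if g >= 1.0:
        return math.inf                      # never settles in this idealization
    k = math.ceil(math.log(tol) / math.log(g))
    return one_way + 2 * k * one_way

# 3 m of ~100-ohm zip into loads at the extremes of a speaker's impedance
# swing (4 and 40 ohms assumed; v ~ 2e8 m/s for typical insulation)
for z in (4.0, 40.0):
    print(f"Z_load = {z:4.1f} ohm -> settles in {settle_time(3.0, 100.0, z) * 1e9:5.0f} ns")
```

With these assumed numbers the settling time varies by roughly an order of magnitude between the load extremes, which is the kind of load-dependent variance being discussed.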

jn
 
My takeaway is that the resistance comes from having been taught in school that the T-line approach is useless when the line is << the wavelength within the line.

jn
Given that cable goofiness was in full swing when I learned T-line theory, and that I was an impressionable type, it took me a long time to stop worrying about that and start worrying about the important things.

Hands up anyone who, after a few beers has made interconnects out of solder!
 
- The key factor is, can this be heard? That seems to be the point of potential disagreement? While it is a relevant question, the discussion has yet to proceed to the point where that question is being raised.
And, what cable length reduces this to irrelevance?
For me, any length or cable type that reduces the system settling time to below 1 uSec for the full load impedance variation is sufficient. Also, only within the band of audio frequencies where we use ITD for localization. If we exceed 1 uSec ITD at 20 hz, who cares, we can't detect that anyway.

40 years ago settling time was more of an issue with low-characteristic-impedance cables such as Polk Cobra Cable and Mogami Wire, where unterminated cable lengths resonated somewhat above 1 MHz, driving the smoke out of various wide-bandwidth/high-slew-rate amplifiers.

I think 0.25 µs was pretty safe...
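The resonance of an unterminated line sits at its quarter-wave frequency, f = v/(4·length). A quick sketch, with the propagation velocity assumed at 2×10⁸ m/s (typical for plastic insulation; not a figure from the post):

```python
# Quarter-wave resonance of an open-ended (unterminated) line:
# f = v / (4 * length).  v ~ 2e8 m/s assumed for typical cable insulation.
v = 2e8
for length in (5.0, 10.0, 30.0):
    print(f"{length:4.1f} m -> first resonance at {v / (4 * length) / 1e6:.2f} MHz")
```

With these assumptions a long 30 m run resonates near 1.7 MHz, consistent with the "somewhat above 1 MHz" recollection for unterminated lengths.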
 
Given that cable goofiness was in full swing when I learned T-line theory, and that I was an impressionable type, it took me a long time to stop worrying about that and start worrying about the important things.
For some, it never stops.
It was intended as a thought experiment only, in case that wasn't obvious.
Have you experimented with the more important things in an audio reproduction system, like speakers and room acoustics?
 
Based on what I said earlier, that ITD is really only of importance below 1 kHz (for most people), is it not the characteristic impedance of the cable in the frequency range from 1 kHz down that we need to be concerned with and try to bring nearer to the loudspeaker impedance?

Looking at Bateman's paper "How “Electrically Long” is a 4.9 metre long Speaker Cable?", where he measured the impedance of various cables at various frequencies, we don't see too much of an impedance range between 100 Hz and 1 kHz.
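The frequency dependence Bateman measured is what the lossy-line formula predicts: below the corner where ωL overtakes the series resistance, the magnitude of the characteristic impedance rises. A sketch with assumed per-metre constants (not Bateman's measured values):

```python
import cmath
import math

def z0_lossy(f, R, L, C, G=0.0):
    """Characteristic impedance of a lossy line: sqrt((R + jwL) / (G + jwC))."""
    w = 2 * math.pi * f
    return cmath.sqrt((R + 1j * w * L) / (G + 1j * w * C))

# Assumed per-metre constants for a ~100-ohm zip (illustrative only):
# R = 0.01 ohm/m, L = 0.6 uH/m, C = 60 pF/m
for f in (100.0, 1e3, 10e3, 1e6):
    print(f"{f:>9.0f} Hz: |Z0| = {abs(z0_lossy(f, 0.01, 0.6e-6, 60e-12)):6.1f} ohm")
```

With these assumed constants |Z0| settles near its RF value of 100 ohms by the upper audio band but rises steeply toward lower frequencies; how flat any real cable is between 100 Hz and 1 kHz depends on its actual R, L and C.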
 
40 years ago settling time was more of an issue with low-characteristic-impedance cables such as Polk Cobra Cable and Mogami Wire, where unterminated cable lengths resonated somewhat above 1 MHz, driving the smoke out of various wide-bandwidth/high-slew-rate amplifiers.

I think 0.25 µs was pretty safe...
When they first started thinking about speaker cables and T-lines, it seemed the drive was to bring the cable impedance down to the speaker impedance. Unfortunately, that required inductances in the 10 nanohenries-per-foot arena, but worse, capacitance had to be in the hundreds of picofarads per foot, 300 plus IIRC. All was great as long as the cable saw a nominal match at the load up to the amplifier's unity-gain frequency. Unloaded, all the amp saw was capacitance, lots of it. No Zobel... no amp.

I figure any amp on the planet can handle 4 parallel zips.
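For scale, here is the unloaded capacitance of the paralleled-zip approach, with an assumed 60 pF/m per run over an assumed short domestic length (both figures illustrative, not from the post):

```python
# An unterminated cable looks almost purely capacitive to the amplifier.
# Assumed 60 pF/m per zip run; 4 parallel runs of 3 m:
runs, length_m, c_per_m = 4, 3.0, 60e-12
c_total = runs * length_m * c_per_m
print(f"{c_total * 1e12:.0f} pF total")
```

A few hundred picofarads total, versus 300+ pF *per foot* for the old matched low-impedance cables, which is why the paralleled zips stay benign for the amplifier.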

Jn
 
Based on what I said earlier, that ITD is really only of importance below 1 kHz (for most people), is it not the characteristic impedance of the cable in the frequency range from 1 kHz down that we need to be concerned with and try to bring nearer to the loudspeaker impedance?

Yes, I believe that would be correct. While I'm not sure about the 1 kHz number, I'm sure it's not hard to get good enough for all humans using, say, 35-50 ohm cables, or three to four parallel runs per speaker. That even allows going up a gauge, as you do have more copper.
I'd try to keep ITD errors down up to 10 kHz, give or take. I always like to stay at least an order of magnitude better than spec, three if possible.

It's fun giving a presentation where the questioners are concerned about you even making target...and then your next slide shows measured performance three orders of magnitude better. Classic deer in the headlights..😀

Jn
 
We are talking about a load (the speaker + crossover system) that is introducing severe phase angle changes within the audio band - as would be fully expected - NOT the cable.

The cable introduces no transmission-line effects at audio, and the lumped cable RLC is totally swamped by the speaker load.

Please separate these things.
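The "swamped" claim is easy to check with lumped values. A sketch with assumed zip-cord constants (0.6 µH/m, 0.02 Ω/m loop resistance, 3 m run; all figures illustrative):

```python
import math

# Lumped series impedance of 3 m of zip at the top of the audio band,
# next to a nominal 8-ohm speaker.  Assumed figures: 0.6 uH/m inductance,
# 0.02 ohm/m loop resistance (not from the posts above).
length_m, L_per_m, R_per_m = 3.0, 0.6e-6, 0.02
w = 2 * math.pi * 20e3   # 20 kHz in rad/s
z_cable = complex(R_per_m * length_m, w * L_per_m * length_m)
print(f"|Z_cable| = {abs(z_cable):.3f} ohm, versus an 8 ohm load")
```

Roughly a quarter of an ohm in series with 8 ohms even at 20 kHz, i.e. a few percent, which is the sense in which the lumped cable RLC is swamped.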
 