Preamps: SS versus tubes

But the deviation of the signal is a function of the source vs. load impedance. Not how much current you may have flowing at the time.

...

What distinguishes the interconnect example from the speaker cable example isn't "current." It's the difference in source/load impedance ratios.

se

That's another way to look at it and certainly a useful one as well. But the load impedance is what determines the current through the divider. The divider divides because of IR drop through the cable as well as the load. When you ratio it out, the source voltage drops out, for sure, but the whole reason the ratio is lower on the speaker end is because of current. High current -> low resistance. In V = IR, none of the variables are privileged. The views are absolutely equivalent.
 
That's another way to look at it and certainly a useful one as well. But the load impedance is what determines the current through the divider.

No, the whole end-to-end impedance of the divider determines the current through it. Including the output impedance of the source which you left out in both examples.

The divider divides because of IR drop through the cable as well as the load. When you ratio it out, the source voltage drops out, for sure, but the whole reason the ratio is lower on the speaker end is because of current.

Only in the sense that you have to have current to have a voltage drop across an impedance.

High current -> low resistance. In V = IR, none of the variables are privileged. The views are absolutely equivalent.

But I = V/R, so high current may just as well mean high voltage as low resistance.

The only thing fixed in the two examples is the ratio of the upper and lower impedances of the voltage divider, and it's that ratio that determines the signal deviation regardless of how much current is flowing through the divider: you can use any arbitrary source voltage, giving any arbitrary current, and you'll end up with the same deviation.

To explain the differences in terms of "current" is perhaps the most abstruse way of doing it I can think of.
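To put numbers on the two examples (the impedances below are illustrative, not taken from either post): the fraction of the source voltage reaching the load depends only on the impedance ratio, so scaling the source voltage, and hence the current, leaves it unchanged.

```python
# Illustrative voltage dividers: interconnect vs. speaker cable.
def delivered_fraction(z_source, z_cable, z_load):
    """Fraction of the source voltage appearing across the load."""
    return z_load / (z_source + z_cable + z_load)

# Interconnect: 100 ohm source, ~0.1 ohm of cable resistance, 10k load.
ic = delivered_fraction(100, 0.1, 10_000)

# Speaker: 0.05 ohm amp output, 0.1 ohm of cable, 8 ohm load.
sp = delivered_fraction(0.05, 0.1, 8)

print(f"interconnect: {ic:.5f} of the source voltage")  # ~0.99009
print(f"speaker:      {sp:.5f} of the source voltage")  # ~0.98160

# Doubling the source voltage doubles the current through the divider
# but leaves both fractions exactly as they are.
```

Both views in the thread are consistent with this: the current changes with the source voltage, the delivered fraction does not.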

se
 
Simple.

Current.

😀

se
I was hoping that we could keep this thread on-topic; it is about preamps, not amps. But extra current is not always an advantage of transistors. What you are describing is the amp as a voltage source, and that model is not universal.

It is true that the industry tried to move to that model to reduce speaker frequency-response variables, but the problem is that, for some reason, in the last half century the 'prior art' (tubes) has failed to go away the way 'obsolete' technologies usually do. This is because there is more to the world of audio than flat frequency response on an acoustic suspension speaker; for example, the same kind of amp won't get you flat frequency response on horns or ESLs.
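To illustrate the voltage-source point numerically, here is a small sketch with made-up values: a speaker whose impedance varies with frequency, driven by amps with two assumed output impedances. The higher the output impedance, the more the frequency response tracks the speaker's impedance curve.

```python
import math

# Hypothetical speaker impedances (ohms): a bass-resonance peak, a
# nominal midband value, and a rising-treble region.
speaker_z = {"60 Hz": 30.0, "1 kHz": 8.0, "10 kHz": 16.0}

def level_db(z_out, z_speaker):
    """Level (dB) delivered to the speaker from a fixed source EMF."""
    return 20 * math.log10(z_speaker / (z_out + z_speaker))

for z_out in (0.1, 3.0):  # low-Z solid-state vs. an assumed tube amp
    levels = [level_db(z_out, z) for z in speaker_z.values()]
    spread = max(levels) - min(levels)
    print(f"Zout = {z_out} ohm -> response varies by {spread:.2f} dB")
```

With these numbers the 0.1 ohm amp varies under a tenth of a dB, while the 3 ohm amp varies by nearly 2 dB; whether that variation is a flaw or a feature depends on the speaker it was designed for, which is the point being made about horns and ESLs.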
 
What distinguishes the interconnect example from the speaker cable example isn't "current." It's the difference in source/load impedance ratios.

se

Huh?? If you are running a 600 ohm balanced line, trust me there is a measurable current. Even though there **is** a characteristic impedance of interconnect cables, we usually ignore it in most interconnects. In the balanced line standard, the 600 ohm termination arose from the spacing of open-wire lines in free air, strung from telephone poles.

But the fact of the matter is that if you terminate a balanced line at 600 ohms, you can usually run it for some miles before the artifacts of the cable become apparent. That is in fact the raison d'être of the balanced line system: to eliminate interconnect cable artifacts.
 
Huh?? If you are running a 600 ohm balanced line, trust me there is a measurable current.

I didn't say there was no current. I said it wasn't current that distinguished the interconnect example from the speaker cable example.

Even though there **is** a characteristic impedance of interconnect cables, we usually ignore it in most interconnects.

Not at audio frequencies when line lengths are what you find in a typical audio system. They simply don't behave like transmission lines so we just use the lumped parameters of resistance, inductance and capacitance.

In the balanced line standard, the 600 ohm termination arose from the spacing of open-wire lines in free air, strung from telephone poles.

Yes, which could run for many miles where they did begin to behave like transmission lines at audio frequencies.
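As a rough check of when "miles" starts to matter: a common rule of thumb says transmission-line behavior sets in once the cable length approaches about one tenth of a wavelength in the cable. The velocity factor below is an assumed typical value.

```python
# Rule of thumb: transmission-line behavior matters once the cable
# length approaches ~1/10 of a wavelength in the cable.
C_LIGHT = 3.0e8          # speed of light, m/s
VELOCITY_FACTOR = 0.66   # assumed; varies with cable construction
f_max = 20_000           # top of the audio band, Hz

wavelength_m = C_LIGHT * VELOCITY_FACTOR / f_max
threshold_m = wavelength_m / 10

print(f"wavelength at 20 kHz: {wavelength_m / 1000:.1f} km")
print(f"line effects begin around {threshold_m:.0f} m")
```

Roughly a kilometer at the very top of the audio band, and proportionally longer at lower frequencies: miles-long telephone pairs qualify, domestic interconnects do not.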

But the fact of the matter is that if you terminate a balanced line at 600 ohms, you can usually run it for some miles before the artifact of the cable becomes apparent. That is in fact the raison d'etre of the balanced line system: to eliminate interconnect cable artifact.

That's all irrelevant unless your cables are thousands of feet long.

se
 
That's all irrelevant unless your cables are thousands of feet long.

se
No, quite the contrary. I've proven many times over the last 20 years, since we built our first preamp that could support the standard, that it makes a huge difference in interconnect cables. It's like I said way back in earlier parts of this thread: if you can hear differences in your cable, then it's not being driven by a preamp that supports the standard.

I've put dirt-cheap ($85.00) balanced cables up against $1000/foot balanced interconnect, and once the 600 ohm termination was in place there was no audible difference between the cables. In that case the owner was pretty happy: he sold his 24' interconnects for a fair chunk of change.

You can hear this difference in one-meter interconnects. The big issue, also as I mentioned earlier, is that most preamp designers do not even know that one of the functions of the line stage is to control the interconnect. (BTW, this is why in passive systems the cable is so much more critical than it is with an active line stage; this too was covered earlier.)

In a nutshell, the current needed to drive the 600 ohm load swamps the capacitive and inductive qualities of the cable, fancy materials included. This, BTW, is why you often see audio engineers questioning the high-end audio cable industry: audiophiles often use RCAs and, due to the high impedances, need fancy cables to make things sound right. In the studio, everything is low impedance (although not always 600 ohms) and balanced; in that world the cables really *don't* make a difference.
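A rough illustration of the "swamping" claim, with assumed numbers (a 24-foot run at about 25 pF/ft): the low-pass corner formed by the driving impedance and the cable capacitance moves far out of band when the drive impedance is low.

```python
import math

def corner_hz(source_ohms, cable_farads):
    """-3 dB corner of the RC low-pass formed by drive Z and cable C."""
    return 1 / (2 * math.pi * source_ohms * cable_farads)

cable_c = 24 * 25e-12    # assumed: 24 ft of cable at ~25 pF/ft

f_balanced = corner_hz(600, cable_c)     # low-impedance balanced drive
f_passive = corner_hz(10_000, cable_c)   # 10k passive attenuator, worst case

print(f"600 ohm drive: -3 dB at {f_balanced / 1e3:.0f} kHz")
print(f"10k passive:   -3 dB at {f_passive / 1e3:.1f} kHz")
```

With low-impedance drive the corner sits around 440 kHz, comfortably beyond audio; the 10k passive case lands in the mid-20 kHz range, where the cable's capacitance genuinely starts to matter.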
 
But I = V/R, so high current may just as well mean high voltage as low resistance.

Yes. If your power amp needs 10,000 V to drive its 10k input impedance, it will still be one amp. That seems to me an odd way to look at it, but chacun à son goût. For me, at a given voltage level, it's easiest to think of current as being high when impedances are low and vice versa.

the output impedance of the source which you left out in both examples.

Indeed I did. Amps with high source impedances cause frequency response deviations that totally swamp the effects of cables. But that was another thread and another controversy.
 
Amps with high source impedances cause frequency response deviations that totally swamp the effects of cables. But that was another thread and another controversy.

Let's put this in practical terms. In my system the amps are next to the side wall and the speakers are on the front wall (flanking a fireplace, which is why the amps can't be there). It's a fairly big room, so it requires speaker cables of about 25'. I take wires from a 12 gauge extension cord (probably meeting SY's definition of cheap). A 50' run of 12 gauge is about 0R1 (figures I get off the net vary somewhat, so I picked the high value). If my speakers dip to 2R, and the amp has a very high damping factor, the result is a damping factor of 20. Most likely not a big deal; certainly less than moving the speaker a foot. And tube amps? How many have an output impedance of less than a couple of tenths of an ohm?
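That damping-factor arithmetic as a sketch (the tube-amp output impedance below is an assumed illustrative value, not a measurement):

```python
# The post's numbers: ~0.1 ohm of 12 AWG cable for the 25' run, and a
# speaker that dips to 2 ohms.
def damping_factor(z_speaker, z_amp_out, z_cable):
    """Damping factor seen by the driver, including cable resistance."""
    return z_speaker / (z_amp_out + z_cable)

df_ss = damping_factor(2.0, 0.0, 0.1)    # idealized high-DF solid-state amp
df_tube = damping_factor(2.0, 0.5, 0.1)  # hypothetical tube amp, 0.5 ohm out

print(f"solid state: DF = {df_ss:.0f}")  # 20, as in the post
print(f"tube:        DF = {df_tube:.1f}")
```

The cable term only dominates when the amp's own output impedance is near zero; with a few tenths of an ohm at the amp, the cable's 0R1 is a secondary effect.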

Now, I have no problem going with longer interconnects, but at that length we are talking balanced interconnects. Maybe some can get away with unbalanced, but with my multiamped system, ground loops can be a problem with earth connections spread around. Most home stuff is not balanced. Yes, we can use transformers for isolation, and at some point I might. Even so, I find it a bit annoying to go around the room to turn stuff on and off - makes it too easy to forget something.

Sheldon
 
In the studio, everything is low impedance (although not always 600 ohms) and balanced; in that world the cables really *don't* make a difference.

No, it's not.

The 600 ohm "standard" is an irrelevant, antiquated throwback to telegraphy which found its way into telephony when early telephone lines used old telegraph lines, and from there into early broadcast and recording systems.

And the professional world began casting it off decades ago and it's not taken seriously by anyone serious anymore.

se
 
Regarding the pre amp/power amp interfaces controlling cables/speaker cables:

The preamp I will be using shortly has an output impedance of 100 ohms; at what frequencies, I don't know.
The manufacturer states: up to 10 m (33') per output (@ 30 pF/ft).
This seems a good candidate for moving the mono-block power amps to the speaker locations, driven from the preamp over about 7 feet per channel.
I'll purchase some Canare LV-77S RCA interconnects at 21 pF/ft.
Does this sound like a good plan?


I think that all sounds very sensible.
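As a sanity check on that plan, using the numbers quoted above: the single-pole RC corner formed by the preamp's output impedance and the total cable capacitance sits far above the audio band.

```python
import math

z_out = 100        # preamp output impedance, ohms (from the spec)
pf_per_ft = 21     # Canare LV-77S capacitance
length_ft = 7      # proposed run per channel

c_total = pf_per_ft * length_ft * 1e-12   # total cable capacitance, farads
f_3db = 1 / (2 * math.pi * z_out * c_total)

print(f"cable capacitance: {pf_per_ft * length_ft} pF")
print(f"-3 dB corner: {f_3db / 1e6:.1f} MHz")  # orders of magnitude above audio
```

At roughly 10 MHz, cable capacitance is a non-issue for this combination; even the manufacturer's full 33' allowance at 30 pF/ft stays far clear of 20 kHz.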


The mono-block amps don't list an output impedance, but are rated at 125 W into 8 ohms / 250 W into 4 ohms.


If it's a commercial amp, it will likely have a relatively low output impedance, whether because of design philosophy or because that is the accepted market norm.

The fact that the amp doubles power from 8 to 4 Ohms suggests a ‘true voltage source’ which often bodes well.
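The "doubles into half the load" check can be made explicit: the two power ratings imply the same output voltage into both loads.

```python
import math

def voltage_for_power(p_watts, z_ohms):
    """RMS output voltage implied by a power rating into a given load."""
    return math.sqrt(p_watts * z_ohms)

v8 = voltage_for_power(125, 8)
v4 = voltage_for_power(250, 4)
print(f"{v8:.1f} V rms into 8 ohms, {v4:.1f} V rms into 4 ohms")
# Both ratings imply ~31.6 V rms: the amp holds its voltage as the
# load halves, i.e. it approximates a true voltage source.
```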


If, theoretically, 0 dBFS from the CD data represents maximum output (there are cases where more output would be produced by inter-sample overs), the preamp is set to unity gain, and the power amp's sensitivity is 1.25 V rms, then would it make sense to play the 0 dBFS track of the CBS CD Test Disc, measure the preamp output with a true-RMS voltmeter, and set the preamp output level to just under 1.25 V by adjusting the DAC output trim?


The short answer is yes!!!

J.
 
Originally Posted by philmagnotta
If, theoretically, 0 dBFS from the CD data represents maximum output (there are cases where more output would be produced by inter-sample overs), the preamp is set to unity gain, and the power amp's sensitivity is 1.25 V rms, would it make sense to play the 0 dBFS track of the CBS CD Test Disc, measure the preamp output with a true-RMS voltmeter, and set the preamp output level to just under 1.25 V by adjusting the DAC output trim?

The short answer is yes!!!

J.

I was hoping for the long answer.
Seriously, though, I suspect this is not usually the way it's done.
Care to comment a little more on my gain-staging example above, and whether there are reasons for not doing it this way?
 
Hello. I was not being deliberately curt; I simply didn't think it needed a longer answer, because your method seems to make sense. It looks as if you have far too much gain in the system, and your method seems a reasonable way to deal with that.

One other possible way would be to balance the output of the DAC with the level setting of the pre, so that the pre is not used at the low end of the volume control (...because potentiometers are notoriously poorly matched at the low end. Not normally an issue with stepped attenuators).

Also, if you end up using the volume control of an active preamp (with the conventional volume-control-first design) at the low end of its range, you still get the full background noise of the circuit: the incoming signal is attenuated while the subsequent amplifier noise is not. In practice, however, you may never notice it.
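The gain-staging arithmetic from the question can be sketched as follows. The DAC full-scale voltage below is an assumed typical value, not from any post in the thread.

```python
import math

dac_0dbfs_vrms = 2.0   # ASSUMED typical DAC full-scale output
amp_sens_vrms = 1.25   # power amp reaches full output at this input
margin_db = 0.5        # "just a little under", per the question

trim_db = 20 * math.log10(amp_sens_vrms / dac_0dbfs_vrms) - margin_db
target_vrms = amp_sens_vrms * 10 ** (-margin_db / 20)

print(f"DAC trim needed: {trim_db:.2f} dB")
print(f"target output at 0 dBFS: {target_vrms:.2f} V rms")
```

With these assumptions you would trim the DAC down by roughly 4.6 dB so that a 0 dBFS test tone measures about 1.18 V rms at the preamp output.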
 
Why should a preamp somehow be more capable of driving long cables without degradation than a power amp? What's special about a preamp in this regard compared to a power amp?

You probably don't know that, in order to drive long lines, professional power amps used to have 100 V outputs and symmetrical 600 ohm inputs. Both professional power amps and preamps were "capable".
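For context on why 100 V lines work over long runs: each speaker tap presents an effective impedance of V²/P, so line currents (and IR losses in long cable) stay small. The tap powers below are illustrative.

```python
# Effective load impedance presented to a 100 V line per tap: Z = V^2 / P.
line_v = 100.0

for tap_watts in (1, 10, 100):
    z = line_v ** 2 / tap_watts
    print(f"{tap_watts:>3} W tap -> {z:>6.0f} ohm effective load")
# High voltage keeps the line current (and IR loss over long runs) small.
```

A 10 W tap looks like 1 kΩ to the line, drawing only 0.1 A, which is why thin wire can span a building; an 8 ohm home speaker at the same power would demand more than ten times the current.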


Home audio uses 4 or 8 ohm loads, though. But home audio is an exception, and home preamps likewise can't drive long lines.

Modern professional amps have digital inputs and no line outputs (they are mounted in sections of line arrays), but that is a different story.


So which is better SS or tubes .........? :magnify:

Both.

Pardon, I mean SS, Tubes, and Digital, all together.
 