• WARNING: Tube/Valve amplifiers use potentially LETHAL HIGH VOLTAGES.
    Building, troubleshooting and testing of these amplifiers should only be
    performed by someone who is thoroughly familiar with
    the safety precautions around high voltages.

Voltage vs. Amperage?

Status
This old topic is closed. If you want to reopen this topic, contact a moderator using the "Report Post" button.
Hey All,

I understand that actual power is the product of voltage and amperage. And that a tube's limitation is the amount of wattage a tube can dissipate. My question is does the balance of voltage and amperage really matter? I have the impression that higher voltage has better sonic characteristics. But is that in fact true?

The reason I ask: if you have a power supply with a B+ of 185 volts shared by both the voltage amplifier and the output tubes, the power supply becomes much simpler. And you wouldn't have resistors in the power supply current path wasting energy.
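To put numbers on the voltage/current tradeoff the question describes, here is a quick sketch comparing two operating points that reach the same plate dissipation. The values are illustrative only, not taken from any datasheet:

```python
# Illustrative only: P = V * I, so for a fixed dissipation budget
# you can trade plate voltage against plate current.

def plate_dissipation(v_plate, i_plate):
    """Plate dissipation in watts (voltage in volts, current in amps)."""
    return v_plate * i_plate

# Example: a ~12 W dissipation budget reached two different ways.
high_v = plate_dissipation(300, 0.040)   # 300 V at 40 mA
low_v = plate_dissipation(185, 0.0649)   # 185 V at ~65 mA

print(high_v)         # 12.0 W
print(round(low_v, 1))  # ~12.0 W as well
```

Both points dissipate the same wattage; whether one sounds better than the other is exactly what the thread goes on to debate.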
 
The reason a particular plate voltage is used is purely a design function of the valve chosen. A valve that flashes over at 450 V on the plate is limited by that. Some people try to run ECC83 valves with 60 V on the anode, but with only around a 40 V anode-to-cathode drop the results are not too good; those valves prefer 180 V on the plate. It is all a matter of maximum dissipation versus maximum voltage ratings.
 
Thanks, maybe I should be more specific. I was thinking of an RH84 using a pair of SV83s and an ECC81. Doesn't damping come into this somewhere?

Both the SV83 and ECC81 should be run in their respective linear regions. It would be a coincidence if the two tubes were optimised for their best supply voltage and it was the same voltage. OTOH, it is plausible that one could find a supply voltage that would work for both.

Even if you find that both tubes work well at one particular supply voltage, it would be best practice to separate them with a resistor and capacitor to decouple them. This might not be necessary with two stages carrying equal currents but out-of-phase signals.
 
High voltage designs today are often inherited from old-fashioned design attitudes. Back in the days of tubes it was a lot easier to buy components that could handle high voltages than high currents, so it was common to run your output tubes at the highest voltage you could get away with, but at low current, to get the desired output power.

These days the opposite is true: high current rectifiers and big smoothing capacitors are commonplace, but components with really high voltage ratings tend to be much less common. It is now much easier to run things in the 250-350V range and crank up the current instead, to get the desired power.
 
I have the impression that higher voltage has better sonic characteristics. But is that in fact true?
Not always. It's often true where (interstage or output) transformers with large step-down ratios are hooked to high-Rp tubes. Since the transformer acts as an impedance converter (and effectively a voltage-to-current converter), it is often appropriate to have large voltage swings (and therefore a high supply voltage on the tube) driving the transformer, so that a lower current can be used in the transformer's primary, giving more headroom before saturation.
There are also examples of tubes whose most linear range of operation is situated near the top of their maximum voltage range, but this is far from universal.
...with a B+ of 185 volts which is shared by both the voltage amplifier and the output tubes it would make the power supply much simpler. And you wouldn't have the resistors in the power supply current path to waste energy.
You would still "waste energy" in the plate resistors of the voltage-amp sections. In fact, the "dropping resistors" from the highest voltage down to the lower-voltage stages help keep each stage's output swing in line with what the input of the next stage wants to see.
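The dropping-resistor arithmetic mentioned above is simple to sketch; the rail voltages and stage current below are made-up example values, not from any particular design:

```python
# Illustrative sizing of a series B+ dropping resistor between rails.
# R = delta_V / I; the resistor dissipates delta_V * I as heat.

def dropper(v_high, v_low, i_stage):
    """Return (resistance_ohms, power_watts) for a series dropping resistor."""
    dv = v_high - v_low
    return dv / i_stage, dv * i_stage

r, p = dropper(300.0, 185.0, 0.005)  # 300 V rail down to 185 V at 5 mA
print(r)  # ~23 kilohms
print(p)  # ~0.6 W dissipated; pick a resistor rated with headroom, e.g. 2 W
```

This is the "wasted" energy the original question wanted to avoid, traded against the per-stage voltages each tube wants to see.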
 
it is often appropriate to have high voltage swings (and therefore a high voltage supply on the tube) driving the transformer, so that lower current can be used on the transformer's primary (providing more headroom before saturation).

This is back-to-front. It is easier to make a transformer with a small turns ratio, not a high one, and it is primary voltage that leads to saturation, not current. Cheaper, better-quality transformer design is yet another advantage of lower-voltage, higher-current circuits. The notion that higher voltages lead to better fidelity is a myth that beginners often get saddled with; not sure where it comes from, though...
 
If you are going to run the preamp/driver tube off the same voltage as the output tube(s), remember to run separate RC filters for each.

If you power all from the same filter cap output, you will get more hum from the power supply.

You can also get positive feedback (and possibly motorboating) from the output to preamp stage if you have three stages.
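The hum-reduction benefit of those separate RC filters can be estimated from first-order filter attenuation. A sketch with assumed component values (10 kΩ series resistor, 47 µF cap, 100 Hz full-wave ripple; none of these come from the thread):

```python
import math

# First-order RC low-pass attenuation of supply ripple at frequency f:
# |H| = 1 / sqrt(1 + (2*pi*f*R*C)^2)

def ripple_attenuation(r_ohms, c_farads, f_hz):
    """Fraction of ripple voltage that survives an RC decoupling filter."""
    return 1.0 / math.sqrt(1.0 + (2 * math.pi * f_hz * r_ohms * c_farads) ** 2)

att = ripple_attenuation(10e3, 47e-6, 100.0)
print(round(1 / att))  # ripple reduced by a factor of roughly 295
```

The same series impedance also isolates the stages from each other, which is what breaks the positive-feedback path mentioned above.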
 
Hey All,

I understand that actual power is the product of voltage and amperage. And that a tube's limitation is the amount of wattage a tube can dissipate. My question is does the balance of voltage and amperage really matter? I have the impression that higher voltage has better sonic characteristics. But is that in fact true?


Yes: V × A (i.e., V × I) = W, and that power ends up as heat.

However you ask a strange question. Do components sound any different when warm? Does a cold amp make good sound?

Some components seem to last longer if run warm. Think about getter flash as an example, or some wire-wound resistors; water ingress is another example. I prefer more current in tubes, and I think they "sound better", but that's just a preference.

Tube temperature is not just heater power; it's anode dissipation too.

Regards
M. Gregg
 
One thing to be careful of is not to exceed the cathode rating of a tube. Just because the anode dissipation is not exceeded when you decrease the anode voltage and increase the current, it does not mean the tube life will be as good.

The cathode is limited in its current emission by cathode geometry and coating. Try to drive too much current through the tube and you risk damaging the cathode (stripping), which results in arcing within the tube and destruction of it and the supporting circuitry.
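The point about respecting both ratings can be sketched as a simple check. The dissipation and cathode-current limits below are hypothetical example numbers, not from any datasheet:

```python
# Illustrative check of an operating point against BOTH limits:
# anode dissipation AND maximum cathode current (hypothetical ratings).

def operating_point_ok(v_anode, i_cathode, p_max_w, i_max_a):
    """True only if the point respects the dissipation AND current ratings."""
    return (v_anode * i_cathode <= p_max_w) and (i_cathode <= i_max_a)

P_MAX = 12.0   # example anode dissipation limit, watts
I_MAX = 0.070  # example cathode current limit, 70 mA

print(operating_point_ok(300, 0.040, P_MAX, I_MAX))  # True: 12 W, 40 mA
print(operating_point_ok(120, 0.090, P_MAX, I_MAX))  # False: 10.8 W is fine, but 90 mA exceeds I_MAX
```

The second point shows the failure mode described above: dissipation is within budget, yet the low-voltage/high-current operating point still risks the cathode.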
 
This is back-to-front. It is easier to make a transformer that has a small turns ratio, not a high one. It is primary voltage that leads to saturation, not current.
In SE (gapped) transformers, the standing current Ip0 and core size determine your F3 at a given distortion level – check any vendor's list of SE OPTs and you'll see current ratings (typically at 30 Hz / 5% distortion) for each one. AC voltage can saturate a transformer core, but the DC standing current an SE OPT can handle is determined by core size. If one wants to keep size down, one must use a lower DC current (entailing a higher voltage to reach maximum dissipation).
Also, bear in mind that many valve designs were/are based on classic RCA Receiving Tube Manual, Loftin White, and other circuits from the "golden age" of hi-fi, when higher voltages were more prevalent.

That being said, I agree that there is no necessary benefit to higher voltage; I was merely trying to explain how it started, and why (to some extent) it persists.
 