• WARNING: Tube/Valve amplifiers use potentially LETHAL HIGH VOLTAGES.
    Building, troubleshooting and testing of these amplifiers should only be
    performed by someone who is thoroughly familiar with
    the safety precautions around high voltages.

How to compute current at control grid of power tube?

This should be an easy question, almost embarrassing to ask.

Let's say I have a power tube; use a "standard" KT88 push-pull audio amp as the example. I know what voltage I need to apply to the grids. It's in the graphs on the data sheet.

What I don't know is how to compute the current that must be supplied by the driver tube. What's the input impedance at the control grid? (I'm asking about the general case, not about any one amp.)

From experience and from looking at various amp schematics I know what might work, and I have SPICE to simulate it, but that is not the same as knowing how to compute the values myself.
 
Tube grids are voltage driven. They are very, very high impedance, plus a small amount of capacitance. Anything from 100K to 1 Meg will work for the grid leak. Greater than that, approaching 10-22 Meg, you get into grid-leak bias, which probably isn't what you want. All this changes if you drive the grid positive with respect to the cathode and draw grid current.
 
...Anything from 100K to 1 Meg will work for the grid leak. Greater than that, approaching 10-22 Meg, you get into grid-leak bias...

This sounds like you are saying how to select a grid leak resistor.

Maybe I was not clear. What I don't understand is how much current goes into the tube. It would seem to me that the control grid of a tube like a 6L6 or KT88 would have almost infinite input impedance. But if that were true then I could drive a dozen power tubes with a single 12AX7. I know that can't be done. I need to know how to compute the current that a driver tube must supply.
 
It depends on the input capacitance and the desired bandwidth. In the case of pentodes, which have very low input C, you can indeed drive a whole pile of them off a single small tube.

Calculate the input capacitance. Choose an f3. Then you can get the required current. Let's say you have 20pF input capacitance and you want the f3 at 20kHz at full power. The tube is biased to -40V. We're running class A or AB1. f3 = 1/(2·pi·R·C) and R = V/I, so with a little algebra, I = 2·pi·V·C·f3 = (6.28)(40)(20E-12)(20,000) ≈ 0.1mA.
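If it helps, here is the same arithmetic as a small Python sketch (the 20 pF / 40 V / 20 kHz figures are the ones used above):

```python
# Peak charging current needed to swing an input capacitance c_in through a
# peak voltage v_peak at the chosen -3 dB frequency f3 (the formula above).
import math

def driver_current(v_peak, c_in, f3):
    """Peak current (A) to drive capacitance c_in (F) to v_peak (V) at f3 (Hz)."""
    return 2 * math.pi * v_peak * c_in * f3

i_peak = driver_current(v_peak=40, c_in=20e-12, f3=20e3)
print(f"Required peak drive current: {i_peak * 1e3:.2f} mA")  # ~0.10 mA
```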
 
But if that were true then I could drive a dozen power tubes with a single 12AX7.

Nahh, of course you can.

You will lose the top edge of the frequency response (which will be worse in triode mode), and the required grid leak resistor will cause significant loading, reducing gain and increasing distortion.

Most audiophiles choose a ridiculously powerful driver, because it "feels right". This results in bandwidth on the order of 200kHz, which means the output transformer and feedback network will dominate the overall response. That's better than having the phase shift of repeated poles.

Back in the day, 6L6s were regularly driven from 12AX7s, since it works for the specified distortion and frequency response (maybe 5% at full power, and ~20kHz upper limit, -3dB).
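If you want to sanity-check that kind of pairing yourself, here is a rough sketch of the grid-circuit pole alone (not the whole amplifier); the 50 kΩ source impedance and 50 pF per grid are placeholder assumptions, not values from this thread or a data sheet:

```python
# Rough sketch: the -3 dB point formed by the driver's source impedance and the
# output tube's total input capacitance (Miller + strays). The numbers below
# are placeholder assumptions, not data-sheet values.
import math

def grid_circuit_f3(r_source, c_total):
    """-3 dB frequency (Hz) of the RC pole at the output tube's grid."""
    return 1 / (2 * math.pi * r_source * c_total)

r_source = 50e3    # assumed driver source impedance (plate load || rp), ohms
c_total = 50e-12   # assumed total capacitance seen at one grid, farads
print(f"Grid-circuit f3: {grid_circuit_f3(r_source, c_total) / 1e3:.0f} kHz")
```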

Tim
 
This should be an easy question, almost embarrassing to ask.

Let's say I have a power tube; use a "standard" KT88 push-pull audio amp as the example. I know what voltage I need to apply to the grids. It's in the graphs on the data sheet.

What I don't know is how to compute the current that must be supplied by the driver tube. What's the input impedance at the control grid? (I'm asking about the general case, not about any one amp.)

From experience and from looking at various amp schematics I know what might work, and I have SPICE to simulate it, but that is not the same as knowing how to compute the values myself.

Hello,
I have been trying to wrap my brain around similar thoughts over the last several days. My thoughts have included grid current, among other things, and perhaps some unimportant detours. This is about getting the driver output into the power tube as cleanly as possible. What needs to be overcome to get there: the time constant of the coupling capacitor and grid-leak (drain) resistor, the current flowing in that resistor, the grid-stopper resistor, the Miller capacitance of the power tube, and the grid current of the power tube. For A1 operation grid current is the least of our worries; it is all about voltage. For A2, or positive grid voltages, grid current becomes an important consideration.
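On the Miller-capacitance item in that list, the usual estimate is C_in ≈ C_g1k + (1 + A)·C_g1a, where A is the magnitude of the stage gain. A small sketch, with placeholder numbers rather than data-sheet values:

```python
# Sketch of the usual Miller-capacitance estimate for the power tube's grid:
# C_in ≈ C_g1k + (1 + A) * C_g1a, with A the magnitude of the stage gain.
# The example values are placeholders, not taken from any data sheet here.
def miller_input_capacitance(c_g1k, c_g1a, gain):
    """Effective input capacitance (F) seen at the control grid."""
    return c_g1k + (1 + gain) * c_g1a

c_in = miller_input_capacitance(c_g1k=10e-12, c_g1a=1e-12, gain=20)
print(f"Effective input capacitance: {c_in * 1e12:.0f} pF")  # 10 + 21*1 = 31 pF
```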
Without an accurate mathematical model it is not possible to calculate A2 grid currents. Perhaps the best we have is empirical data. As noted in other posts, that data for some tubes is on the tube data sheet.
As for SPICE models: if you want to make a project of modeling, perhaps you can tweak the tube models for better behavior in the low-impedance, positive-grid-voltage region. Even then it will be a matter of trying to get the model to match the empirical data.
Also look up tubelab.com PowerDrive. George has driven tubes beyond where the data sheets have gone before.
DT
All Just for fun!
 
This should be an easy question, almost embarrassing to ask.

Let's say I have a power tube; use a "standard" KT88 push-pull audio amp as the example. I know what voltage I need to apply to the grids. It's in the graphs on the data sheet.

What I don't know is how to compute the current that must be supplied by the driver tube. What's the input impedance at the control grid? (I'm asking about the general case, not about any one amp.)

From experience and from looking at various amp schematics I know what might work, and I have SPICE to simulate it, but that is not the same as knowing how to compute the values myself.

Repeating others here, in different words.
In Class A and Class AB1, grid current is negligible. Practically speaking, there is no grid current affecting the driver.
Most power tube datasheets specify the maximum grid leak resistor value for fixed bias and for self-bias.
As long as you don't bias tubes in Class AB2, calculating the driver's current requirements is easy. It was given above.

Is that clear?
 
Repeating others here, in different words.
In Class A and Class AB1, grid current is negligible. Practically speaking, there is no grid current affecting the driver.
Most power tube datasheets specify the maximum grid leak resistor value for fixed bias and for self-bias.
As long as you don't bias tubes in Class AB2, calculating the driver's current requirements is easy. It was given above.


That seems like a good summary of what is above.

Thanks everyone.

This is what I was guessing: that really all you are driving are the passive components connected to the grid, the grid leak, grid stopper and coupling cap. It seems that I could likely pull the power tube from its socket and the driver tube would not see much difference (except for the Miller-effect capacitance going away). What got me thinking I might be missing something was all the designs I saw with overkill driver tubes.
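To put rough numbers on what the driver sees through the coupling network, here is a small sketch; the 0.1 uF coupling cap, 220 kΩ grid leak and 30 pF input capacitance are assumed values, not from any particular amp, and the grid stopper (typically small by comparison) is ignored:

```python
# Sketch of the extra load presented to the driver through the coupling network:
# coupling cap in series, then grid-leak resistor in parallel with the grid's
# effective input capacitance. All component values below are placeholders.
import math

def grid_load_impedance(freq, c_couple, r_leak, c_in):
    """Magnitude of the load impedance (ohms) at one grid circuit, at freq (Hz)."""
    w = 2 * math.pi * freq
    z_couple = 1 / (1j * w * c_couple)          # series coupling capacitor
    z_shunt = 1 / (1 / r_leak + 1j * w * c_in)  # grid leak || input capacitance
    return abs(z_couple + z_shunt)

for f in (20, 1e3, 20e3):
    z = grid_load_impedance(f, c_couple=0.1e-6, r_leak=220e3, c_in=30e-12)
    print(f"{f:>7.0f} Hz: {z / 1e3:.0f} kohm")
```

With or without the tube in its socket, the result is dominated by the passive parts, which is the point made above.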
 
Most audiophiles choose a ridiculously powerful driver, because it "feels right".

This is absolutely true and often justified "because powerful output tubes form a heavy load on the driver/phase splitter". I have even seen cathode followers used to drive typical A- and AB1-class output stages.

All of the above is fully groundless and shows that the writer does not have the right knowledge of tube output stages.

But back to the subject.

As a simple example: if we had a PP output stage biased with -30 Vdc at the control grids of the output tubes, then the driving AC voltage may be up to some 58 Vpp without practically any grid current flowing into the tubes. So as long as the voltage at the grid stays below some -1...-0.5 V, the tube is a very high-impedance load.

The need for powerful driver stages exists when the grids of the output tubes will be driven positive, i.e. in AB2- and B-class amplifiers.
These operating modes require drive power, while in A- and AB1-class only driving voltage is needed.
 
To Joshua:



I would like to suggest a different way of putting it.
Tubes are not biased into Class AB2; rather, tubes are driven into Class AB2.
Do you agree?

As far as I can see, it's two ways of choosing words to describe the same thing.
However, tubes aren't being driven outside the designed bias. That is, the designer has full control over how tubes will be driven.
 
This is absolutely true and often justified "because powerful output tubes form a heavy load on the driver/phase splitter". I have even seen cathode followers used to drive typical A- and AB1-class output stages.


Tubes in output stages work in regimes close to their limits, otherwise the power efficiency is low. The dynamic range of music is wide, so peaks are much higher than the average level at normal listening loudness. This causes rectification of the peaks, and a shifted bias after them. Negative feedback makes this rectification even worse, and the result is nasty blocking distortion. To avoid it, cathode followers are used, so that the same loud passages don't cause the bias to float and the dynamic distortion that comes with it.
 
As far as I can see, it's two ways of choosing words to describe the same thing.

I do not want to start an argument about this, not at all.
I have just noticed quite a few times that the real difference between Class AB1 and Class AB2 is not clear to some hobbyists.
Therefore I want to be precise in my wording.

I still want to take one example to clarify my point.

If we had a PP output stage biased at -30 Vdc, then as long as the driving voltage is less than about 60 Vpp (at the control grids of the output tubes), the amplifier is operating in Class AB1 and can be driven with an ordinary voltage-amplifier stage having a high output impedance.

If the driving voltage exceeds 60 Vpp, say reaching 75 Vpp, then the output stage is operating in Class AB2. The required characteristics of the driver stage are now completely different from the previous (AB1) case.
Now the output tubes draw grid current once the grid voltage goes positive, so the driver stage must be capable of delivering power into the output tube.
Typically a cathode follower is used in such a case, since it has a low output impedance.

Summary: in principle both AB1- and AB2-class output stages can be biased identically, but the amount of driving voltage determines the operating class, not the bias.
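A minimal sketch of that summary, assuming grid current starts around -1 V on the grid as mentioned earlier in the thread:

```python
# Sketch of the summary above: with identical bias, the peak drive level decides
# whether the stage stays in AB1 or is pushed into AB2. The -1 V onset of grid
# current is an assumption, echoing the earlier posts.
def operating_class(bias_v, drive_vpp, grid_current_onset=-1.0):
    """Return 'AB1' or 'AB2' for a given negative bias and peak-to-peak drive."""
    positive_peak = bias_v + drive_vpp / 2   # most positive grid voltage reached
    return "AB2" if positive_peak > grid_current_onset else "AB1"

print(operating_class(bias_v=-30, drive_vpp=58))  # AB1 (peak just reaches -1 V)
print(operating_class(bias_v=-30, drive_vpp=75))  # AB2 (peak reaches +7.5 V)
```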
 
Summary: in principle both AB1- and AB2-class output stages can be biased identically, but the amount of driving voltage determines the operating class, not the bias.


Not really, or, let me rephrase it.

An OP stage designed to operate in Class AB1 chooses a bias value such that the positive peak of the signal never gets beyond a certain value, say -1V.

When an OP stage is intended to operate in Class AB1 and the above condition isn't always kept, it is poorly designed.

An OP stage designed to operate in Class AB2 allows the signal's peaks to drive the OP tubes' grids positive.

The difference between Class AB1 and Class AB2 is the value of the bias voltage relative to the peak signal voltage on the OP tubes' grids.

Indeed, a bias of, say, -30V for a certain type of OP tube can give either Class AB1 or Class AB2 operation, depending on the peak signal value at the OP tubes' grids. However, a properly designed stage, whether AB1 or AB2, takes the value of the peak signal into account.

So, again, the operating class is determined by the bias voltage relative to the peak signal voltage. However, the designer has full control over which class the OP stage will operate in.

Another point is that one cannot choose the bias voltage arbitrarily, for the bias voltage should be such that the OP tubes operate properly, as linearly as possible.

Also, not all OP tubes can be used in Class AB2 without excessive distortion. So for certain tubes, choosing Class AB1 is mandatory if reasonable performance and distortion levels are expected.

Is that clear?
 