Biasing a guitar amp for distortion

Hi,

I wanted to ask for your thoughts to help me reach a conclusion.

If you want a guitar to make a screaming lead sound, you need quite a lot of valve distortion. Eddie Van Halen used to play using Plexi 100 watt amplifiers that were normally quite clean sounding, with not much distortion.

He used to run these from Variacs. He'd never really explain how he was using them or what he was doing, to try and keep people from copying him. But I am almost certain he would use these to lower the plate voltages, allowing him to run the valves with much higher cathode currents. With the cathode currents closer to their limits, they would produce more signal distortion as the amplifier was turned up. This gave an excellent sound to his guitar! Really warm, smooth and nasty. Albeit at the expense of the valves.

He used to call it the brown sound, a name a bit too similar to voltage brown-out. He later said he called it that because of something to do with his brother's drum kit.

I am looking to do something similar with a small home made 12AX7 preamplifier.

Looking at the grid curves for a 12AX7, the maximum cathode current is 8mA. The grid curve only goes up to 6mA, so I'll go with 6mA for the rest of the example.

To pull this amount of current through the valve with 0V on the grid requires roughly 280V on the plates. This means that at 0V, the anode would need to dissipate 0.006 * 280 = 1.68W. But each half of a 12AX7 is rated for 1W of dissipation, so I'm 0.68W over.

However, the dissipation rating is for DC current, right? Since I am running an alternating signal into the grid, and it'll only reach 1.68W of dissipation on the peaks of the waveform, I'm effectively running the valve in a pulse-like mode beyond its limits; not continuously.
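A quick back-of-envelope check of the numbers above (the 280 V / 6 mA figures are read off the grid curves, so treat them as estimates):

```python
# Rough dissipation check for one triode section of a 12AX7, using the
# figures above (280 V at 6 mA read off the grid curves, so estimates).
PLATE_RATING_W = 1.0   # rated anode dissipation per 12AX7 triode section

def plate_dissipation(plate_volts, plate_amps):
    """Instantaneous anode dissipation, ignoring grid and heater power."""
    return plate_volts * plate_amps

peak_w = plate_dissipation(280, 0.006)
print(f"peak dissipation: {peak_w:.2f} W")                   # 1.68 W
print(f"over rating by:   {peak_w - PLATE_RATING_W:.2f} W")  # 0.68 W
```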

If I was to fit a small fan over the valves, and maybe some heatsinks to the cases, I should be able to run them with plate dissipations much higher than they're rated for, so long as I keep the time spent there low, right? I have read about this idea on other sites but in different forms, such as biasing to 70% idle current instead of 50%.

I thought it might be helpful to try playing my guitar into my scope to see how long I actually spend with the output at its peak. From the rough tests I did a while ago, I believe the output voltage of the pickups actually falls off quite quickly after strumming a note or chord to a much lower value.

Relying on the grid saturating at 0V and above for distortion in guitar amps can cause problems, since the grid begins to conduct and causes the gain to start wobbling. A lot of high gain, home built amps suffer from this problem, with the gain going from insane to much less as the grid is saturated. Instead, I want more of my distortion to come from the cathode being overdriven.

I've been so busy it's been months since I last looked at amplifiers, so I need a quick check. Am I going totally off on a tangent with this or does it seem correct?
 
Afaik, you can't "over drive" a cathode. The current through the cathode depends on that current being dissipated by the plate.

Reducing the B+ on the plate of the tube will reduce the swing and permit you to run somewhat higher currents - also changing the Rp at the same time... You can pretty much freely bias the tube anywhere in the range as long as the DC (quiescent) current results in a total POWER dissipation on the plate that is under the max for the tube (disregarding any issues like tube life, which may or may not change depending on what your actual set up is...)

There is nothing I know of that you can do that will drive a capacitively coupled grid past "0", so the tube will look like it is clipping at that point no matter what else you're doing.

So, in brief what you want to do is to *listen* to the various combinations of drive (input level), plate load, B+ and bias points and decide what sounds good to your ear for a given tube...

Other than that I know of no prescription for "magic" tone. You could put the thing in SPICE and try for maxing ur 2nd & 4th harmonics... that's an idea...

What Eddie Van Halen did is probably butt simple - and actually more mystique and myth than anything else. I'd guess that he merely had the Variac to set his line voltage level properly, since it varies all over the map in live venues... some bozo tech probably set that up for him. If he had the Variac on the AC mains of any stock amp - one with filaments AND B+ on the same xfmr - the amount he could reduce the AC mains without the filaments dropping to the point where they weren't hot enough wouldn't be that far. But maybe that was his entire trick... so put a Variac on the filament xfmr of your test setup and see if the sound changes when the fils are barely lit... :)

_-_-bear :Pawprint:
 
Another thought...

As you lower the Variac on the mains, here's what happens to Eddie's amp:

Assuming about 10 - 15% line voltage drop being about max...

B+ drops from ~450v to ~<390vdc
Bias "drops" from ~ - 50v to ~<-38v
Tube operation goes from more or less class B/AB to closer to Class A
Total emission from the cathode decreases - so max power decreases there.
Power out is decreased because the B+ is lower and the bias is higher (more into class A).
Max power out is decreased as the B+ is lower - so is the max voltage swing.

All the preamp tubes - in class A - are running on lower B+ but self bias, so they look about the same operation wise... but clip sooner due to the lower B+
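For a rough feel of the brown-out numbers above, here is a sketch that assumes every secondary scales linearly with the line voltage (real transformer regulation, rectifier drops and bias-supply behaviour will differ, as comes up later in the thread):

```python
# Sketch: scale B+ and bias with the AC mains, assuming perfectly
# linear tracking of every secondary (a simplification).
def scaled_supplies(b_plus_v, bias_v, line_fraction):
    return b_plus_v * line_fraction, bias_v * line_fraction

b, g = scaled_supplies(450.0, -50.0, 0.87)   # ~13% line drop
print(f"B+ ~{b:.0f} V, bias ~{g:.1f} V")     # B+ ~392 V, bias ~-43.5 V
```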

If this is how a player wanted to run his amp, it would be better to just have them set up with the requisite voltages in the first place, not use a variac, imho. Which is why I think it is/was hype and he was really just using it to set the line voltage so that the way the amp *was* set up *was* the way it ran every day...

:D

_-_-bear :Pawprint:
 
A cathode has a discrete, set, surface area. The surface area can only emit so much current because only so many electrons can boil off the surface of the electrode's coating in a set period of time.

The distortion of his amplifier was particularly distinct, which is why I don't think it's coming just from having the grid run to 0 and cause signal clipping.

My thoughts are that he was perhaps running the valve in such a way that the cathode reached its maximum current limit before the grid came to 0V, such that the cathode became the clipping element before the grid.

The difference between solid state and valve distortion is the rounding effect that occurs in valves, whereby the valve smooths over the clipped edges as the grid cuts the signal off. I expect the cathode would have an even smoother clipped edge in comparison to the grid cutting off. Which would explain the very smooth distortion of his Plexi.

Not a lot of people, if any to the best of my knowledge, are able to make their Plexi sound similar to Van Halen's, which makes me suspect he was doing something more involved than most people assume.

If he was running the amplifier at a lower voltage through the Variac, he'd almost certainly have had a separate winding or transformer for the heaters, and then rebiased the cathodes for a higher current, or he'd have lost power. In an interview he mentioned something vague about running the valves very hot.

The level of distortion the amps were producing wasn't anywhere near what they would have managed off the shelf. Plexis are the same amps Jimi used to play through, whereas Eddie's sound more like 80's or 90's rock amps.

I don't know, the Variac and numerous mentions of current and hot valves make me wonder is all! :)
 
I think you are making it overly complex and overly simple at the same time. From time to time they discuss Eddie's sound over at www.ampage.org if you want to pursue it.

The distortion in Eddie's sound and anyone else's is not just in the output section. I would describe Marshall Plexis many ways, but clean is not one of them. Cranked inputs yield all sorts of overdrive distortion. This is the kind of distortion that is still there when you turn down a master volume or use a pedal. Then there is output tube distortion that only occurs when the power tubes are fully cranked. Part of the EVH sound was due to the fact that he played at Godawful loud levels. This affects the sound not only from power tube distortion, but also from the acoustics of his speakers and also his strings/pickups playing in the intense sound field.

What you hear recorded won't happen at lower levels.

So get all you can out of the preamp, then work on the power amp. Just keep in mind that preamp tube distortion is a totally separate issue from power amp distortion. We adjust the bias to 70-75% of max dissipation at idle as a matter of course. In his Variac days, Eddie's tech was not just setting up the AC mains to "proper" voltage, they were indeed browning out the amps for effect. Get a Variac and dial down the mains voltage and experiment. Don't overanalyze it.
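As a sketch of how that 70-75% idle rule is usually applied (the 25 W figure is an assumed EL34-style plate rating, purely for illustration; check the datasheet for your tubes):

```python
# Bias rule of thumb: idle the output tube at a chosen fraction of its
# maximum rated plate dissipation.
def idle_current_ma(pd_max_w, plate_volts, fraction=0.70):
    return fraction * pd_max_w / plate_volts * 1000.0

# Assumed example: ~25 W plate rating (EL34-class tube) at 450 V.
print(f"{idle_current_ma(25.0, 450.0):.1f} mA idle at 70%")   # 38.9 mA
```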
 
Thanks for the link, I think I might have read the page before, but not for a few months at least.

His reasoning, however, simply furthers my belief that the Variacs are being used for attaining higher cathode currents. My answers to his reasoning...


1.) Reducing the AC input will reduce the B+ to be sure, but you will also reduce the heater voltage by the same margin. This can cause 'cathode stripping' of the tubes (if the heater voltage is far too low).

A lower filament voltage will mean that it doesn't heat up properly. If the cathode isn't hot, it won't emit electrons anywhere near as quickly. So, by far too low, I take it he means virtually off, since this would cause the tubes to be exposed to the HV more frequently.

Also, if you are in possession of a Variac, I would assume you know what a filament winding is and could easily afford to buy one to heat the filaments independently.

You are also reducing the bias voltage as an added 'bonus'. Unfortunately, the bias voltage does not track the plate voltage proportionately. And we all know how anal the Internet 'guru' is about the 'proper' bias voltage.

Of the guitarists I know of, more of them know that valves have a bias voltage than know how, or why, the valves are heated in the first place.

This problem could be overcome by... I dunno... rebiasing the amplifier?

Filter capacitors tend to have a 'memory'. Running a 500-volt capacitor for any extended length of time at 400 volts will make the capacitor think it is a 400-volt capacitor. The next time you plug your Marshall straight into the wall AC and put 500VDC on a capacitor that has been brainwashed into believing it is a 400VDC capacitor could be trouble (if the capacitor fails to 'reform'). I have been told this is a remote chance, but a chance I don't want to take.

Remote being the word! :D

Electrolytic capacitors explode when the electrolyte boils. Switchmode power supplies put huge amounts of ripple current through their filter electrolytics and seem to work fine.

Also, this guy should consider the implications that statement has for variable power supplies. Or indeed, any application that involves changing the voltage on an electrolytic.

Using an 'ungrounded' Variac can cause serious bodily injury, and even death.

And the mains won't right?

There are two very good reasons not to use a Variac to raise the AC input voltage to your Marshall. You will ruin your tubes

?

You accelerate the wear of your valves just by turning the amplifier on. Rebiasing your amplifier using a Variac, within the valve's operating region, shouldn't accelerate that wear any more than any other design using the same operating parameters would.

and possibly your output transformer.

A good reason. However, very few guitarists take the same care of their gear as audiophiles do. An output transformer is a lump of iron in their amplifier to a guitarist. So long as it's not smoking, it's probably working okay. Unless your transformer is wound to work just on the boundary of the valve's output, it's unlikely it will be damaged by rebiasing the amplifier.

-----------------------

These kind of statements just make me think that this guy is either trying to stop people from hurting themselves, or is trying to hide something he's doing. Either way, he's lying to some extent; because he shows that he knows a lot about valve amplifiers elsewhere on his site.

Variacs aren't something you can pick up at the guitar store, and almost no guitarist knows what they're for anyway. Most guitarists are also adults, who can take a 'You could shock yourself' warning. Which leads me to believe my second guess is correct.
 
The guy on the Marshall amp page is slightly confused as well.

The quote he gives from Radiotron refers to *DIRECTLY HEATED* filaments/cathodes, not indirectly heated. Increasing the fils on an indirectly heated tube within reason will have little effect.

On a directly heated cathode, it will have an effect.

The issue of the variac... just a bad way to get "tone". Assuming
an enclosed Variac with AC plugs & sockets, knock urself out.

As far as "increasing cathode current" - there is no possibility of change in cathode current without a commensurate change in PLATE current. They go together.

There are only a few ways to get more cathode or plate current, and that includes changing the bias, the load impedance, the plate voltage (up for an increase with all other parameters being the same), and that's about it. There's nothing else there.

Starving the fils will *reduce* emissions, if the fils are starved enough, and the tube will be reduced in terms of what it can do, not increased.

Increasing ur line voltage on a Marshall where the tubes are being run close to their design max (or over) and the caps are at or near their design max is a recipe for problems.

A decrease of ~ 10% is probably actually a decent idea.

The idea of "rebiasing" the tubes back to their original bias point on the curve after a reduction of the line voltage ("brown") is silly, since most of the change in tone is likely due to the effective reduction in negative bias voltage - shifting it back is probably the opposite of the way you want to go.

What this is saying is that you'd like to run the tubes in class A - and someone posted that they are run at 70%, if this meant 70% ON, as in past Class A which is ~50%, sorry no way - you can't. Not at those plate voltages, with those tubes, and not for long. Those tubes will glow and melt in Class A at ~500vdc on the plates.

IF you reduce the plate voltage, you have a shot at a genuine class A type bias, but with the loss of about 1/2 or more of your total output power.

Which brings me back to what I said in the first place. If you *really* want a Class A output stage "sound" the way to get it is to slam a different power tranny in the hole, (lower B+) and rebias ur tubes. No need to drop the fils at all.

If you want to dork your fils, have a tech put in some regulated fil supply components with a pot on the rear or front panel to control the fil voltage at will. You'll get less hum from the amp, worst case. :rolleyes:

Modify the amp properly, if you want to mod it. Dorking with an external variac to get "tone" is kinda like putting a valve on the gas line to your car to make it go the right speed... sort of the right idea, but not.

_-_-bear :Pawprint:
 
Bear, the Variac idea certainly isn't to move the circuit closer to class A, it's to move it further towards class AB.

There are very few amplifiers running purely in class A. It's not only a problem achieving true class A operation at the volumes required to compete with a drum kit using valves, it's probably actually a step back from the sound a lot of people expect an electric guitar to make.

The distortion of an amplifier helps mask some of the mistakes, like fretting inconsistencies and picking differences, that a class A amplifier shows up.

Van Halen's Plexis were usually quite clean, closer to Class A sounding when stock. Even though he was running them incredibly loud, they still had far more distortion than they should have had. Positive distortion, that sounds excellent on a guitar.

Guitar amplifiers and audio amplifiers part ways when it comes to distortion. The only similarity is the argument over the induced distortion of a valve Hi-Fi being more pleasant than a SS Hi-Fi by the members of the forum who prefer valves, and vice versa.

Similarly, the different origins of the distortion in a guitar amplifier affect its sound. For instance, valve distortion is usually much smoother and rounder sounding than solid state distortion, which sounds more like fuzz or grit. Valve rectifiers introduce sag distortion, which rounds the notes together and makes the amplifier sound, in my opinion, too smoothed over. Combined with a lot of valve induced distortion, chords end up turning into mush. I actually prefer solid state rectifiers to valve rectifiers at the moment.

The Variac idea is easier if you think about it in reverse. If you keep the voltage constant and increase the current, you increase plate dissipation, probably beyond the valve's rating. To stop the plates melting, you need to lower the plate voltage in accordance with the cathode current.

When you start doing this, you simply add a second transformer to the arrangement for the filaments, so that you can run them on their own socket, separate to the plates.

Like I say, the distortion of Van Halen's amplifiers was incredibly smoothed, but not mushy. The chords came through crystal clear, but the individual notes didn't bite into your ears. To obtain the level of distortion he did, with those amplifiers, he must have at least had them rebiased. If he rebiased them for higher currents, he will have had to reduce the plate voltages to avoid exceeding the plate dissipation wattages.
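The constraint being argued here can be sketched numerically: for a fixed dissipation budget, picking a higher standing current caps the plate voltage. The 25 W budget below is assumed purely for illustration, not a claim about Van Halen's actual settings:

```python
# Constant-dissipation trade-off: plate voltage ceiling for a chosen
# standing current, given a fixed plate dissipation budget.
def max_plate_volts(pd_budget_w, plate_current_a):
    return pd_budget_w / plate_current_a

for i_ma in (50, 80, 100):
    v = max_plate_volts(25.0, i_ma / 1000.0)
    print(f"{i_ma} mA -> keep the plate under ~{v:.0f} V")
```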
 
I don't think Van Halen cared about short tube life. If the tone was there, a new set of tubes each day was a small price to pay. And I really think the experimentation they did was not based on calculations and math. It was more of a "hey, let's try this" approach. As much as we talk about it, I have no idea which tunes include Variac-enhanced tone anyway. There are web sites devoted to that stuff.

I don't like the claim that the distortion hides playing flaws. Maybe inexperienced players hide behind the noise, but I don't know any real players who use distortion for anything other than tone or sustain.

One thing to always keep in mind is that there is a fundamental difference between hifi and guitar amps. Hifi amps are used to reproduce sound, while guitar amps produce sound. The guitar amp is part of the instrument. Guys pick their amps based on the sound of them, whereas hifi listeners pick their amps based on their having no sound of their own. It is perfectly common and legit for a guy to say "I play a Marshall." The amp and speakers have a tremendous effect on tone. As of course do choice of guitar, pickups and strings.

Ironically, it takes a good hifi amp to reproduce the distorted sound the guitar amp produced when you listen to the recording at home.

Marshall amps are known for their overdriven distortion, but not for a clean sound. Fender on the other hand doesn't do distortion very well but has a killer clean sound. Yes, these are my opinions; some may not agree. I think class A is the last thing on their minds at Marshall.
 
bear said:
What this is saying is that you'd like to run the tubes in class A - and someone posted that they are run at 70%, if this meant 70% ON, as in past Class A which is ~50%, sorry no way - you can't. Not at those plate voltages, with those tubes, and not for long. Those tubes will glow and melt in Class A at ~500vdc on the plates.
Class A is 100% on, ie full cycle of the waveform through each tube all the time, the tubes never switch off.

70% refers to the plate dissipation at the chosen operating point.

If 500V B+ is within the rating of the tube, say an EL34, you can run them in class A all you want, provided that the plate and screen dissipations are within limits. Hell, you can even run an EL34 triode at 500V.
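The "100% on" definition can be checked with a toy calculation: idealizing the tube as linear, a stage is class A only if the plate current never reaches zero anywhere in the signal cycle (a sketch that ignores real tube curvature):

```python
import math

# Toy conduction check: class A means plate current stays above zero
# for the entire cycle; if it hits zero, the stage has left class A.
def stays_in_class_a(idle_ma, peak_swing_ma, steps=360):
    return all(idle_ma + peak_swing_ma * math.sin(2 * math.pi * n / steps) > 0
               for n in range(steps))

print(stays_in_class_a(60.0, 50.0))   # True: never cuts off
print(stays_in_class_a(40.0, 60.0))   # False: cuts off part of the cycle
```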
 
eeka-chu,

rather than discuss endlessly what you can never know, ie exactly what EVH uses to get his tone (but a hint is, most of it's him), build a copy of whatever amp of his you want, then spend a lot of time futzing around with the operating point etc till you get what you want.

I have a more-or-less 5150* copy on my bench at the moment, built on a big open chassis with lots of pots and switches to be able to adjust everything. The guitarist I'm building it for is completely stunned at the changes in tone I wring out of it by adjusting op-points and a few other parameters. When he finally makes up his mind what he wants for the two channels, I'll have it built into a nice box.

* it started as one as that's what he thought he wanted, but it's morphing into something else completely.
 
variac

I am close to a guy (I won't name him) who is extremely famous, has a collection of old Plexis and uses a Variac on them. He uses the Variac to bring them up to the correct voltage over a period of about 30 seconds because he thinks they will be more reliable that way. Plexis tend to blow up, as you may know. Also, the comment about venue voltage is quite true. I have melted tube amps at concerts only to discover that the supply voltage was 150 volts being supplied by an out of spec generator rented by the production company. In one case the amp got so hot it melted the hot glue holding the covering onto the amp head.
 
Good idea! I'm thinking of doing just that. Building something similar that I can play around with. I'm getting really tired of looking at curves and numbers.

I wouldn't say someone who plays with a distorted amplifier is cheating in some way, but there is a definite playing difference between the two. If you're playing through a clean channel, any slight changes you make will be very obvious. So it's good for quiet, soulful tunes that are perhaps just the guitar alone. If you turn the volume up really loud, the hearing sensitivity curves level it off a bit, so the changes won't be quite so obvious.

But playing anything quickly or complicated on a clean channel is just annoying, because so much of the more complicated fretting doesn't involve constant picking, but using your fingers to actively play the strings. If you try doing that through a clean channel, at quiet room listening levels, there is almost no way the amplitude or sound of a picked note can compare with the sound of one played with the left hand's fingers.

Perhaps more than distortion making it easier is the gain. If you're playing something complicated with your left hand, huge amounts of gain push the notes closer into the levelled off area of hearing than earlier.

Not enough people realise, especially with modern music, just how much is done to a recording to smooth it out. For instance a normal guitar through a clean channel, with no effects, sounds absolutely nothing like The Darkness. Add lots of gain, distortion, thousands of dollars worth of time based effects and you have something more like it.

Not saying it's a bad thing to use effects or anything other than a clean channel, kind of the opposite, but not a lot of people pick up on the way no one's guitar ever feeds back during live TV performances anymore. I'm aware of anti-feedback units and noise suppressors, but that's not a good enough explanation. My amp whistles painfully loud before the volume ever gets anywhere near 4 or 5 on the dials. Ten will produce unbearable feedback the second you take your hand off the strings.

Also, no one's guitar strings break on stage anymore. Their guitar is always perfectly in tune, a physical impossibility with most guitars. The band is mixed to CD quality levels, again, realistically impossible. And most obvious of all, but often overlooked, the singer is always precisely on key, has seconds' worth of delay and reverb on their voice or can have more than one voice. If they have a harmonised voice, they're almost certainly also using some kind of pitch correction, since harmonisers are pitch shifters.

I actually prefer to hear bands play out of tune now, because I can be sure that they're actually playing live! :D

You're right about selecting an amp to match a sound. In fact, even the speakers make a very big change to the sound because they're run so close to their limits a lot of the time. They're purposely forced into distortion. Although, a lot of guitarists don't get round to changing their speakers as soon as their amplifiers.
 
Power Scaling

What has been described is what Kevin O'Connor of London Power (Canada) describes as Power Scaling. It's a way of getting "full tilt" power amp distortion at lower volume.

To implement it you need to run filaments from a separate transformer so that they stay at their correct voltage.

The high voltage and bias supply transformer's primary is then connected to a Variac so that the high voltage supply and the bias supply track together.

Never ever use a variac (autotransformer) by itself - there is no isolation from the mains supply in a variac - SAFETY FIRST.

It can also be done by using tracking adjustable regulators on the high voltage and bias supplies. The bias supply will then always be approximately 1/10th of the high voltage supply.
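A sketch of that tracking idea, with the ~1/10 ratio taken from the description above (actual London Power kit values may differ):

```python
# Tracked power scaling sketch: bias follows the scaled HT rail at a
# fixed ratio, so the output stage's operating class is roughly preserved.
BIAS_RATIO = 0.10   # assumed from the text above; tune per amplifier

def scaled_rails(ht_nominal_v, scale):
    ht = ht_nominal_v * scale
    return ht, -BIAS_RATIO * ht

ht, bias = scaled_rails(450.0, 0.5)
print(f"HT {ht:.0f} V, bias {bias:.1f} V")   # HT 225 V, bias -22.5 V
```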

You can buy a kit from London Power which has all the components for the tracking regulators - the instructions with the kit tend to be a bit on the light side, so you will want to know what you're doing.

Cheers,
Ian
 
Brett said:

Class A is 100% on, ie full cycle of the waveform through each tube all the time, the tubes never switch off.


Exactly.

70% refers to the plate dissipation at the chosen operating point.

If ur running ur output tubes at a plate dissipation of 70% of design max rating - have fun!

Class A would optimally be at 50%. That permits the power to range UP from the quiescent level and down equally. Eh?

One could, of course, choose not to achieve 100% swing, and limit the drive so that you run at 70% of max plate current as the quiescent level (bias point), but that would also reduce the max power available to the load and cause asymmetrical clipping, which in this case might be what you want.
Or not.

But none of these tube amps are run in class A.

If 500V B+ is within the rating of the tube, say an EL34, you can run them in class A all you want, provided that the plate and screen dissipations are within limits. Hell, you can even run an EL34 triode at 500V.


Yeah - IF.

There are no EL34s coming in from Russia or China that are really going to just plop into the same socket that a GE or RCA 6CA7 or Mullard EL34 was originally intended. The import tubes just aren't quite up to the task - at least not for long. And certainly not at the power levels for the original designs.

The screens on the current production tubes just aren't going to stand up, the grids won't and the plates go soft and melt.

You need to have the screen voltage up high enough or else you won't produce the requisite power. If you limit the current or voltage on the screens to a level that the tubes can take, you'll see that the power has dropped off - and doing this is a mod, not a stock circuit.

_-_-bear
 
eeka chu said:
Bear, the Variac idea certainly isn't to move the circuit closer to class A, it's to move it futher towards class AB.


Sorry, ur confused.

If you drop the AC mains voltage, two things happen:
- the plate voltage drops
- the bias supply reduces in voltage
(forget the filaments for this part)

The main effect is on the bias supply. IF you DROP the voltage on the bias supply the effect is to INCREASE the plate current. This is running the tube more toward Class A.

If you *increase* the bias voltage (greater NEGATIVE volts) then the tube is more "cut off" and more into AB.

The effect of reducing the B+ on the plate makes it possible to increase the tube's quiescent current without exceeding the tube's plate dissipation capabilities. It also changes the Rp, but that is another matter.

Overall, it is likely that the sum total change is minimal, but edging slightly toward increased plate dissipation when the AC mains are dropped in voltage.

There are very few amplifiers running purely in class A. It's not only a problem achieving true class A operation at the volumes required to compete with a drum kit using valves, it's probably actually a step back from the sound a lot of people expect an electric guitar to make.

Nice theory... there are some guitar amps in Class A - but as you noted, the Class A amp is bigger and heavier for the same power out. Soundwise, perhaps it is better by a wide margin... but that's another thread.
<snip>

The Variac idea is easier if you think about it in reverse. If you keep the voltage constant and increase the current, you increase plate dissipation, probably beyond the valve's rating. To stop the plates melting, you need to lower the plate voltage in accordance with the cathode current.

Why the fixation with cathode current? It's plate current that is in the tube books... And what you just said is exactly what I said earlier, minus the reference to "cathode current".


When you start doing this, you simply add a second transformer to the arrangement for the filaments, so that you can run them on their own socket, separate to the plates.

Which, again, is what I said in the first place - but why not just put in a new plate transformer and rebias where you want things, rather than dork about with a variac - sure, use the variac to figure out your operating parameters if you like - ??


_-_-bear :Pawprint:
 
bear said:
If ur running ur output tubes at a plate dissipation of 70% of design max rating - have fun!

By 70% Pd, I meant the max plate power (I never mentioned plate or screen currents), which is about where most commercial MI amps are biased, and is what was being discussed in the Tone Lizard page mentioned earlier. Some hifi designs go higher, and I find the best sonic/lifespan compromise to be at circa 80% in hifi amps.

Class A would optimally be at 50%. That permits the power to range UP from the quiescent level and down equally. Eh?

No. Class A is 100%. See above for the distinction between bias current and plate dissipation.

One could, of course, choose not to achieve 100% swing, and limit the drive so that you run at 70% of max plate current as the quiescent level (bias point), but that would also reduce the max power available to the load and cause asymmetrical clipping, which in this case might be what you want.
Or not.

I never mentioned sonics, only operating class. Sonics is a whole 'nother discussion. See above.
 
This was really bothering me...

So I spent a little time scratching my head, and checking some things, just to see what was what...

Here's what the Sylvania Technical Manual says about Class A:

"A Class A or Class A1 amplifier is one in which the grid bias and the signal voltages are such that plate current in the tube or in each tube of a push-pull stage flows at all times. This is accomplished by operating at the center point of the plate current vs. grid voltage curve and using signal voltages which do not drive the grid into either the positive region or into the sharp bend near cut-off voltage."

Well, on the surface, if the maximum plate wattage rating *is* say 30 watts, that implies 1/2 of that power at quiescent bias for Class A operation, since there is no way to get *more* than the maximum power out of the tube, as it cannot be driven past the 0-volt grid bias point in A1 operation.

Yet, when we look at say a 6L6G operating point for A1 operation, it clearly shows that the tube is being biased *at* 30 watts (the "max rating")!

So, I looked at two very similar tubes: the 845 and the 838 - they are essentially the same tube with different grid spacing. The former is a Class A type tube requiring negative bias, and the latter is a "zero bias" class B tube.

But when one looks carefully at the 838 spec there is an interesting spec shown:

Plate Dissipation : 100 max watts (as expected)
but then...
Max-Sig Plate Input: 220 max watts (!)


(ah ha! the latter spec is rarely shown in receiving tube books)

Now the 838 in "class B2" is biased with a zero sig plate current of 148 ma., while the max sig plate current is 320 ma.! With 1250 volts on the plate.

The 845 is biased in class AB1 with a zero sig plate current of just 40 ma. and a max sig plate current of 240 ma. Also with 1250 on the plate.

The 838 produces 260 watts out, while the 845 makes only 115 watts. Which is expected since the 838 gets to run into grid current in B2.

The 211, a cousin to the 845, in class B2 sits at only 20ma and swings to 320ma, with 260 watts out. Yet its plate rating is "75 max watts"!!

The 805, again a cousin but to the 838 sits at 148 ma and swings to 400ma, also at 1250vdc on the plate for 300 watts out. The plate dissipation spec is "averaged over audio frequency cycle" according to the note and is 125watts max. While the Max-Signal Plate Input spec is 315 max watts.

So, this is a useful group of tubes to look at since their construction is very similar in most aspects, but their operation varies and shows some interesting details.

What I bring away with this is that my view of class A being at 50% is correct - but the proper terminology that applies to the 50% figure is not "plate dissipation" but rather "Maximum Signal Plate Input".

The latter being the most useful in terms of understanding the operation of the stage, and determining the maximum useful output of a given tube.

It would seem that the commonly used "Plate Dissipation" figure is actually reflective of the maximum DC quiescent power that the plate can handle steady state, not the maximum swing the tube is capable of - effectively it *is* the class A bias point if the tube is run with as much power as it can handle.
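The arithmetic behind those figures is just plate volts times plate current; here it is worked for the tubes above, using the numbers as quoted from the Sylvania manual (so treat them as illustrative):

```python
# Quiescent vs max-signal plate input for the transmitting tubes
# discussed above; figures as quoted from the Sylvania manual.
tubes = {
    #  name: (plate V, zero-sig mA, max-sig mA)
    "838": (1250, 148, 320),
    "845": (1250, 40, 240),
    "805": (1250, 148, 400),
}

for name, (vp, i0_ma, imax_ma) in tubes.items():
    print(f"{name}: quiescent input {vp * i0_ma / 1000.0:.0f} W, "
          f"max-sig input {vp * imax_ma / 1000.0:.0f} W")
```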

So we actually agree once the terminology is cleared up.

:D

_-_-bear :Pawprint:
 