6L6GC / 6BG6 x4 Ultralinear Guitar Amp

This is interesting, because it's the opposite of what I had thought:

I supposed that a lower resistance meant a higher driving voltage at a given preamp/driver level (and less accumulated distortion from previous stages), and less strain on the first half of the amp.

I assumed that high resistances would cause not only voltage loss, requiring higher voltages in the previous stages for a given loudness (and more distortion), but also much less linear behaviour from the output tubes, because of the 'voltage-divider' effect of the resistor in series with the grid-current path.

Of course I have never seen any proper detailed explanation of the interaction of the components as the output tubes are driven into grid-current conditions.

The little I could gather comes from the strange curves (when makers bother to provide them) as grids are driven positive.
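
To put rough numbers on that voltage-divider worry, here is a quick Python sketch. Every component value is an assumption chosen for illustration (a 12AX7-ish driver impedance, a modest grid stopper), not a figure from any particular amp:

Code:
# Rough illustration of the 'voltage-divider' effect once the grid starts
# conducting.  Every value below is an assumption, not from a real schematic.

def grid_voltage(v_drive, r_source, r_stopper, r_grid):
    """Voltage reaching the grid, treating the grid as a resistive load r_grid."""
    return v_drive * r_grid / (r_source + r_stopper + r_grid)

R_SOURCE = 38e3   # output impedance of the previous stage (assumed)
R_STOP   = 5.6e3  # grid stopper (assumed)
V_DRIVE  = 50.0   # peak drive from the previous stage (assumed)

# Below grid conduction the grid looks like ~1 Mohm: almost no loss.
print(grid_voltage(V_DRIVE, R_SOURCE, R_STOP, 1e6))   # ~47.9 V

# Once the grid conducts it looks more like a few kilohms (a forward diode),
# and the same divider now eats most of the drive.
print(grid_voltage(V_DRIVE, R_SOURCE, R_STOP, 3e3))   # ~3.2 V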

High impedance source (the driving stage) into a high impedance input, life is good. Get to 0 V on the grid and that high impedance turns into a low-impedance diode drawing current. In comes the coupling capacitor, and with it blocking distortion. Go to Class AB if you do not like the rules of the game. Transformer coupled sounds ideal. Mind you, you should probably run lower voltages and higher current then.
 
And a Fender, Marshall, Vox, or any of their offshoots are not musical?

Fender: Bright - metallic, harsh at high volumes. Lots of 'clean' power next to a Marshall, but meh,...not musical enough.

Marshall: lots of lovely messy breakup overdrive, that driven sound. Sure, musical if you can play the right stuff, like Howlin' Wolf. Would I own one? No. Why buy an amp so badly designed that I have to buy output tubes every 2 weeks?
Idiots.

Vox: I guess if I was playing the blues solo in a coffee house, this or some old tweed would be my choice. Lots of character, but the cliché is now played to death, like a sitar on a Beatles track.

Not talking about overloading output transformers; it's the tubes where the action is. Don't like sag? Guess you fall in line with the high-gain metal guys.

You're right: the tubes are where the action is.
That's why you shouldn't like a typical guitar amp,
with underpowered cheap output transformer,
with its flux mushing out like a fart from a tuba.

You mean power supply sag?
Sure that can be tasty.
In fact, load it up with tube rectifiers;
the loss of power will be more than compensated by
the clean, solid, regulated voltage, with its smooth slide downward under increasing load.

High metal guys?
I loved Lee's explanation of how Metal was shipwrecked by record company formulas. Awesome Youtube fun.



Well if you like thin preamp distortion.
It's only 'thin' if you leave it in the hands of drugged-up design-clowns.

I would prefer to roll my own thanks.

So it is a clean amp you want. Why not go SS?
If you want a clean amp, don't go solid state.
All solid state gear is shiite.
My tube amps blew Brystons out of the water in a straight A/B listen-off.


As said above, maybe tubes are not for you.

Guess I've been in the wrong business for 30 years...

Most designs nowadays take overloading of the grid inputs into consideration. What may look like RF mitigation is more tone shaping and nonlinear operation control.

This is an interesting observation/claim.

Can you give examples, and document it a little bit?
I'd love to read about how some amp designer chose a grid stopper based on tested class AB/B performance.
 
Under normal circumstances the grid input impedance is high, so the circuit behaviour is determined by the output impedance of the previous stage and the grid bias resistor. When overdriving causes grid current to flow the only resistance is the previous stage o/p Z in series with the grid stopper. Unless the grid stopper is large this circuit can charge the coupling cap quite quickly but it then has to discharge slowly through the grid resistor. If you want to avoid blocking you need a small grid resistor and a large grid stopper, so the time constant hardly changes as grid current flows. This would cause the voltage attenuation you describe, so is not usually done.

If NFB is present the situation can get more complicated because the previous stage might cutoff, so its output impedance shoots up to just the anode resistor value. This might help, because it augments the grid stopper during clipping.

If you want to avoid blocking, then either avoid overdriving or use a proper AB2 driving arrangement.
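
To put rough numbers on those two time constants, here is a small Python sketch. The component values (coupling cap, source impedance, stopper, grid-leak) are assumptions picked for illustration only:

Code:
# Sketch of the asymmetric time constants behind 'blocking' distortion.
# All component values are illustrative assumptions.

C_COUPLING = 22e-9    # coupling cap
R_SOURCE   = 38e3     # output impedance of the previous stage
R_STOPPER  = 5.6e3    # grid stopper
R_GRIDLEAK = 220e3    # grid (bias) resistor
R_GRID_ON  = 2e3      # rough grid-cathode 'diode' resistance when conducting

# Charge path while the grid conducts: source Z + stopper + conducting grid.
tau_charge = (R_SOURCE + R_STOPPER + R_GRID_ON) * C_COUPLING

# Discharge path once the grid stops conducting: only the grid-leak resistor.
tau_discharge = R_GRIDLEAK * C_COUPLING

print(f"charge    tau = {tau_charge * 1e3:.2f} ms")     # ~1.0 ms
print(f"discharge tau = {tau_discharge * 1e3:.2f} ms")  # ~4.8 ms

# The cap charges in about a millisecond of overdrive but needs several
# milliseconds to bleed off again, so the stage sits over-biased after the
# peak -- that recovery lag is the blocking effect described above.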
 
Fender: Bright - metallic, harsh at high volumes. Lots of 'clean' power next to a Marshall, but meh,...not musical enough.

As I asked, which is musical enough? Heck of a lot of good music out there made with Fender amps. Which sounds metallic and harsh, the Tweeds or the Blackfaces?

Fender Harvard+Silvertone 1448 - YouTube
G & L Telecaster ASAT Classic Part 2 clean - YouTube

Marshall: lots of lovely messy breakup overdrive, that driven sound. Sure, musical if you can play the right stuff, like Howlin' Wolf. Would I own one? No. Why buy an amp so badly designed that I have to buy output tubes every 2 weeks?
Idiots.

I know of musicians that actually make them last as long as a month. Imagine that?

Vox: I guess if I was playing the blues solo in a coffee house, this or some old tweed would be my choice. Lots of character, but the cliché is now played to death, like a sitar on a Beatles track.

Don't care for the Beatles much myself.
18watt EF86 custom combo amp - YouTube

You're right: the tubes are where the action is.
That's why you shouldn't like a typical guitar amp,
with underpowered cheap output transformer,
with its flux mushing out like a fart from a tuba.

You are playing the wrong amps.

You mean power supply sag?
Sure that can be tasty.
In fact, load it up with tube rectifiers;
the loss of power will be more than compensated by
the clean, solid, regulated voltage, with its smooth slide downward under increasing load.

Well there is hope for you.

High metal guys?
I loved Lee's explanation of how Metal was shipwrecked by record company formulas. Awesome Youtube fun.

?


It's only 'thin' if you leave it in the hands of drugged-up design-clowns.

I would prefer to roll my own thanks.


If you want a clean amp, don't go solid state.
All solid state gear is shiite.
My tube amps blew Brystons out of the water in a straight A/B listen-off.

Well since you were saying you have not heard a tube amp you liked...


Guess I've been in the wrong business for 30 years...

I've wasted a lot of my youth also.

This is an interesting observation/claim.

Can you give examples, and document it a little bit?
I'd love to read about how some amp designer chose a grid stopper based on tested class AB/B performance.

Not sure you understood me. Grid stoppers are used to control the frequency response and the harshness that can happen when you overdrive the grid. Not sure what you mean by tested Class AB/B performance.


I think I will enjoy seeing what you will come up with.
 
Under normal circumstances the grid input impedance is high, so the circuit behaviour is determined by the output impedance of the previous stage and the grid bias resistor. When overdriving causes grid current to flow the only resistance is the previous stage o/p Z in series with the grid stopper. Unless the grid stopper is large this circuit can charge the coupling cap quite quickly but it then has to discharge slowly through the grid resistor. If you want to avoid blocking you need a small grid resistor and a large grid stopper, so the time constant hardly changes as grid current flows. This would cause the voltage attenuation you describe, so is not usually done.

If NFB is present the situation can get more complicated because the previous stage might cutoff, so its output impedance shoots up to just the anode resistor value. This might help, because it augments the grid stopper during clipping.

If you want to avoid blocking, then either avoid overdriving or use a proper AB2 driving arrangement.

Some work has been done with zeners clamping the voltage on cathode-biased amps as well.
 
Under normal circumstances the grid input impedance is high, so the circuit behaviour is determined by the output impedance of the previous stage and the grid bias resistor. When overdriving causes grid current to flow the only resistance is the previous stage o/p Z in series with the grid stopper. Unless the grid stopper is large this circuit can charge the coupling cap quite quickly but it then has to discharge slowly through the grid resistor. If you want to avoid blocking you need a small grid resistor and a large grid stopper, so the time constant hardly changes as grid current flows. This would cause the voltage attenuation you describe, so is not usually done.

If NFB is present the situation can get more complicated because the previous stage might cutoff, so its output impedance shoots up to just the anode resistor value. This might help, because it augments the grid stopper during clipping.

If you want to avoid blocking, then either avoid overdriving or use a proper AB2 driving arrangement.

This is the most clear and succinct explanation so far.

But I have some issues with the analysis:

(1) The current is literally flowing out of the tube when there is grid current. It can go in TWO directions: into the input cap (at least momentarily, or for a time), and into ground (or into the negative bias supply circuit).

(2) Thus the previous stage Zout is not the 'only resistance'.

(3) Apparently the ability to charge the input cap without the complementary ability to discharge it is what you and others have been referring to as "blocking" or the "blocking effect/distortion". Like a stuffed-up nose, current is stifled and resistance rises rapidly.

(4) This is only relevant if one has a blocking cap (normally to keep out DC from the previous stage). It may be a better argument for direct or resistive coupling than against class AB/B operation.

(5) The "solution" of increasing the grid-stopper resistance as a 'cure' does appear ridiculous, and frequency-dependent too.

(6) As grid current is diverted into ground instead of back into the previous stage (now blocked by a full cap), the stability of the time constant seems irrelevant, because current flowing through the grid-leak resistor to ground drastically alters the bias, causing a potential runaway condition (a rough numeric sketch of the cap-charge side of this follows below).

(7) When such loss of control over the current flow occurs, even the loss of driver voltage becomes irrelevant as well.

(8) I think it's "not usually done" for more serious reasons than mere drive-signal attenuation.

(9) The advice to avoid overdriving (i.e., crossing the 0-bias line) or to redesign the circuit is great advice, but the steps need explicit expression.

(10) I would guess that control loss, undesirable current and voltage changes, and runaway tubes together make proper driver/output-stage interfaces mandatory.
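
For what it's worth, here is a toy Python model of the cap-charge part of point (6): how much extra 'bias' the coupling cap picks up during an overdrive burst, and how slowly it leaks away through the grid resistor. All values are assumptions for illustration, and it deliberately ignores grid-current heating and any runaway mechanism:

Code:
# Toy model of the bias shift left behind after a burst of grid conduction.
# All values are illustrative assumptions; no runaway/heating is modelled.
import math

C           = 22e-9    # coupling cap
R_CHARGE    = 45.6e3   # source Z + stopper + conducting grid
R_DISCHARGE = 220e3    # grid-leak resistor
V_OVERDRIVE = 30.0     # how far the drive tries to push the grid positive
T_BURST     = 2e-3     # the grid conducts for 2 ms

# Extra voltage the coupling cap picks up during the burst
dv = V_OVERDRIVE * (1 - math.exp(-T_BURST / (R_CHARGE * C)))
print(f"cap charged up by ~{dv:.1f} V during the burst")          # ~25.9 V

# That charge sits in series with the grid as extra negative bias and can
# only leak away through the grid resistor.
tau = R_DISCHARGE * C
for t_ms in (1, 5, 10, 20):
    left = dv * math.exp(-t_ms * 1e-3 / tau)
    print(f"{t_ms:2d} ms after the burst: ~{left:.1f} V of extra bias left")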
 
Well, yep I am coming from a 'hifi perspective'. And I love Jazz.
The goal for me is a really musical guitar amp.
If that means it ends up "outside the box" as far as previous guitar amps and experience goes, oh well, I won't be heartbroken.

This is a very entertaining thread, and I'm going to admit right off the bat that I haven't read every single post in this thread yet, but this one caught my eye. Right from the thread title my first thought was "Ultra-linear guitar amp? I can't imagine that will sound very nice.", and then I saw the quoted post. Isn't all of what makes tube amps "musical" and "pleasing to the ear" the fact that tubes are inherently non-linear? 2nd order harmonics are the key to great tube sound, especially in guitar amps. If you make a hifi style, ultra-linear guitar amp aren't you going to defeat a lot of that character?
 
As I asked, which is musical enough? Heck of a lot of good music out there made with Fender amps. Which sounds metallic and harsh, the Tweeds or the Blackfaces?

I guess that would be the Twin.


Well, the bass on this sounded pretty good, but the whole treble/upper mid was just noisy and distorted in a non-musical way.

Here's where the guitar belongs, far below the harmonica,
and with the treble properly turned off, when you're using
a crappy amp like that:

How Many More Years: http://www.youtube.com/watch?v=4Ou-6A3MKow

http://www.youtube.com/watch?v=A1FK620bS7A
I know of musicians that actually make them last as long as a month. Imagine that?

I know a lot of repairmen who've had to modify hundreds of dangerously crappy Marshalls, to stop them eating EL34s like cheeseys.

Don't care for the Beatles much myself.
18watt EF86 custom combo amp - YouTube

Well,

You are playing the wrong amps.
Well, you finally got a great tone on that last amp/link recording.
Rare for a Youtube vid.


Well there is hope for you.

Well since you were saying you have not heard a tube amp you liked...
except my own, it goes without saying...
But then, I haven't built any guitar amps.

I've wasted a lot of my youth also.
it goes without saying...

Not sure you understood me. Grid stoppers are used to control the frequency response and the harshness that can happen when you overdrive the grid. Not sure what you mean by tested Class AB/B performance.
Okay, so you envision grid-stoppers being selected and used
as treble-controls.
That's quite interesting too in its own way.
Again, I'd like to get to see some examples of commercial units designed that way.
I don't doubt that private experimenters have tried every design philosophy possible.


I think I will enjoy seeing what you will come up with.
Thanks for the advance vote of confidence!
 
Thanks for the Soldano example.

However, I think what we see here is what I talked about earlier in the thread:
Doing the tone shaping BEFORE the power stage.

All he's got on the grids of the Power tubes is 2.2k stoppers.
Nothing really like 'tone control by choice' there.

I certainly didn't deny that distortion/overdrive/compression is carefully crafted in guitar preamps, and apparently Soldano spent a lot of time manipulating grid stoppers in his.

But I was talking about a method of selecting the OUTPUT tube grid-stopper based on tube choice (aiming for a flat audio response with low distortion).

At this point, we seem to have gotten sidetracked into a discussion about the merits of capacitor-coupling a driver stage prior to the Output, due to issues with grid-current in positive swings.
 
Yes, he is doing the shaping in the preamp stage; it was more of an exercise.

That particular schematic produces bucketloads of high frequency harmonic content, so there is no shortage of treble despite the aggressive measures.

The Peavey Classic 30 has 47K grid stoppers.

I doubt you will hear much of a difference between grid-stopper values until you reach the point where they start to cut into the audible treble range. These are relatively low values, series connected.

Have a look here for some handy math.
The Valve Wizard

The preamp stages have different Miller capacitance and therefore need different values. At the end of the day it is the same result: similar levels of RF attenuation despite the different values for preamp vs power tubes.

So just use one high enough to attenuate RF to the desired level.
Go higher if you're really going to push the amp dirty or wire it like spaghetti.
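
The Valve Wizard math boils down to a single RC corner. Here is a quick Python sketch of it; the capacitance figures are ballpark assumptions (a power-tube grid plus strays, and a Miller-multiplied 12AX7 stage), not measured values:

Code:
# The low-pass corner formed by a grid stopper and the tube's effective input
# capacitance.  Capacitances below are ballpark assumptions, not measurements.
import math

def corner_hz(r_stop, c_input):
    """-3 dB point of the RC low-pass seen by the grid."""
    return 1.0 / (2 * math.pi * r_stop * c_input)

C_POWER_TUBE = 30e-12    # power-tube grid in pentode/UL connection + strays
C_PREAMP     = 150e-12   # 12AX7 stage with Miller multiplication + strays

print(f"2.2k into power tube : {corner_hz(2.2e3, C_POWER_TUBE) / 1e6:.1f} MHz")   # ~2.4 MHz
print(f"47k  into power tube : {corner_hz(47e3, C_POWER_TUBE) / 1e3:.0f} kHz")    # ~113 kHz
print(f"47k  into 12AX7 stage: {corner_hz(47e3, C_PREAMP) / 1e3:.0f} kHz")        # ~23 kHz

# A couple of kilohms on a power tube only rolls off RF; tens of kilohms on a
# Miller-heavy preamp stage gets down near the top of the audio band, which is
# why the 'right' value differs so much from stage to stage.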
 
Just as an aside, what do you guys think of adding a rectifier-tube,
not for rectification (I would leave the bridge in), but as a 'pass-tube'
just to drop some voltage and add a bit of sag to the power supply rail?

I know that sounds counter-productive but if there were some sonic advantages
(could it help in an overdrive situation? and with this PS transformer,
I have too many volts to risk driving it into class B2/cutoff with say 5881s...)

Also, maybe choking out the HV supply with a rectifier tube
would lower the risk of excessive screen-grid currents without resorting to bigger SG resistors,
which might harden the sound and remove the sought-after 'bounce'.
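
As a back-of-the-envelope check on the idea, here is a tiny Python sketch treating the added rectifier as a lossy series resistance. The raw B+, effective tube resistance, and current figures are assumptions for illustration only:

Code:
# Back-of-envelope sag estimate for a rectifier tube used as a series 'pass'
# element.  All figures are illustrative assumptions, not datasheet values.

def b_plus(v_raw, r_series, i_draw):
    """B+ after the series drop at a given current draw."""
    return v_raw - i_draw * r_series

V_RAW  = 480.0   # B+ from the existing SS bridge and filter
R_TUBE = 160.0   # assumed effective series resistance of the added rectifier

for label, i in (("idle      ", 0.10), ("loud peaks", 0.25)):
    print(f"{label}: {b_plus(V_RAW, R_TUBE, i):.0f} V")
# idle:       464 V
# loud peaks: 440 V  -> ~16 V static drop plus ~24 V of programme-dependent sag

# A power resistor would give similar numbers, but a vacuum diode's drop is
# nonlinear (roughly a space-charge law), so its effective resistance changes
# with current rather than staying fixed.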
 
What, like a passive tube? Not sure what you mean, have the tube rectifier connected in series with an SS bridge?

At the end of the day the ultimate fix for overvoltage is to install a Variac before the power transformer, and use a separate heater transformer.

Then you can dial it into whatever voltage you want, whenever you want. Even step it up 20% or so, depending on the model.
 
What, like a passive tube? Not sure what you mean, have the tube rectifier connected in series with an SS bridge?

Yes. A "pass-tube" meaning, not 'passive', but passing the current to the amp.

At the end of the day the ultimate fix for overvoltage is to install a Variac before the power transformer, and use a separate heater transformer.

Then you can dial it into whatever voltage you want, whenever you want. Even step it up 20% or so, depending on the model.

Well, that works great for your test-bench.
But I have never seen a schematic with a variac built in.
Kinda cute idea though!

But if you're gonna do that presumably for line-adjust,
why not automate it, so the user can't make a booboo?
 
See the 57 Deluxe RI schematic.
Ok I'll have a look.
Do you have a link?

Not sure how using resistors to drop the voltage will harden the sound though. The tube resistance causes a voltage drop. The resistor's resistance causes a voltage drop.

In regard to screen resistors,
I've read a lot of internet discussions on it,
and some of them actually had some 'meat', i.e., somebody actually tried different values in a commercial amp or two.

What I read left me with this impression:

(1) Although higher resistors limited screen current, something happened to the guitar 'sound'. SR values higher than 600 ohms made the output stage "too stiff", harsh and non-musical in the judgment of the players.

(2) This was thought at first to be some kind of distortion, but turned out to be the opposite! Specifically, the 'attack' was too sharp, possibly reproducing the dismal performance of the pickups or previous stages. No clear results could be given in regard to the 'decay' of a note, but since that was less important it was less noticeable.

(3) The cure was to go back to lower screen resistors, to preserve a 'softer' attack, but then make some other adjustment (presumably fixed bias adjustments or lower B+) to protect or enhance tube life.

I was really interested to find out that this approach (increasing screen resistors) was effective from an engineering standpoint but was disastrous musically.

Another case of solving a problem without regard to the overview and/or preserving some desirable flaw.
 
Well, that works great for your test-bench.
But I have never seen a schematic with a variac built in.
Kinda cute idea though!

But if you're gonna do that presumably for line-adjust,
why not automate it, so the user can't make a booboo?

The main reason is because it is cost prohibitive. The name escapes me at the moment, but there is at least one modern day manufacturer of a boutique guitar amp with a built in Variac.

Also once it's used to dial in the sound for a particular genre, the power transformer can be custom wound to simulate the voltage.

Some nice sound can be had by cutting the line voltage to ~90v from ~120v
Eddie Van Halen - Wikipedia, the free encyclopedia

I suppose a Variac could be modified with mechanical stops, or possibly by removing windings, to limit a user's adjustment.
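
The custom-wound-PT idea is just a ratio calculation once you know which line voltage you liked on the Variac. A quick Python sketch, with assumed stock secondary voltages and an assumed dialed-in setting:

Code:
# Scaling the power transformer so stock line voltage reproduces the sound you
# dialled in on the Variac.  All voltages here are illustrative assumptions.

V_LINE_NOMINAL = 120.0
V_LINE_DIALED  = 108.0          # the Variac setting you ended up liking
scale = V_LINE_DIALED / V_LINE_NOMINAL

stock_secondaries = {"HT": 330.0, "heater": 6.3, "bias": 50.0}
for name, v in stock_secondaries.items():
    print(f"{name}: {v * scale:.2f} V")   # HT 297.00, heater 5.67, bias 45.00

# Note the heater winding: scaling it down along with everything else is
# exactly the tube-life problem raised a few posts further on, so in practice
# you would keep the heater winding at 6.3 V and only scale the HT/bias.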
 
Ok I'll have a look.
Do you have a link?

I normally Google when I want it.

In regard to screen resistors,
I've read a lot of internet discussions on it,
and some of them actually had some 'meat', i.e., somebody actually tried different values in a commercial amp or two.

What I read left me with this impression:

(1) Although higher resistors limited screen current, something happened to the guitar 'sound'. SR values higher than 600 ohms made the output stage "too stiff", harsh and non-musical in the judgment of the players.

(2) This was thought at first to be some kind of distortion, but turned out to be the opposite! Specifically, the 'attack' was too sharp, possibly reproducing the dismal performance of the pickups or previous stages. No clear results could be given in regard to the 'decay' of a note, but since that was less important it was less noticeable.

(3) The cure was to go back to lower screen resistors, to preserve a 'softer' attack, but then make some other adjustment (presumably fixed bias adjustments or lower B+) to protect or enhance tube life.

I was really interested to find out that this approach (increasing screen resistors) was effective from an engineering standpoint but was disastrous musically.

Another case of solving a problem without regard to the overview and/or preserving some desirable flaw.
The screen resistors limit the current to the screens and, more importantly, reduce the voltage on them, so that when the plates drag down the voltage on the OT the screens do not end up higher than the plates. Since you are running ultralinear, the voltage on the screens drops more or less in relation to what the voltage on the plates is doing. At least that's my understanding; I do not know enough about ultralinear operation to design an output stage.

On regular guitar amps screen resistors are a compromise. They protect the output tubes from the likes of tube eating Marshalls.


Oh, and not that I know much about this either, but larger screen resistors would cause the voltage on the screens to drop more than lower resistors would. And the gain of the tube is less when the screens are run at a lower voltage, which leads me to believe that larger screen resistors would cause a compression effect.
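
To put rough numbers on that compression idea, here is a quick Python sketch. The screen-node voltage and the screen currents are ballpark assumptions for a 6L6-family tube, not measurements:

Code:
# Why a bigger screen resistor behaves like compression: screen current rises
# with drive, so screen voltage (and with it the tube's gain) falls just as
# you hit the amp harder.  All figures are ballpark assumptions.

V_SCREEN_NODE  = 400.0                                   # screen supply node
SCREEN_CURRENT = {"idle": 0.005, "hard drive": 0.030}    # amps per tube

for r_screen in (470.0, 1500.0):
    for condition, i_g2 in SCREEN_CURRENT.items():
        v_g2 = V_SCREEN_NODE - i_g2 * r_screen
        print(f"R_g2 = {r_screen:6.0f} ohm, {condition:10s}: screen at {v_g2:5.1f} V")

# 470R : the screen sags only ~12 V between idle and hard drive.
# 1k5  : it sags ~38 V, pulling gain down with drive -- the compression
#        effect suggested above, plus more protection for the tube.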
 
...

Some nice sound can be had by cutting the line voltage to ~90v from ~120v
Eddie Van Halen - Wikipedia, the free encyclopedia

I've been told repeatedly that this was one of Van Halen's (off the cuff interview) pranks.

That is, he really didn't use a variac,
unless perhaps experimentally in the studio once, and it's not part of his 'sound'.

Those running out to try this ought to take serious heed of the obvious warnings, such as:

(1) A variac on most amps would also drop the heater voltage,
and this directly contributes to tube death.


So much so that RCA did a long, comprehensive study
and warned designers not to vary the heater voltage
by more than 4% from the recommended values:
LOWER VOLTAGES especially KILL TUBES QUICKLY.

[Chart: heater voltage affecting valve life]


With 85% of the rated voltage on a tube, it goes from a 5,000-hour+ tube to a 3-hour tube!
Letting your tech experiment with a variac on your amp for a couple of hours will cost you a whole set of tubes within weeks of installation!

Don't let the heater voltage on your tube drop below 96% for any extended length of time:

12.6 V heaters (signal tubes): greater than 12.1 volts or else! (Never use 12 volt regulators (7812) without resistor adjusts.)

6.3 V heaters (power tubes): greater than 6.05 volts or else! (Never use 6 volt regulators (7806) without resistor adjusts.)


Thus rather than a Variac, for most amps, we really want an ANTI-Variac (surge/voltage regulator) that provides correct heater voltage regardless of MAINS or LINE INPUT.
To do the opposite is to destroy tubes, drastically shorten tube life, and is only sensible if you are an idiot millionaire.

"cutting the line voltage to ~90v from ~120v" may give you some 'nice sound',
but it would cut your heater voltages to about 75%,
and for any length of time, cut your tube-life expectancy to only a few hours!

You can get the same results without destroying your tubes in several other ways (which we'll explore later).
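
The arithmetic behind that warning is simple proportionality, since on most amps the heater winding tracks the line voltage. A quick Python sketch:

Code:
# Heater voltage vs. line voltage: on most amps the heater winding simply
# scales with the mains, so dropping the line drops the heaters with it.

V_LINE_NOMINAL = 120.0
HEATERS = {"6.3 V": 6.3, "12.6 V": 12.6}

for v_line in (120.0, 115.0, 108.0, 90.0):
    ratio = v_line / V_LINE_NOMINAL
    row = ", ".join(f"{name} heater -> {v * ratio:.2f} V" for name, v in HEATERS.items())
    print(f"line {v_line:5.1f} V ({ratio * 100:3.0f}%): {row}")

# 115 V line (~96%) puts the heaters right at the ~4% floor quoted from RCA;
# 108 V (90%) is already well outside it; 90 V (75%) leaves a 6.3 V heater
# down around 4.7 V, which is the tube-killing territory described above.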
 