• WARNING: Tube/Valve amplifiers use potentially LETHAL HIGH VOLTAGES.
    Building, troubleshooting and testing of these amplifiers should only be
    performed by someone who is thoroughly familiar with
    the safety precautions around high voltages.

Radford bias: options for ultralinear output

Status
This old topic is closed. If you want to reopen this topic, contact a moderator using the "Report Post" button.
Member
Joined 2014
Paid Member
One of my daft questions. I really shouldn't try and think when I've been up all night changing nappies, but thoughts turned to the Radford STA-25 that self-destructed one Chicago winter's evening in 1996 and really needs to be brought back to life. Every so often I do some research, come to a decision, park it through lack of funds and 6 months later pick it up again.

So last night's thinking was about bias and I decided to start by looking at LED bias by reading SY's fine Red Light District articles. Looks good, but runs into problems with ultralinear connections (I rewired the Radford to triode back in the 80s when running efficient speakers, but on the rebuild want to go back to UL). So possibly back to the drawing board...

At this point I thought I had better check how Radford did it back in the 60s, and this is where I really should have waited until I had had some sleep. This is very confusing as it appears to use BOTH cathode and fixed bias. On the cathodes is a 39 ohm resistor bypassed with 250uF, 1M grid leaks to ground AND a negative bias on the control grids. Biasing is done by setting 2V across the cathode resistors (to give 50mA). Never seen this before, so I can only assume Radford didn't trust users to set the bias accurately and added a belt to the braces. But unless someone can enlighten me, it does seem the worst of both worlds?
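A quick sanity check of those numbers (a rough Python sketch using only the values quoted above; the bypass-corner figure ignores the tube's 1/gm, so the real corner is a bit higher):

    import math

    R_k = 39.0      # cathode resistor, ohms (as described above)
    V_k = 2.0       # voltage set across it at the bias point, volts
    C_k = 250e-6    # bypass capacitor, farads

    I_k = V_k / R_k                              # cathode current = anode + screen current
    f_bypass = 1.0 / (2 * math.pi * R_k * C_k)   # lower-bound bypass corner, ignoring 1/gm
    print(f"cathode current ~{I_k*1000:.0f} mA")   # ~51 mA, i.e. the nominal 50 mA
    print(f"bypass corner ~{f_bypass:.0f} Hz")     # ~16 Hz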

The later Renaissance redesigns by Woodside DO use a standard fixed bias setup (and 6550 instead of EL34), but have a screwy measurement setup that requires someone to have correctly set pots in the factory! I have seen references to a fully active bias setup that Morgan Jones did, but don't have his book yet to see if that has wings.

Which leaves me back at square one and wondering which way to go. The PCBs are too flaky to reuse, so the restoration will be new PCBs, which gives me the freedom to do anything I want as long as it fits in the case and doesn't require new iron.
 
If you want the lowest distortion figures, you need to use fixed (grid) voltage bias. But that will not self-compensate for tube ageing as cathode resistor bias will, it leaves the tubes liable to self-destruct if the bias supply fails, and it can result in a tube runaway condition.

Resistor bias is safe and reliable, but as the bias then tends to increase with signal level (due to grid voltage/anode current transfer curvature), it results in a little more distortion and poorer overdrive characteristics. Whether you can actually hear the increased distortion in a carefully designed amp is another matter.

However, tube manufacturers always recommended that some cathode resistance be included. A volume manufacturer will always comply with the tube-maker's recommendations. Then if a batch of amplifiers has an unusually large number of tube failures, there is no debate about whose fault it is, and why it happened. In large-scale manufacturing, an engineer who thinks up an innovation using a tube in an unusual way would always ask the tube manufacturer to comment. Otherwise he leaves himself open to getting the sack if failure rates are high. Or to lawyers sucking out the profits of everyone.
 
Member
Joined 2014
Paid Member
However, tube manufacturers always recommended that some cathode resistance be included. A volume manufacturer will always comply with the tube-maker's recommendations.

Understood, but this is all the elements of cathode bias (cathode resistor, bypass cap and grid leak to ground) with fixed bias on each grid. Not seen it before, so I assumed it was something special to Radford. The Mk2 was almost a direct copy of the Williamson so used cathode bias.
 
Member
Joined 2014
Paid Member
Why does LED bias not work satisfactorily with UL?

You still need to adjust bias current by another means. SY used g2 on a pentode for this, which isn't available with a UL connection, so you need to have a control grid bias on each tube, which means you have to have fixed bias anyway. So there is the question of whether the benefits outweigh the complexity. I've not done the sums but am assuming if done right you use the LEDs to get to a safe low bias, then only need a couple of volts of adjustment.

Or I have missed something and got confused which is most likely!
 
Member
Joined 2014
Paid Member
Been thinking about this a bit more and looking at some of the autobias solutions offered, such as the Curcio and Tentlabs offerings. I also have a much better understanding of why some swear by cathode bias!

The big negative for grid bias, other than parts count (all cheap parts though), is that there is a nasty failure mechanism. It did get me wondering. If you used LEDs to bias at a higher potential than the nominal operating point (i.e. a safe operating level with low Ia) you could then use a relatively small positive grid bias to get you to the operating point.

At first inspection this seems awful, say biasing to -40V with the LEDs then reducing that bias voltage with the grid, but at least it has some degree of failsafe. Although a protection circuit that trips power on a bias fault would be fewer components...
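Just to make the failsafe argument concrete to myself, a back-of-envelope sketch (Python; all the numbers are hypothetical, not Radford values):

    V_cathode  = 40.0   # cathode held up by an LED string (hypothetical value)
    V_grid_on  = 4.0    # small positive grid supply to reach the operating point (hypothetical)
    V_grid_off = 0.0    # grid falls to 0 V via the grid leak if that supply fails

    print("normal Vgk:", V_grid_on - V_cathode)    # -36 V: the intended operating bias
    print("fault  Vgk:", V_grid_off - V_cathode)   # -40 V: the tube runs colder, not hotter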
 
The thing is - what do you hope to GAIN by adopting a more complex biasing scheme (counting any protection/backup/auto shutdown circuit as part of the complexity)?

What do you hope to gain, specifically:-
1) Less harmonic distortion?
2) A less objectionable overdrive characteristic?
3) Complete freedom from post-overdrive paralysis?

Clearly, you LOSE in regard to reliability, serviceability and cost.

Let me be clear: I'm not dead-set against fixed bias, or shutdown schemes. What I'm saying is that you should be clear in your thinking about why you want to go for something more complex than simple cathode resistor biasing.

1. Distortion: While the lower distortion possible with fixed bias is measurable on instruments, many people cannot hear the difference. And the difference can be made up elsewhere, or lost, by subtle design issues with negative feedback (Zobel networks), by not using an optimal UL tapping for the tubes, etc.

2. Overdrive characteristics: Again, careful attention elsewhere can improve this, and non-careful design or ignorance can make it worse. In guitar and PA amps, where overdrive characteristics are vitally important, careful attention to the driver can do wonders.

3. Post-overdrive paralysis: Even with cathode resistor bias, sensible design can make this a complete non-issue.

So, what do you hope to gain?


There have been amplifier designs that employ a microprocessor to sense the long term drift in tube characteristics, and adjust grid bias voltage accordingly, with a rate of change restricted to mV per hour. And once you have a micro, you can design the system so that any loss of the bias supply or microprocessor function shuts down the amplifier.

You then have the ultimate: The distortion and overdrive advantages of fixed grid voltage bias, plus automatic compensation for tube aging and tube unbalance, without the risks.
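Not something I've ever built, but the slew-limited idea would look roughly like this (conceptual Python only; measure_cathode_current(), set_grid_bias() and get_grid_bias() are stand-ins for whatever sensing and DAC hardware a real design would use):

    import time

    TARGET_MA   = 50.0    # desired cathode current per tube (illustrative)
    MAX_STEP_MV = 1.0     # each correction limited to 1 mV
    PERIOD_S    = 60.0    # one step per minute -> tens of mV per hour at most

    def auto_bias_loop(measure_cathode_current, set_grid_bias, get_grid_bias):
        """Slew-limited bias servo: nudge the grid voltage very slowly toward the target."""
        while True:
            error_ma = measure_cathode_current() - TARGET_MA
            # clamp the proportional correction to the slew limit (in millivolts)
            step_mv = max(-MAX_STEP_MV, min(MAX_STEP_MV, error_ma))
            # running hot (positive error) -> make the grid bias more negative
            set_grid_bias(get_grid_bias() - step_mv / 1000.0)
            time.sleep(PERIOD_S)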

But have I ever considered doing this? No - it's just not worth my while. I'd rather spend the money on a better output transformer.
 
Member
Joined 2014
Paid Member
Well right now I'm still deciding on the path I take to get it running again. 17 years is quite long enough for it to be languishing. I could just spend money, buy Mk4 PCBs populated and fit them. I would then feel dirty and lazy, but it would be working. So I would rather redesign within the limits of the packaging and iron and see how much (if at all) better it could be made.



The STA-25 was not shabby by any standards of the day, with 0.1% THD at 25W and 1% at 36W. These days I am sure someone would advertise it as a 40W amp! Before it died it provided many years of very satisfying music. It has some issues:

1. too much gain (easily fixed)
2. funky splitter needs matching (easily fixed)
3. built to safety standards of the time (less easily fixed but can be helped)

Really just throwing this out for discussion. Not touched tube circuits since it died, so going back to first principles on everything and seeing what comes out. And right now I don't know where the limits are on this design.
 
Member
Joined 2014
Paid Member
A copy of Morgan Jones (3rd edition) arrived last night and on a first very quick scan he has one design that is very obvious when you look at it, which is to use the driver stage DC coupled to supply bias. Other than requiring a negative HT rail (which I sadly would have huge trouble fitting in the metalwork I have) it is neat.

More thinking required...
 
Unless there is some form of DC feedback to stabilise operating conditions, DC coupling the driver stage is NOT a good idea. And even if there is DC feedback it's not a good idea even then, unless there is a method of making the DC loop gain much greater than the AC loop gain - and that of course means a capacitor affecting the signal path somewhere.

There is not much point in DC coupling a tube amp anyway, if there is an output transformer.

To ensure that the amplifier is stable at the low frequency end of the audio range, and to eliminate non-linear distortion at the low end (where it matters the most), the output transformer should not be what determines the amplifier low frequency cut-off. That means it must be set somewhere else - a grid coupling capacitor.

All well designed tube amps have their LF cutoff set by the coupling capacitor between the driver (or phase splitter if that is also the driver, as is most often the case in quality audio) and the output stage. That a) ensures stability, b) minimises transformer distortion, and c) furnishes a good overdrive characteristic and freedom from post-overdrive paralysis.

The output transformer, all other capacitors within the negative feedback loop, and all cathode bypass capacitors, should be selected so that their LF rolloff is much lower than the output stage grid coupling capacitor. So you must actually have a coupling capacitor.
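To put concrete numbers on that staging (a Python sketch; the 1 Mohm grid leak is the value quoted for the Radford earlier, the capacitor values are just examples, not anyone's design):

    import math

    def f3(r_ohms, c_farads):
        # -3 dB frequency of a simple RC high-pass: f = 1 / (2*pi*R*C)
        return 1.0 / (2 * math.pi * r_ohms * c_farads)

    # Output-stage grid coupling cap into the 1 Mohm grid leak sets the LF cutoff:
    print(f"22 nF into 1 Mohm:  {f3(1e6, 22e-9):.1f} Hz")    # ~7 Hz
    # Everything else (earlier couplings, bypasses, the OPT) should roll off well below that:
    print(f"100 nF into 1 Mohm: {f3(1e6, 100e-9):.1f} Hz")   # ~1.6 Hz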

No tube manufacturer ever recommended DC coupling in an audio amp. Because with it, drift of driver characteristics as the driver tube ages matters a lot. A heck of a lot. Without DC coupling, it really doesn't matter at all, until the tube's emission has just about completely failed.

While DC coupling is good in solid state amps (transistor aging is entirely negligible) and saves money, in a tube amp it's stupid.


While I don't have Morgan Jones's book, I am familiar with his writings/ramblings in various audio/electronics magazines, and from excerpts published here and there. He is not a guru. As far as tube audio is concerned, despite being well known and selling quite a few books, he is an amateur.
 
Member
Joined 2014
Paid Member
He does address all of that. I was not presenting him as a guru, just as an interesting option for further study. He addresses all of the points you have listed including exceedingly good overdrive and recovery and it appears to be a valid option for consideration if you can deal with the tube count and need for a negative supply.

Should also note there is no global negative feedback.

It's an option to consider. The increased cost over the std textbook designs means it might be discounted, and others will write it off just because it uses silicon CCS but it is not a 'bad' design on first inspection. It may not offer any huge benefits over methods that were first implemented in the 40s but I don't see a need to dismiss it the way you have.

I am interested in any references to how you deal with overload with cathode bias schemes though. Been hunting around and not found anything yet.
 
He addresses all of the points you have listed....

Perhaps.

Should also note there is no global negative feedback.
Oh, right.

It's an option to consider. The increased cost over the std textbook designs means it might be discounted,
Damn right it's discounted. The money is better spent on the output transformer, or on the loudspeaker where it can actually do some good.

Spending money on something that does NO good is stupid.

and others will write it off just because it uses silicon CCS
Ah, I thought that might be the case. I thought we had already shown why that is stupid.

How has he prevented the CCS(s) from destroying forward gain then? By use of a bypass capacitor presumably. So there is a capacitor or two in the signal path and it is really a bad form of AC coupling after all.

It may not offer any huge benefits over methods that were first implemented in the 40s but I don't see a need to dismiss it the way you have.
That is precisely why I dismiss DC coupling in tube amps. No benefits at all, and costs more.

Note that DC coupling in tube amps is an idea that came in the 1920s as an alternative to transformer coupling, made possible by the development of comparatively high-gain tubes (the earliest tubes had very low gm and cost a lot of money, so step-up coupling transformers were used to get enough gain). And it fell out of favour almost immediately, when engineers realised you could use a marvellous new low-cost linear device - a coupling capacitor.


I am interested in any references to how you deal with overload with cathode bias schemes though. Been hunting around and not found anything yet.

This is something that isn't covered well in recent (last 20 years) books, but was well understood by engineers who worked in the days prior to solid state.

Overload problems come with cathode resistor bias due to a rise in cathode voltage causing overbias and even tube cutoff. The rise comes from tube characteristic curvature, and can also arise from too low a transformer primary impedance. Or too low a transformer primary inductance (which will cause other problems).

If each cathode of the push-pull tube pair has its own cathode bias resistor, and the cathodes are cross-coupled by a capacitor, the problem is completely removed. Cross coupling can be done with a single large non-electrolytic, or by two series-connected back-to-back electrolytics with the midpoint (cap negative) grounded via a large value resistor.

Poor overdrive characteristics and recovery problems can also arise if the driver-output tube grid circuit isn't sensible. A big part of why this isn't covered in textbooks is that what was sensible to old-time circuit engineers was also obvious to them. But not to modern folk like MJ it seems.

The problem comes about when the coupling capacitor is charged by grid current. Grid current occurs when tubes are overdriven positive. Obviously if there cannot be enough drive to cause grid current, there cannot be a problem. This is the approach taken in the best guitar amps engineered by long-standing reputable manufacturers. The driver is designed to overload at the same time or just after the output stage - there then cannot be any grid current.

The problem is exacerbated if the coupling capacitor is too large. As I said before, the amplifier LF cutoff should be set by the driver/output stage coupling cap - all other caps set for much lower rolloff. In otherwise sensibly designed amps this will eliminate audible problems in reproduction, as the cap will then be somewhat small. Guitar amps need more thought.

The problem is exacerbated if the coupling cap charge path is lower resistance than the discharge path. This will be so in a basic circuit, because the charge path involves a forward-biased output stage grid, while in discharge the grid is reverse biased and is blocking, leaving only the grid leak. The solution is a resistance in series with the grid. Modern folks think this is a "grid stopper" for stopping oscillation. It does stop oscillation, and with large output pentodes is essential for that reason. The value required to stop oscillation is quite low - perhaps as low as 10 kohm or even 2 kohm. But if it is made as high as 100 kohm or more, the improvement in overdrive behaviour from reducing grid current can be audibly obvious.
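Rough numbers for that asymmetry (Python sketch; the 1 Mohm grid leak is from the Radford values quoted earlier, the driver source impedance and cap value are assumed for illustration only):

    C_couple = 22e-9    # coupling cap (example value)
    R_leak   = 1e6      # grid leak (value quoted for the Radford)
    R_driver = 10e3     # driver source impedance during grid conduction (assumed)
    R_stop   = 100e3    # large series grid resistor, as suggested above

    # During overdrive the cap charges through the driver plus any series grid resistance
    # (the forward-biased grid itself is low impedance); it discharges through the grid leak.
    tau_charge_bare = R_driver * C_couple             # ~0.2 ms: charges up almost instantly
    tau_charge_stop = (R_driver + R_stop) * C_couple  # ~2.4 ms: charging slowed, less bias shift
    tau_discharge   = R_leak * C_couple               # ~22 ms: recovery time constant either way
    print(tau_charge_bare, tau_charge_stop, tau_discharge)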
 
Member
Joined 2014
Paid Member
I never said the AMPLIFIER was DC coupled, just the driver to output. The capacitors are between the input differential pair and the driver. Every decision is justified and therefore valid for consideration even if it would not be cost effective for a production product where more is spent on the blingy front panel than the parts.

Understand what you say on cathode bias, but don't get how that deals with overload recovery when you have to discharge those cathode bias capacitors?

not sure if you have a thing against Jones or a thing against silicon, but hey, valve amps for hifi are a daft thing and different views are good :). Guitar amps are not my thing so designing something to be overdriven on purpose is not something I worry about.
 
I never said the AMPLIFIER was DC coupled, just the driver to output. The capacitors are between the input differential pair and the driver.
My comments apply to DC coupling between the driver and the output stage.

Every decision is justified...
If you say so. It must be hard to justify something that costs more and offers zero improvement.

... and therefore valid for consideration even if it would not be cost effective for a production product where more is spent on the blingy front panel than the parts.
There is no doubt that front panel appearance matters in retail products. I actually have a background in professional electronics - recording studios, radio stations, and the like. In the audio products in this market, the electronics inside is impeccable, but the outside appearance is battleship-grey painted mild steel.

But I also have experience in amplifier engineering for the retail market. In this market manufacturers employ, and have pretty much always employed, two distinctly different people:-

a) electronic engineers (like me) whose job it is to engineer the circuit - by engineer I mean getting the best possible performance for the least amount of money, with the cost/performance tradeoff appropriate for the target market;

b) industrial designers, whose job it is to design the external appearance to secure maximum attractiveness for the least amount of money - with a cost/attractiveness tradeoff appropriate for the target market.

As an engineer, I don't make any claim to be able to design, nor do the designers I know claim to be able to engineer. But I do know that the cost pressures are just as much on them as they are on me.


Understand what you say on cathode bias, but don't get how that deals with overload recovery when you have to discharge those cathode bias capacitors?
The discharge path is determined by the cathode resistor. That's a limitation, and from this point of view alone no cathode resistor at all (i.e. fixed grid voltage bias) would be better.

The thing is, any form of constant-current biasing is higher impedance and therefore worse.

Also, in sensibly designed amplifiers (sensible grid circuits, correct output transformer), the variation in cathode voltage you do get with cathode resistor bias causes distortion so low that most people cannot detect it.
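Using the values quoted for the Radford at the start of the thread, the recovery time you're asking about is of this order (rough Python sketch, ignoring the tube's own cathode impedance, which only speeds it up):

    R_k = 39.0      # per-cathode bias resistor (from the first post)
    C_k = 250e-6    # bypass capacitor (from the first post)

    tau = R_k * C_k   # the overcharged bypass cap discharges back through R_k
    print(f"tau ~{tau*1000:.0f} ms, roughly settled after {5*tau*1000:.0f} ms")  # ~10 ms / ~50 ms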

not sure if you have a thing against Jones or a thing against silicon, but hey, valve amps for hifi are a daft thing and different views are good :).
1. I consider Morgan Jones best ignored. There are plenty of authors who actually DID know what they were doing. Neville Thiele wrote authoritatively on this very topic, for example.

2. In my day job I have always engineered in silicon. But I get personal satisfaction out of tube-based equipment. Building a good tube amp will always cost more than a good solid state amp. But that in no way justifies being silly, and spending money on things (e.g. CCS biasing) that do no good and can even make performance worse.


Guitar amps are not my thing so designing something to be overdriven on purpose is not something I worry about.

Of course, so long as you realise that even in reproduction of recorded music, virtually all domestic amplifiers are overdriven part of the time. This is due to the dynamic range of music, and its high peak-to-average power ratio.
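To put a number on it (the 15 dB crest factor below is just an illustrative figure for typical programme material, not a measurement):

    crest_factor_db = 15.0    # assumed peak-to-average ratio of the programme
    avg_power_w     = 2.0     # a modest listening level into efficient speakers
    peak_power_w    = avg_power_w * 10 ** (crest_factor_db / 10)
    print(f"peaks demand ~{peak_power_w:.0f} W")   # ~63 W: a 20 W amp clips them, 100 W does not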

At volume levels that are not particularly loud, an amplifier (especially a tube amplifier) will always sound nicer if it overdrives cleanly without any recovery effects.

One of the things I like to do is demonstrate an amplifier of 20 watts per channel rating, impeccable distortion performance, and efficient speakers, with an oscilloscope across the speaker terminals. Using a typical CD source, I ask people to tell me when it's clipping or distorting, as I slowly turn up the volume from nothing. People vary, but most don't react until clipping is obvious and frequent as shown on the CRO.

Then I demonstrate on the same system a typical classical recording and a recording I select for its demands on amplifiers. I play them both at conversation-level volume, not loud. The first plays perfectly. The second causes intolerable distortion, but is fine on a 100 watt system (same speakers).
 