• WARNING: Tube/Valve amplifiers use potentially LETHAL HIGH VOLTAGES.
    Building, troubleshooting and testing of these amplifiers should only be
    performed by someone who is thoroughly familiar with
    the safety precautions around high voltages.

How hot is "too Hot" for a transformer?

Status
This old topic is closed. If you want to reopen this topic, contact a moderator using the "Report Post" button.
I've just gotten this Heathkit AA-121 to the point of being functional (replaced the selenium rectifier and the coupling caps). After approximately 1.5 hours of use, the power transformer is what I would describe as "uncomfortable" to touch for more than 10 seconds or so. The output transformers are cooler, but still very warm. I know that Stereo 70 power transformers run very hot (not sure about the outputs), but I was wondering if anyone can give me some feedback on whether A) this is typical for this amp and B) I should be at all concerned.

The power tubes are Electro-Harmonix EL34s (yes, I was a cheapskate and bought a $45 set from eBay listed as "Unused without retail packaging", though the seller was reputable with lots of good feedback).

I have not yet replaced the main filter caps, but they are only a bit warm to the touch (much cooler than the chassis and transformers), and I suspect that they are being heated by the tubes which are pretty close to them.

The tubes are biased at 1.25 V on the cathode (0.25 V less than what the Heathkit manual recommends). It was a bit hotter with them at 1.5 volts. Each tube has a 6 ohm resistor (I used 1 W wirewound), and those both connect to the same 12 ohm resistor, so 18 ohms total. I calculated that I'm running around 16 W per tube (35 mA or so), which I know is pretty cold for an EL34.

Please correct me if I'm wrong (relatively new to tube amps), but I feel like with the tubes biased this way, I shouldn't be getting this much heat from the transformers.

Would adding 100 ohm 1 W resistors on the screens be a good idea? Output transformers for this amp are as rare as hen's teeth, so it's definitely preferable to extend their life.

One more thing - I eliminated the goofy phase-switching feature on the A channel speaker outputs because it had a bad switch. I can't imagine this causing any problems, but just so it's known. Both channels are wired the same on the outputs.

Thanks for all the help. There isn't a whole ton of information on these amps out there.
 
I doubt that 'uncomfortable to touch' is going to be a concern for the transformer itself.

Your concern can be approached from two angles: (a) assessing the internal temperature of the winding and insulation, which relates to whether the transformer can operate safely, and (b) measuring the external touch temperature of the transformer's surface, which relates to your safety when near the amplifier.

Internal temperature is formally measured either by inserting a thermocouple into the winding or into the space between winding and core, or by a measured-resistance method (you can look that up). In general, the winding hot spot is rated for about 105 °C with the lowest Class A insulation, and up to 180 °C or more for Class H - of course the problem with vintage transformers is the doubt as to what insulation was used, and whether the transformer has suffered stress over the decades that may have degraded that insulation.

As a user you may be able to touch the transformer, so it is a hazard if it is hot. That issue mainly relates to whether you would normally be required to touch the part during operation, whether it could be touched incidentally, or whether it carries warning signage as a hot part (e.g. an external heatsink normally has a hot-hazard label). The allowable temperature rise of the transformer surface in your case is likely to be at least about 45 °C above ambient - which I'd suggest exceeds your 'uncomfortable' level :)
 
I've just gotten this Heathkit AA-121 to the point of being functional (replaced the selenium rectifier and the coupling caps). <snip>

For most people, if the surface is over about 130 °F (roughly 55 °C), they can only touch it for a few seconds before they feel like they have to pull their hand away. So if you can touch yours for 10 seconds or so, it is highly unlikely the temperature is even 130 °F. And even cheap transformers can work fine at temperatures well above that. Also, much of the heat is simply being absorbed because the transformer sits in close proximity to hot tubes.

Your power transformer sounds fine to me.

And the output trafos don't generate much heat - so what you are feeling on them is likely just absorbed from hot parts/tubes around them.

One last thing - IMHO, if those can caps are originals they are on borrowed time. Replace them ASAP to prevent future problems. I just don't trust 50-year-old caps.
 
Would adding 100 ohm 1 W resistors on the screens be a good idea? Output transformers for this amp are as rare as hen's teeth, so it's definitely preferable to extend their life.
I'd suggest the first level of protection is to add a secondary HT fuse, and determine what diodes are in the HT supply as they would be early vintage and may be suspect over time. Adding a fuse requires information on the transformer winding resistances, and a little bit of design effort, which can be done in this thread if you are keen.
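As a rough illustration of the sort of design effort involved, here is a minimal sketch of how one might size that secondary HT fuse. All the numbers are placeholder assumptions (a rule-of-thumb form factor for a capacitor-input filter, a guessed B+ load current), not values taken from the AA-121; the real starting point is the schematic and a measurement of the actual HT draw.

```python
# Rough HT secondary fuse estimate - a sketch only; every number below is an
# assumption for illustration, not a measured value for the AA-121.

STANDARD_FUSES_A = [0.25, 0.315, 0.4, 0.5, 0.63, 0.8, 1.0, 1.25, 1.6, 2.0]

def ht_fuse_estimate(dc_load_a, form_factor=1.8, margin=1.5):
    """Suggest a slow-blow fuse rating for the HT secondary.

    dc_load_a   : estimated total B+ current at full output (assumed)
    form_factor : secondary RMS current / DC load current for a
                  capacitor-input filter (rule of thumb, roughly 1.6-2.0)
    margin      : headroom so the fuse survives normal operation and warm-up
    """
    rms_a = dc_load_a * form_factor      # approximate secondary RMS current
    target_a = rms_a * margin            # minimum acceptable fuse rating
    for rating in STANDARD_FUSES_A:      # pick the next standard value up
        if rating >= target_a:
            return rating
    return STANDARD_FUSES_A[-1]

# Hypothetical example: ~200 mA total B+ draw
print(ht_fuse_estimate(0.20), "A slow-blow")   # -> 0.63 A slow-blow
```

A fuse chosen this way mainly protects the transformer against a downstream short or a failing tube pulling sustained overcurrent; it won't react to a modest long-term overload, which is where the winding-temperature checks discussed above come in.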

Yes adding a screen stopper may reduce stress on the output transformer for certain tube failures.
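On the 100 ohm screen stoppers: in normal operation they dissipate very little, so a 1 W part is comfortably rated; their value is mostly in limiting current into the screen when a tube misbehaves. A quick sanity check, using assumed EL34 screen currents rather than anything measured in this amp:

```python
# Rough dissipation check for a 100 ohm screen-grid stopper - assumed EL34
# screen currents, not measurements from this amplifier.

R_STOPPER = 100.0   # ohms, the value proposed in the thread

def stopper_dissipation_w(i_screen_a):
    """I^2 * R dissipation in the screen stopper at a given screen current."""
    return i_screen_a ** 2 * R_STOPPER

for label, i in (("idle (assumed ~10 mA)", 0.010),
                 ("heavy drive (assumed ~40 mA)", 0.040)):
    print(f"{label}: {stopper_dissipation_w(i) * 1000:.0f} mW")
# idle (assumed ~10 mA): 10 mW
# heavy drive (assumed ~40 mA): 160 mW
```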

One more thing - I eliminated the goofy phase-switching feature on the A channel speaker outputs because it had a bad switch. I can't imagine this causing any problems, but just so it's known. Both channels are wired the same on the outputs.
I would agree that that 'feature' becomes a risk over time, as the output could become unloaded from a bad switch.
 
Do the 10 sec test... if you are able to stand the heat with both hands on the traffo, then that is not so bad at all...

But best to get yourself a handheld pyrometer (non-contact infrared thermometer),
taking note of your ambient as a reference point and then pointing it at your traffo to take the actual temperature...
you will then get the temp rise, i.e. actual reading minus ambient reading.
You can then ask the manufacturer of your traffo what the temp-rise spec is...
this will help you decide whether you have an issue or not...
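For what it's worth, the arithmetic is just a subtraction and a comparison against whatever rise the maker allows; the 45 °C figure below is only the rough guide mentioned earlier in the thread, not this transformer's spec:

```python
# Quick temp-rise check from a handheld IR thermometer reading - a sketch.
# The 45 C allowance is the rough guide from earlier in the thread, not the
# manufacturer's spec for this transformer.

def surface_temp_rise(surface_c, ambient_c, allowed_rise_c=45.0):
    """Return the measured rise and whether it is within the assumed allowance."""
    rise = surface_c - ambient_c
    return rise, rise <= allowed_rise_c

# Hypothetical readings: 22 C room, 60 C transformer bell end
rise, ok = surface_temp_rise(60.0, 22.0)
print(f"rise = {rise:.0f} C, within allowance: {ok}")  # rise = 38 C, within allowance: True
```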
 

The tubes are biased at 1.25 V on the cathode (0.25 V less than what the Heathkit manual recommends). It was a bit hotter with them at 1.5 volts. Each tube has a 6 ohm resistor (I used 1 W wirewound), and those both connect to the same 12 ohm resistor, so 18 ohms total. I calculated that I'm running around 16 W per tube (35 mA or so).

Please correct me if I'm wrong (relatively new to tube amps), but I feel like with the tubes biased this way, I shouldn't be getting this much heat from the transformers.
The 12 ohm cathode resistor is shared between the two tubes, so it carries both cathode currents and, from each tube's point of view, behaves more like 24 ohms; add the 6 ohm resistor and each cathode sees about 30 ohms. At 1.25 V you are drawing about 42 mA per tube, not 35 mA.
 
Maths is not my strong suit, but I like Ohm's law.

I see it like this: imagine the two EL34 cathodes are strapped together. Then you have 12 ohms + 3 ohms (2 x 6 ohms in parallel) = 15 ohms.

The normal cathode-voltage setting is 1.5 volts, therefore the 'shared' current is 100 mA, or 50 mA per valve/tube. (That gets close to 24 watts of dissipation per valve at 490 V HT/B+.)

Set the cathodes to 1.25 volts and the shared current is approximately 84 mA, or 42 mA per valve. So about 20 watts each.
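The same arithmetic, written out as a little sketch (the 490 V B+ figure is the one quoted above, and cathode current is treated as plate current, which slightly overstates plate dissipation since some of it is screen current):

```python
# Cathode-bias arithmetic for the shared 12 ohm + per-tube 6 ohm network.
# B+ of 490 V is taken from the post above; equal currents in both tubes
# are assumed, and screen current is ignored.

R_SHARED = 12.0     # ohms, cathode resistor shared by both tubes
R_PER_TUBE = 6.0    # ohms, individual cathode resistor per tube
B_PLUS = 490.0      # volts, HT/B+ as quoted above

def per_tube_bias(v_cathode):
    """Per-tube cathode current and approximate idle dissipation."""
    # Seen from one tube: its own 6 ohms plus the 12 ohm resistor carrying
    # both tubes' current, i.e. an effective 6 + 2*12 = 30 ohms.
    r_effective = R_PER_TUBE + 2 * R_SHARED
    i_tube = v_cathode / r_effective
    return i_tube, i_tube * B_PLUS

for v in (1.5, 1.25):
    i, p = per_tube_bias(v)
    print(f"Vk = {v:.2f} V -> {i * 1000:.0f} mA per tube, ~{p:.1f} W each")
# Vk = 1.50 V -> 50 mA per tube, ~24.5 W each
# Vk = 1.25 V -> 42 mA per tube, ~20.4 W each
```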
 
On another thread I found a simple empirical test that works for any appliance with mains-operated motors or transformers, because copper resistance increases at a known rate with temperature. With the appliance cold and unplugged, turn on its power switch and measure the resistance between the contacts of the mains plug. This is roughly equal to the DC resistance of the transformer or motor coil when cold. Plug the appliance in and let it run for a while, then unplug it and measure the resistance again. This value is the "hot" winding resistance. As long as the hot resistance is no more than 20% higher than the cold resistance, you should be OK. That figure corresponds to a temperature increase of about 50 deg C in the winding, which is a pretty good representation of the hottest part of the transformer.
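The conversion from resistance ratio to temperature rise uses the standard temperature coefficient of copper; a minimal sketch (the resistance values are made up for illustration):

```python
# Copper-resistance method described above. 234.5 is the standard constant
# for copper's resistance-temperature relation; the starting temperature is
# assumed to be room ambient.

CU_CONST = 234.5   # degrees C

def winding_temp_rise(r_cold, r_hot, t_cold_c=20.0):
    """Average winding temperature rise implied by the cold/hot resistance ratio."""
    t_hot = (r_hot / r_cold) * (CU_CONST + t_cold_c) - CU_CONST
    return t_hot - t_cold_c

# Made-up example: primary measures 10 ohms cold and 12 ohms hot (+20%)
print(f"{winding_temp_rise(10.0, 12.0):.0f} C rise")   # ~51 C rise
```

A 20% increase indeed lands at roughly 50 °C of average winding rise, which is where that rule of thumb comes from.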
 
If it's so hot it falls through the bottom of a plastic enclosure, it's still fine lol



Older lighting ballasts can be tested with a match - that's ridiculously hot. It's usually the connections that fail on such items, rather than the winding. Or the core delaminates after years of baking the shellac, causing buzzing that wears away the shellac until the laminations contact and eddy currents apply too much load - hopefully taking the fuse.



These things can cook! Boiling point is nothing to them
 
If it's so hot it falls through the bottom of a plastic enclosure, it's still fine lol



Older lighting ballasts can be tested with a match - that's ridiculously hot. It's usually the connections that fail on such items, rather than the winding. Or the core delaminates after years of baking the shellac, causing buzzing that wears away the shellac until the laminations contact and eddy currents apply too much load - hopefully taking the fuse.



These things can cook! Boiling point is nothing to them

Recently I've built a PSU with fluorescent ballasts as chokes in a CLCLC configuration. The first ballast - the one closest to the rectifiers - is not something you would like to touch! Initially I thought I'd watch it and see how long it lasted before replacing it, but more and more I'm convinced that won't be needed.
 
On another thread I found a simple empirical test that works for any appliance with mains-operated motors or transformers, because copper resistance increases at a known rate with temperature. With the appliance cold and unplugged, turn on its power switch and measure the resistance between the contacts of the mains plug. This is roughly equal to the DC resistance of the transformer or motor coil when cold. Plug the appliance in and let it run for a while, then unplug it and measure the resistance again. This value is the "hot" winding resistance. As long as the hot resistance is no more than 20% higher than the cold resistance, you should be OK. That figure corresponds to a temperature increase of about 50 deg C in the winding, which is a pretty good representation of the hottest part of the transformer.

Thanks, that's pretty useful to know. I have a feeling that the CL-90 thermistors (2x in parallel, since I had them on hand) might affect that method, though.
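If those thermistors sit in series with the primary (the usual inrush-limiting arrangement), they would indeed add to what the plug measurement sees, and their resistance falls as they warm up, which skews the ratio. One possible workaround, sketched below under that assumption, is to measure or estimate the thermistor resistance separately for the cold and hot cases (e.g. from the datasheet at the operating current) and subtract it before taking the ratio; the numbers are placeholders, not CL-90 specs.

```python
# Copper-resistance method again, but correcting the plug-to-plug readings
# for series NTC inrush limiters. All resistance values are placeholders -
# measure your own parts or use the datasheet; they are NOT CL-90 specs.

CU_CONST = 234.5   # degrees C, copper resistance-temperature constant

def winding_rise_with_ntc(r_plug_cold, r_plug_hot,
                          r_ntc_cold, r_ntc_hot, t_cold_c=20.0):
    """Winding temperature rise after removing the series thermistor resistance."""
    r_cold = r_plug_cold - r_ntc_cold   # primary DCR, cold
    r_hot = r_plug_hot - r_ntc_hot      # primary DCR, hot
    t_hot = (r_hot / r_cold) * (CU_CONST + t_cold_c) - CU_CONST
    return t_hot - t_cold_c

# Placeholder example: NTCs add ~5 ohms cold and ~1 ohm hot, and the plug
# reads 15 ohms cold, 13 ohms hot - the raw reading actually drops, which
# is exactly why the uncorrected ratio would mislead here.
print(f"{winding_rise_with_ntc(15.0, 13.0, 5.0, 1.0):.0f} C rise")   # ~51 C
```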
 
Recently I've built a PSU with fluorescent ballasts as chokes in a CLCLC configuration. The first ballast - the one closest to the rectifiers - is not something you would like to touch! Initially I thought I'd watch it and see how long it lasted before replacing it, but more and more I'm convinced that won't be needed.

I'm intrigued by the prospect of using fluorescent light ballasts as chokes. Tell me more about this. Would they work for a standard tube amp supply (say 425V @ a few hundred mA)?
 
Very generally, tubes plus internal losses heat up transformers, which heat up the chassis and the components underneath. Tubes can also heat up externally mounted electrolytic capacitors, which are perhaps the components most sensitive to heat.

Typically the glass envelope of a power tube can reach 230 °C / 446 °F. Transformers can take >100 °C / 212 °F quite well.

[There is something about power transformers, though, with which I once had a bad experience. Particularly with thick stacks, the through-bolts need to be electrically insulated, particularly at their ends (insulating washers). I once saw such bolts rapidly reach several hundred °C! By that time I had assembled several dozen power transformers over the years; no similar experience before!]
 