• WARNING: Tube/Valve amplifiers use potentially LETHAL HIGH VOLTAGES.
    Building, troubleshooting and testing of these amplifiers should only be
    performed by someone who is thoroughly familiar with
    the safety precautions around high voltages.

Testing emission without risk

Testing the emission of tubes by the traditional method of connecting all grids to the anode and applying an accelerating voltage, typically 30 VDC or about 30V AC in cheaper tube testers, subjects the tube to a considerable overload and will quickly ruin the emissive surface. For that reason, tube testers usually had a button you had to hold down to do the emission test.

Today, good digital multimeters are readily available at a cheap price, so we can do accurate measurements that were not practical in the pre-transistor days.

Loss of emission, i.e. reduced cathode electron emissivity as the tube ages, generally involves subtle mechanical and chemical changes. Most of these changes can be expected to increase thermal emissivity. I got to thinking that this means tube filaments will draw higher than normal current at low voltages: the electrical resistance of metals increases with temperature, so a heater that loses heat faster runs cooler, has lower resistance, and draws more current.

I have a large number of tubes of various sorts, ranging from good late-production NOS to old worn-out tubes taken from old radios and TVs.

I have found in testing that the heater current drawn at 25% of rated voltage is consistently higher for low emission tubes, even though the current at rated voltage is normal. This test was done with all grids and the anode left open. Typically, for a tube having emission 50% low (which in most circuits will still work normally), the ratio of heater current at 25% voltage to that at 100% voltage is about 8% higher.

So it seems that tubes can be screened for low emission by checking this ratio, which is completely non-destructive.
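
To make the screening arithmetic concrete, here is a minimal sketch. The 6.3 V heater, the 1.575 V (25%) test point and the ~8% threshold are the figures from above; the current values are invented purely for illustration:

    # Screening sketch: compare the heater-current ratio (25% of rated voltage
    # vs 100%) of a tube under test against a known-good tube of the same type.
    # The ~8% threshold is the figure reported above; the current values below
    # are invented purely to show the arithmetic.

    def heater_ratio(i_low_v, i_full_v):
        """Heater current at 25% rated voltage divided by current at 100%."""
        return i_low_v / i_full_v

    good_ratio = heater_ratio(i_low_v=0.120, i_full_v=0.300)  # hypothetical amps
    test_ratio = heater_ratio(i_low_v=0.131, i_full_v=0.300)  # hypothetical amps

    excess_percent = (test_ratio / good_ratio - 1.0) * 100.0
    print(f"Ratio excess over the good reference: {excess_percent:.1f}%")
    if excess_percent >= 8.0:
        print("Suspect low emission (roughly 50% down, per the figures above).")
    else:
        print("Heater-current ratio looks normal.")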

Has anyone come across this method before, or have any thoughts on it?

It is not mentioned in any books on tubes and tube testing that I have access to.
 
Hi,
It seems to me that emission can be estimated safely by using a pulse method to measure the anode current, as is done in the "uTracer" tester. Pulse testing is safe even for ranges well above the tube's maximum ratings for plate dissipation and anode current, up into saturation. Furthermore, these measurements can be made at the real anode voltages, not reduced ones.
 
I can't see how the increase in filament current at low voltages you mention equates to low emission in an indirectly heated tube. I think you would have to measure the tube periodically over the course of its service life to determine whether there is an actual correlation, or whether the tube always exhibited those characteristics. I can think of a variety of other explanations.

Emission IMO is not usually a very useful parameter to test, and on a proper tube tester limited emission should show up in the course of testing transconductance.

I own a uTracer and bad / emission limited tubes are usually obvious as they do not perform comparably to a known good tube.
 
I've been using comparative emission to determine how much life is left in a tube.

Test it at nominal heater voltage and note the Gm. Reduce the heater voltage by 10% and check it again. Gm varies very little in a tube in new condition. It drops significantly in heavily used tubes.
 
Wow, an actual experiment with a correlation detected.
Thanks for doing this. Perhaps someone else can replicate the experiment.
I find this quite interesting too, and that is not something that I would dismiss lightly.
Unfortunately, I only have very few directly heated tubes in my stock, so I would be unable to contribute to the statistics.
The idea does look plausible: after all, electrons are significant heat carriers in a number of materials (except here it should work in the opposite direction, but things are often more subtle than they look at first sight). Worth investigating, for sure.
 
Most tube testers I own have a "life test" on them; my B&K 700, for example. When the switch is activated (from what I understand) it basically drops the filament voltage down a bit. If the meter reading drops quickly with the lower filament voltage, the tube is old. If it holds steady, it's good.
 
Err... Testing emission in the way suggested by gsmok can cause emission layer burns and physical removal of emission layer material in oxide cathodes, though it is acceptable for tungsten cathodes (only used in high-power transmitter tubes).

The uTracer is intended to trace curves for the purpose of matching tubes and for obtaining detailed information to assist in designing tube application circuits. Pulse testing has long been accepted as the way to do this - it is how the tube manufacturers compiled curves. While you CAN program it to test emission with high voltages, this is not a good idea. Pulse testing can certainly keep anode and screen dissipation within what the anode and screen can handle, but it doesn't stop particle stress on the cathode, nor does it eliminate field emission, which is harmful to oxide cathodes.
 
I have found in testing that the heater current drawn at 25% of rated voltage is consistently higher for low emission tubes, even though the current at rated voltage is normal. This test was done with all grids and the anode left open. Typically, for a tube having emission 50% low (which in most circuits will still work normally), the ratio of heater current at 25% voltage to that at 100% voltage is about 8% higher.

Just for clarification, you are describing the application of (e.g.) 1.57V to the heater and measuring the current to derive a resistance R1. And then applying 6.3V and deriving a resistance R2. And then finding that:

R1/R2 for a 50% low emission tube is ~8% higher than R1/R2 for a new tube?

Were these tests on a variety of power pentodes?

How were you measuring emission level for each test?

Were you able to plot results (measured emission versus R1/R2 ratio) for a statistically significant population (i.e. >10) of the same type of valve - that would be interesting to see.

Ciao, Tim
 
Tim, you've got the idea spot on. I didn't actually calculate a resistance though. It is sufficient to calculate I1/I2, where I1 is the current drawn from 1.575V and I2 is the current at 6.3V, for a 6.3V rated tube. For low emission tubes, the I1/I2 value is about 8% higher in tubes with emission about 50% low.

Emission was tested in the standard manner described in the Radiotron Designer's Handbook, viz strap all grids and anode together, and apply 30 VDC. I actually tested emission at 120%, 100%, and 80% of rated heater voltage. As others have noted, weak tubes may pass at 120% and 100% heater voltage, and fail at 80%. Perversely, some pass at 80% and 100% and fail at 120%.

I tested only RF pentodes (so far). I only have RF pentodes in sufficient numbers from the various major USA, UK, and Australian manufacturers for statistical validity. I have sufficient power tetrodes and power pentodes on hand, but only from a few manufacturers.

Note that emission testing has some value with RF pentodes and low-level triodes and pentodes for audio, as it serves as a life test. Such tubes are usually operated with a considerable margin of emission, so in-circuit performance is not affected by low emission unless it's really bad. For audio power tetrodes and pentodes, a much better test is to measure the audio power output, as many textbooks (e.g. Radiotron Designer's Handbook, the RCA book) recommend. If a power tube puts out full power without a lot of distortion, there can't be anything wrong with it.

Once I have tried this for tube types other than RF pentodes, I may post all the numerical data if there is interest. But in the meantime I wanted to see if others had any thoughts pro or con. Different cathode alloys and variations in the oxide mix have been used for different tube types, so it is possible my results are only valid for certain tube types.
 
I assume you meant "....correlation between the electron emissivity of a surface and its thermal emissivity?" Thermoelectric properties - conversion of a temperature gradient into electric potential and vice versa - are something quite different from electron emissivity, though the underlying physics is related. And thermal emissivity covers all frequencies, not just infrared and visible.

That's a very good question of critical importance you've asked, DF96. A proper answer would be quite long and full of physics. I'll give a very short summary. If it doesn't satisfy, please say so, as I can give more detail.

It's not actually a question of thermal emissivity increasing as electron emissivity goes down. It's a question of how thermal emissivity changes with temperature. For typical oxide-coated cathodes, thermal emissivity changes smoothly from about 0.1 at room temperature to about 0.3 at design operating temperature (1050 K) - most things increase thermal emissivity as temperature increases. Most changes that cause electron emissivity to go down over the life of a tube leave thermal emissivity little changed at 1050 K, but increased at low temperatures.

Electron emissivity is given approximately by the Richardson-Dushman equation. I won't quote the equation - you can look it up, and there seems to be no way to write math equations in this forum. For any cathode material, there exist two constants you plug into the equation, along with temperature, to find the electron emissivity of that material at that temperature. If you do look up the RD equation, bear in mind that most books intended for training/education give only the form for a pure metal cathode. The correct equation for an oxide-coated cathode is slightly different.
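
Since this forum won't render math, here is a rough numerical sketch instead, using the textbook pure-metal form J = A x T^2 x exp(-W/kT) with tungsten's commonly quoted work function of about 4.5 eV (illustrative only - as noted, the oxide-cathode form is slightly different, so don't apply these numbers to receiving tubes directly):

    import math

    # Rough numerical sketch of the textbook Richardson-Dushman equation for a
    # PURE METAL cathode: J = A * T^2 * exp(-W / (k*T)).
    # Illustrative only: the oxide-coated cathode form is slightly different,
    # and the tungsten work function below is a commonly quoted textbook figure.

    A_RICHARDSON = 120.0    # A/(cm^2 K^2), approximate universal constant
    K_BOLTZMANN = 8.617e-5  # Boltzmann constant, eV/K

    def saturation_current_density(temperature_k, work_function_ev):
        return A_RICHARDSON * temperature_k**2 * math.exp(
            -work_function_ev / (K_BOLTZMANN * temperature_k))

    # Bright-emitter example: tungsten filament, work function ~4.5 eV.
    for temperature_k in (2400.0, 2600.0):
        j = saturation_current_density(temperature_k, 4.5)
        print(f"T = {temperature_k:.0f} K: J ~ {j:.2f} A/cm^2")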

There are actually two reasons for electrons leaving a hot surface: 1) thermionic emission, as per the RD equation, and 2) field emission, as per the Schottky equation (ignoring things not relevant to receiving tubes, of course, like secondary emission and light-stimulated emission). However, in receiving tubes and ordinary oxide-cathode power tubes, field emission plays only a very minor part, especially if the tube is not overloaded.

There are three main causes of loss of electron thermionic emissivity as an oxide-coated cathode tube accumulates operating hours: a) gradual depletion of coating material, b) cathode poisoning by gas, and c) arc-overs, if the tube is very gassy or overloaded.

a) As cathode material is depleted (essentially, turned into gases, some of which coat other electrodes, and some of which are absorbed by the gettering on the glass), the cathode coating, which is slightly rough to begin with, is left porous. You may remember from high school science that rough surfaces have greater thermal emissivity than smooth surfaces. Porous surfaces have even greater thermal emissivity because - in effect - they comprise lots of little "cavity radiators".

So cause (a) increases thermal emissivity, naturally more noticeably when temperature is low. This temperature-dependent change is enhanced by the shift of significant black-body/grey-body radiation energy to shorter wavelengths as temperature rises.
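
To put rough numbers on that wavelength shift, Wien's displacement law (peak wavelength ~ 2898 um.K / T) is a handy sketch:

    # Wien's displacement law: the blackbody peak wavelength is roughly
    # lambda_max = b / T with b ~ 2898 um*K, illustrating the shift above.
    WIEN_B_UM_K = 2898.0

    def peak_wavelength_um(temperature_k):
        return WIEN_B_UM_K / temperature_k

    for temperature_k in (300.0, 1050.0):  # room temp vs oxide-cathode temp
        print(f"T = {temperature_k:.0f} K: peak emission near "
              f"{peak_wavelength_um(temperature_k):.1f} um")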

b) Cathode poisoning: material outgassed from the anode, grids, and other parts is ionised by collisions with electrons, i.e. turned into positively charged ions, and is accelerated to the (negative) cathode, causing chemical reactions in the emissive layer. Since the cathode oxide layer starts off with quite low thermal emissivity (it's white or light grey), you can naturally expect that such chemical changes have little "option" but to darken it.

c) Arc-overs do what arcs generally do to surfaces - leave little black or brown spots (burns that are thermionically inactive) on the cathode.

If my understanding of the physics is correct (and it may not be), cathode changes (a) and (b) may actually improve electron emissivity in the field emission mode - but as I said, field emission plays only a very minor role in ordinary tubes. In fact, if the tube is not a power tube working at its emission limits, field electron emission is just about negligible, even with a weak cathode.

I hope my explanation satisfies you. To answer the question properly would take many pages of advanced physics. As with any simplification, some aspects can get distorted.

Keit
 
I forgot another way that cathode oxide layers get depleted and made porous: if anode or screen voltages are relatively high, say several hundred volts, ion bombardment may physically knock oxide material out of the cathode. But this mechanism should be unimportant with the RF pentodes I tested.
 
You're almost right.

Porosity increases thermal radiation, but not by as much as you would get from a plane surface with the same total area as the porous surface once the surface inside the pores is added in. That would clearly be wrong, as otherwise you could, by drilling many deep tiny holes in an object, end up with more heat radiated than the object receives - a perpetual motion machine!

The relevant effect is "cavity radiation", well known in thermal physics. Since two objects near each other must come to temperature equilibrium after starting with one hotter than the other, a principle of reciprocity applies. A cavity in a low-absorbing substance must absorb incoming radiation almost as though the substance had near-perfect absorption, as any incoming heat flux/ray can bounce around multiple times inside the cavity, losing a bit of heat to the object at each bounce, until it's just about 100% absorbed. By the principle of reciprocity, a surface that absorbs well must emit equally well, so a surface full of cavities or pores is a good thermal emitter.
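
As a toy illustration of the multiple-bounce argument (the single-bounce absorptivity of 0.1 below is just an assumed figure for a light-coloured, poorly absorbing surface):

    # Toy model of cavity absorption: a ray entering a pore loses a fraction
    # 'a' of its remaining energy at each wall contact. The single-bounce
    # absorptivity of 0.1 is an assumed illustrative value.
    a = 0.1
    for bounces in (1, 5, 10, 20, 50):
        absorbed = 1.0 - (1.0 - a) ** bounces
        print(f"after {bounces:2d} bounces: {absorbed:.0%} of the incoming energy absorbed")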

This does not apply to electron emissivity though, for reasons I don't fully understand, though it happens that rough surfaces are somewhat better electron emitters than smooth ones, for different reasons. Oxide cathode surfaces are rough, but they aren't made rough to improve emission. Roughness inherently occurs due to the manufacturing process, which is designed so that chemicals stable in air at room temperature can be converted to a thermionically active form during evacuation and activation by special heating.

Don't forget that cathodes also lose electron emissivity and gain thermal emissivity due to cathode poisoning and minute arc burns, especially if the tube is overloaded.

Also don't forget that anything that increases thermal emissivity is more effective at low temperatures, where emissivity is low (so there is more "room" for change). That's why I tested at 25% heater voltage. The increase in current at rated heater voltage would be so small you could not tell it apart from normal manufacturing variation.
I don't know the history of each individual tube in my collection, so the possibility of overloads cannot be eliminated.
 
Keit said:
Thermoelectric properties are something quite different
Yes, I meant 'thermionic'. As a physicist I ought to know better!

For typical oxide-coated cathodes, thermal emissivity changes smoothly from about 0.1 at room temperature to about 0.3 at design operating temperature (1050 K) - most things increase thermal emissivity as temperature increases.
I didn't know that. I thought most materials have a thermal emissivity around 0.9-1 whatever the temperature.

a) As cathode material is depleted (essentially, turned into gases, some of which coat other electrodes, and some of which are absorbed by the gettering on the glass), the cathode coating, which is slightly rough to begin with, is left porous. You may remember from high school science that rough surfaces have greater thermal emissivity than smooth surfaces. Porous surfaces have even greater thermal emissivity because - in effect - they comprise lots of little "cavity radiators".
OK, I understand that.

Since the cathode oxide layer starts off with quite low thermal emissivity (it's white or light grey), you can naturally expect that such chemical changes have little "option" but to darken it.
The fact that a surface appears to be white in visible light does not mean that it is 'white' (i.e. low in emissivity) in the infra-red region (where most of the radiation takes place at such low temperatures). I don't expect that a chemical change will darken a surface - it is just as likely to lighten it or leave it almost unchanged.

OK, I am now less sceptical than I was but I am not wholly convinced. It may be that your experiments have found an effect but your explanation needs further work.
 
Re thermal emissivity values:

If you are a physicist you should know that a perfect emitter (a perfect black body) has an emissivity of 1 - that means it radiates 100% of what the Stefan-Boltzmann law says it can. Carbon is a near-perfect black body with an emissivity around 0.95 or better at all temperatures up to vaporisation. A surface that does not radiate at all has an emissivity of zero. Polished metals approach this, with emissivities down to 0.02 or so. Mill finish aluminium is about 0.1 as I recall.

It's very common that materials show an increase in emissivity as temperature rises. But this is over the temperatures encountered in vacuum tubes, up to 1200 K or more. You won't generally see much change over the range of temperatures humans can cope with, nor at normal engineering temperatures (say 0 to 100 C).
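
For a feel for the magnitudes involved, the radiated flux is emissivity x sigma x T^4 (Stefan-Boltzmann); a quick sketch using the illustrative emissivity figures quoted earlier in the thread:

    # Radiated flux per the Stefan-Boltzmann law: q = emissivity * sigma * T^4.
    # Emissivity/temperature pairings use figures quoted earlier in the thread;
    # they are illustrative, not measured values for any particular cathode.
    SIGMA = 5.670e-8   # W/(m^2 K^4)

    def radiated_flux(emissivity, temperature_k):
        return emissivity * SIGMA * temperature_k**4

    print(f"oxide coating, 300 K,  e=0.10: {radiated_flux(0.10, 300.0):8.1f} W/m^2")
    print(f"oxide coating, 1050 K, e=0.30: {radiated_flux(0.30, 1050.0):8.1f} W/m^2")
    print(f"carbon,        1050 K, e=0.95: {radiated_flux(0.95, 1050.0):8.1f} W/m^2")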

Re whether "white" means "white" at frequencies/wavelengths other than visible: To a physicist or engineer, "white" means white at whatever frequencies are of interest in the subject at hand. It's the same as electronic engineers calling shot noise "white" when in most circuits the noise ranges over frequencies far, far below the frequency range of visible light.

However, an arc burn results in a visibly dark spot. Even if the spot has not changed for non-visible wavelengths, it still means the spot's emissivity has increased.

It's common sense that if a surface has low emissivity, any change is likely to increase it. Take a piece of white bond paper (emissivity ~0.15) - whatever chemicals you hit it with, it will get darker. It would be darn hard to make white bond paper any whiter. Conversely, if a surface is dark, i.e. has high emissivity, any chemical change is likely to reduce emissivity.

Re whether my explanation needs further work: As I said in my answer for your question, I have given a simplified explanation, not knowing what you required, nor for that matter knowing what your background is. As you seem to be unfamiliar with the terminology and parameters of grey body radiators, cavity radiators, perhaps I pitched my answer about right, or maybe should have simplified it even more?

I make no claim to be an expert on this matter though. I could not find this method of testing tubes mentioned in any book - that indicates there may be something wrong with it - otherwise someone else would have written it up. That's why I started this thread - to see if anyone knows any pros or cons.

Incidentally, I've started to look at how it may work with directly heated output tubes and rectifiers. I've spotted some physical mechanisms possible in directly heated tubes that have an opposing effect, reducing current at low filament voltages. I'm not ready to talk about them yet, I need to work through the math and do some computer simulations. There are, however, much simpler and better ways to test directly heated tubes.
 
Keit said:
Re whether my explanation needs further work: As I said in my answer for your question, I have given a simplified explanation, not knowing what you required, nor for that matter knowing what your background is. As you seem to be unfamiliar with the terminology and parameters of grey body radiators, cavity radiators, perhaps I pitched my answer about right, or maybe should have simplified it even more?
I was an undergraduate physicist 40 years ago, and didn't do much on optics as I preferred theoretical and elementary particle physics - which I subsequently did some postgrad work on. However, I do know what a black body is and I know why a cavity approximates one.

It's very common that materials show an increase in emissivity as temperature rises.
I'm not sure whether you are saying that this is common knowledge (I'm not sure that it is) or that many commonly encountered materials show this effect. As most materials, other than untarnished metals, have high emissivity at room temperature there is only limited scope for an increase at higher temperatures. I guess surface effects could play a role, as cavity-like surface features would have greater effect as the peak emission goes up in frequency with temperature. This might be better described as 'emission moving into the wavelength region where the surface has higher emissivity' rather than an increase in emissivity - a bit of Googling suggests to me that some people seem to confuse these two effects.

There are genuine experts on many aspects of physical science on this forum, so don't oversimplify your explanations. By all means add a simpler version too, for the rest of us.
 