tube testers: different gM standards for same tube?

Status
This old topic is closed. If you want to reopen this topic, contact a moderator using the "Report Post" button.
I have two tube testers: a Hickok 800 and a Hickok 752. Both measure
gm (transconductance). For a 12AU7 tube, the 800's target test value
on the roll chart is 2200 uMhos for each triode. For the same tube,
the 752 roll chart lists 675 uMhos in the tester's "X2" position,
i.e. 1350 uMhos for each triode.

That's roughly a 39% (or 63%) difference, depending on which value you
take as the base. This seems strange to me. Why should a tube's target
value be different on different testers (of the same mfg!), when the
unit is a standard one (uMho)? I can understand factors (like
calibration) that would make the *measurements* differ, but not the
*standards*.

Assuming the testers are actually calibrated to the same values and
are accurate (I know, a big if), this would mean that a tube testing
above standard on one tester could test below standard on the other.
A gap that size is a big window. How do people generally deal with
this?
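For what it's worth, here's the arithmetic behind that percentage, using the two roll-chart values quoted above (the gap looks bigger or smaller depending on which tester's value you divide by):

```python
# Roll-chart gm targets for a 12AU7, per triode (from the post above).
hi800 = 2200  # uMhos, Hickok 800
hi752 = 1350  # uMhos, Hickok 752 ("X2" position: 675 x 2)

gap = hi800 - hi752
print(gap / hi800 * 100)  # gap as a % of the 800's target (~39%)
print(gap / hi752 * 100)  # gap as a % of the 752's target (~63%)
```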

- Les
 
Just had a discussion of this elsewhere. A 9V battery has 9 volts coming out of it; that's a clear parameter to measure when testing the battery. A tube data sheet, on the other hand, shows a whole family of curves, and that is the point Defiant makes: there is no single point on all those curves that defines the tube.

How fast does a car go when the motor is turning 2000 RPM? It all depends on what gear you are in. If two tube testers use different voltages and different loads, then the results will also be different. So each tester is really inferring: if you get this reading on this tester, your tube is probably good.
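You can sketch that operating-point dependence numerically. A minimal illustration, assuming an idealized 3/2-power-law triode; the constant K and the two operating points below are made up for illustration (they are not real Hickok test conditions), and mu = 17 is just in the 12AU7's ballpark:

```python
# Idealized triode: Ip = K * (Vg + Vp/mu)^1.5 (3/2-power law).
K = 0.0006   # perveance-like constant, A/V^1.5 (made up for this sketch)
MU = 17      # amplification factor, roughly a 12AU7

def ip(vg, vp):
    """Plate current (A) at grid voltage vg and plate voltage vp."""
    v = vg + vp / MU
    return K * v ** 1.5 if v > 0 else 0.0

def gm(vg, vp, dv=0.01):
    """Transconductance (mhos): slope of Ip vs. Vg at fixed Vp."""
    return (ip(vg + dv, vp) - ip(vg - dv, vp)) / (2 * dv)

# The same "tube" measured at two different operating points:
for vg, vp in [(-8.5, 250), (-4.0, 100)]:
    print(f"Vg={vg} V, Vp={vp} V -> gm = {gm(vg, vp) * 1e6:.0f} uMhos")
```

Same model, same constants, two very different gm readings. That's the whole point: a tester's "standard" gm target only means something relative to the voltages and load that particular tester applies.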
 