I am reading through the calibration procedure for my Hickok tube tester. The manual probably dates to 1947 or so.
In the manual, there is a section where they list voltages to check, but it is listed like this:
something something blah blah meter should read:
A. 150V +/- 2V on 1000 ohms/volt meter
B. 190V +/- 2V on 20,000 ohms/volt meter
What do they mean by ohms/volt meter? I'm sure test equipment in 1947 was much different than the fancy digital stuff we have today. Looking at the spec for my Fluke 189, it states an input impedance of 10 megohms, but I don't know if that's per volt or how those specs differ.
I am assuming the voltage difference is due to loading by the lower-resistance meter. But what should I expect when using my 189 to take the measurement?
Ohms per volt is a sensitivity spec for analog meters - it determines how heavily the meter loads the circuit being tested. Presumably the tube tester's supply cannot source a lot of current. Different movements have different ohms/volt sensitivities. [edit: your digital meter won't load the circuit much at all - you may see a reading closer to 200V]
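To make the spec concrete, here's a minimal sketch of the arithmetic (Python; the 200V range is an assumption, since the manual's actual meter range isn't quoted):

```python
# Ohms/volt (sensitivity) x full-scale range = the analog meter's
# total input resistance on that range.
def meter_resistance(ohms_per_volt, full_scale_volts):
    return ohms_per_volt * full_scale_volts

print(meter_resistance(1_000, 200))   # 200000 ohms (200k) for the 1000 ohms/volt meter
print(meter_resistance(20_000, 200))  # 4000000 ohms (4 meg) for the 20,000 ohms/volt meter
```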
http://www.google.com/search?hl=en&q=ohms/volt&btnG=Google+Search
I'm sure you can figure it out from here....
You should see around 193 volts on the DVM.
The loading with a 1000 ohm/volt meter on a 200V scale will be about 0.75 mA (150V / 200,000 ohms); a 20,000 ohm/volt meter on a 200V scale draws about 0.047 mA (190V / 4 meg). From those two points you can find the internal resistance of the power supply. A DVM will apply effectively no load to the power supply.
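To put numbers on that (a sketch; it assumes the 200V range above, so 200k and 4 meg loads, and solves the usual Thevenin divider V = Voc * R / (R + Rs) for the two published readings):

```python
# Two readings under two known loads pin down the supply's Thevenin
# equivalent: V = Voc * R_load / (R_load + Rs).
r1, v1 = 200e3, 150.0  # 1000 ohms/volt meter, 200 V range
r2, v2 = 4e6, 190.0    # 20,000 ohms/volt meter, 200 V range

# Solving the two divider equations simultaneously:
rs = (v2 - v1) / (v1 / r1 - v2 / r2)  # source resistance, ~56.9k
voc = v1 * (r1 + rs) / r1             # open-circuit voltage, ~192.7 V

# A 10 meg DVM barely loads the supply, so it reads nearly the
# open-circuit value -- hence the ~193 V figure mentioned above.
v_dvm = voc * 10e6 / (10e6 + rs)      # ~191.6 V
print(round(rs), round(voc, 1), round(v_dvm, 1))
```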
Since they specified the voltage under load, I'd be tempted to strap a 4 meg high-voltage resistor across the power supply as a load and measure that.
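For what it's worth, a quick sanity check of that idea (a sketch; remember the DVM's own 10 meg input sits in parallel with the strapped 4 meg, and the Thevenin values are the estimates from the sketch above):

```python
rs, voc = 56.9e3, 192.7  # Thevenin estimates from the sketch above

# The strapped 4 meg resistor in parallel with the DVM's 10 meg input:
r_load = (4e6 * 10e6) / (4e6 + 10e6)   # ~2.86 meg effective load
v_meas = voc * r_load / (r_load + rs)  # ~189 V, close to the 190 V spec point
print(round(v_meas, 1))
```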