diyAudio - Tubes / Valves - mA meter drags the circuit down

Burnedfingers 26th February 2012 10:34 AM

mA meter drags the circuit down
I have a 0-1 mA DC milliammeter made by the Beede Electrical Instrument Company Inc. that I wanted to use as a test meter for tube bias. I measured the meter's resistance at roughly 45 ohms. I thought this might be too low and that I needed some kind of circuit so that when I used the meter I would get a correct reading without pulling the circuit down. I used a TL082 op amp to drive the meter, thinking it would raise the impedance of the meter as seen by the circuit. It didn't work as I wanted: measuring tube bias across a 10 ohm resistor from cathode to ground, a standard VOM reads 0.350 V across the resistor, but with my meter and circuit connected the reading gets dragged down to about 0.300 VDC across the 10 ohm resistor.

Any ideas?

SY 26th February 2012 10:52 AM

Are you trying to insert the meter directly in the cathode circuit or put it across a sensing resistor that's already in the cathode circuit?

Burnedfingers 26th February 2012 11:48 AM


Thanks for the reply.

I was putting the meter across the 10 ohm resistor as I do the VOM meter.

M Gregg 26th February 2012 12:11 PM

If you put the meter across the 10 ohm resistor you have two resistances in parallel, so yes, it will pull the reading down.
You need either a high-resistance multiplier so you read the voltage, or a small-value shunt in series with the cathode resistor.
This may help.

M. Gregg

DF96 26th February 2012 12:20 PM

I have an idea. You show us the circuit, so we know exactly what you did with a TL082. Then we will think about where things went wrong. The alternative may require us to read your mind; we are good, but not that good!

Burnedfingers 26th February 2012 12:40 PM

1 Attachment(s)
Meter circuit

DF96 26th February 2012 12:46 PM

That looks OK, but is that what you built? Check your wiring, and the opamp pinout, very carefully.

SY 26th February 2012 12:49 PM

OK, so what you apparently want to do is convert your milliammeter into a voltmeter. You can do that with a simple series resistor (see the Rod Elliott article for the calculations); no opamp needed. Your cathode resistor is 10R, so the voltage across it (0.350 V) means you have 35 mA. Where do you want that on the milliammeter? A convenient spot is 0.35 mA (but you can make it anywhere you like). The total series resistance in the meter circuit is R = 0.35 V / 0.00035 A = 1k. You already have 45R in the movement, so the series resistor needs to be 955R.

The 1k total meter resistance shunted across the 10R sense resistor will perturb the reading a bit, but not much.
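SY's numbers can be checked in a few lines. This is just a sketch of the arithmetic from the posts above; the function and variable names are my own, not anything from the thread.

```python
# Turning a 1 mA meter movement into a voltmeter with a series
# (multiplier) resistor, then estimating how much the finished
# voltmeter loads the 10 ohm cathode sense resistor.

R_METER = 45.0   # ohms, measured resistance of the 0-1 mA movement
I_FS = 1e-3      # amps, full-scale meter current

def multiplier_for(v_full_scale):
    """Series resistance needed: total R for full-scale deflection,
    minus the resistance the movement already has."""
    r_total = v_full_scale / I_FS
    return r_total - R_METER

# SY's choice: 0.35 V across the sense resistor reads 0.35 mA,
# i.e. 1 V gives full-scale deflection.
r_series = multiplier_for(1.0)   # 955 ohms

def parallel(a, b):
    """Equivalent resistance of two resistors in parallel."""
    return a * b / (a + b)

# Loading: the 1 k voltmeter sits across the 10 ohm sense resistor.
r_sense = 10.0
r_loaded = parallel(r_sense, R_METER + r_series)      # about 9.90 ohms
error_pct = 100.0 * (r_sense - r_loaded) / r_sense    # about 1 %
```

So the meter shifts the effective sense resistance from 10R to roughly 9.9R, about a 1% reading error, which matches SY's "a bit, but not much".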

artosalo 26th February 2012 12:50 PM

Just put 955 ohms (1k minus the internal resistance of your meter) in series with the meter.
That forms a voltmeter with a maximum reading of 1 V.
Now you can place this tester directly across the 10 ohm cathode resistor.
A full reading of 1 V represents 100 mA.
