VTVM frequency-dependent drift

I am troubleshooting frequency-dependent measurement drift in the audio range on my newly acquired CTR HRV-260 VTVM.
The VTVM is connected to a DAC that outputs a 20Hz-20kHz sweep at 245mV (-10dBVU) and 775mV (0dB). I have confirmed with another instrument that the output is flat. On the 0.1V and 1V ranges the measurement is accurate and the needle stays steady across the whole sweep. On the 3V range and above, however, the same signal reads off by a few dB, and as the sweep frequency rises, the needle climbs with it by 5-6dB. This does not happen on measurement ranges below 3V.

The device was cleaned and calibrated. I am attaching the schematic and kindly ask for advice where to look for a fault. Thank you in advance.
 


I would first confirm the flatness of the DAC output at various signal levels using an oscilloscope.
Then again, looking at your attached schematic, the input attenuator changes frequency compensation exactly where you noticed the issue (between the 0.02V-1V ranges and the 3V-300V ranges). I would check the values of the compensation network and adjust it using a 1kHz square wave.
And as always, reseat the valves in their sockets 🙂
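The square-wave trick works because a resistive divider with parallel capacitance is flat only when the top and bottom RC products match. A quick numeric sketch of that condition (the component values are placeholders for illustration, not the HRV-260's actual parts):

```python
import cmath

def divider_gain(R1, C1, R2, C2, f):
    """Gain magnitude of a divider with R1||C1 on top, R2||C2 on bottom."""
    w = 2 * cmath.pi * f
    Z1 = R1 / (1 + 1j * w * R1 * C1)   # R1 in parallel with C1
    Z2 = R2 / (1 + 1j * w * R2 * C2)   # R2 in parallel with C2
    return abs(Z2 / (Z1 + Z2))

# Hypothetical values: 10M over 33.33k, 2000pF across the lower leg.
R1, R2, C2 = 10e6, 33.33e3, 2000e-12
C1 = R2 * C2 / R1   # compensation condition: R1*C1 == R2*C2

for f in (50, 1e3, 10e3, 100e3):
    print(f"{f:>8.0f} Hz: gain = {divider_gain(R1, C1, R2, C2, f):.6f}")
```

With matched RC products the gain comes out identical at every frequency. When the trimmer is off, a 1kHz square wave shows overshoot on the edges (top capacitance too large) or rounding (too small), and you turn the trimmer until the edges are square, exactly like compensating a scope probe.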

George
 
George, thank you for your quick reply. I did clean the device, including the tube pins and sockets. Flat DAC output was confirmed with another VTVM and also with an oscilloscope.
I understand the frequency compensation is adjusted with the 30pF variable capacitor. Could you please explain how to adjust the compensation using a 1kHz square wave? Thanks again.
 
Hi Willi
Based on what you have reported, before going further with the adjustment (it's just like trimming the compensation of an oscilloscope probe), I would first measure the components on the input side of the attenuator: 10MOhm, 30MOhm and 2000pF (lift one leg of the capacitor from the circuit and remove the first tube before measuring).
If you can do this, please report back.
5% tolerance is to be expected. Be suspicious of anything more than 10% to 15% off the nominal value.
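A trivial way to log the measurements against those tolerance bands. The nominals are the values mentioned above; the measured figures are placeholders to replace with real meter readings:

```python
def deviation_pct(nominal, measured):
    """Signed deviation of a measured value from nominal, in percent."""
    return 100.0 * (measured - nominal) / nominal

# (name, nominal, measured) -- measured values are placeholders
readings = [
    ("10M resistor", 10e6, 10.3e6),
    ("30M resistor", 30e6, 29.1e6),
    ("2000pF capacitor", 2000e-12, 2250e-12),
]

for name, nominal, measured in readings:
    d = deviation_pct(nominal, measured)
    verdict = "within 10%" if abs(d) <= 10 else "suspicious"
    print(f"{name}: {d:+.1f} % ({verdict})")
```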

George
 
So I finally went back to this VTVM to try to fix the uneven frequency sensitivity on the 3V range and up. Before doing the measurements George suggested, I tried to correct the frequency discrepancy with the 30pF trimmer. The trimmer appears to be OK, as the needle moves smoothly when it is turned. On the 3V range, I took 1V at 1kHz as the reference and tried to attain the same indication at 10kHz by turning the trimmer. I managed it, but this disturbed the 1kHz 1V indication, since the trimmer also affects the lower frequency, though to a much lesser extent. Next I tried to equalize the needle position at both frequencies so that there was no discrepancy, but then both readings were off the 1V mark.

In the end I didn't figure it out and simply calibrated the 3V+ ranges for a perfect indication at 1kHz, since that is what I use the most. Still, the device is only half-working. Next step: verify the values of the components in the input filter. Will report back.
 
The input divider can't be 10M(eg) to 30 (micro? kilo?). A rough thumb-count says the lower part is 33.33k.

Separate the problem of the lows from the highs. Use as low a test tone as you can read well, even 50Hz. (The 2000pFd/33K crossover is about 2,500Hz, and you want to be 10 or 100 X away from that for few-% accuracy.) Check or trim the input divider resistors. When you can cross the 1V to 3V point at 50Hz accurately, then test at 50kHz-100kHz and trim the capacitor.
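That corner figure is just 1/(2*pi*R*C). A back-of-envelope check with the assumed 33.33k lower leg and the 2000pF capacitor:

```python
import math

R = 33.33e3    # assumed lower-leg resistance (thumb-count, not confirmed)
C = 2000e-12   # input capacitor from the schematic

f_c = 1 / (2 * math.pi * R * C)
print(f"corner frequency ~ {f_c:.0f} Hz")

# Test tones should sit 10x to 100x away from the corner for few-% accuracy.
for f in (50, 50e3, 100e3):
    print(f"{f:>7.0f} Hz is {f / f_c:6.2f} x the corner")
```

That lands near 2.4kHz, close enough to the 2,500Hz quoted above given component tolerances; 50Hz sits roughly 48x below the corner and 100kHz roughly 42x above it, so both test tones are safely clear of it.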
 
Hello PRR, and thanks for your reply. There could indeed be an error in the schematic, as the lower resistor should be 1/100 to 1/1000 of the upper resistor's value. I can already set a correct indication for frequencies up to 2.5kHz between the 1V and 3V ranges; there is no problem there now.
Are you suggesting that I should first work on the voltage divider resistors at the range selector, rather than on the input filter network?
Thank you again.
 