• WARNING: Tube/Valve amplifiers use potentially LETHAL HIGH VOLTAGES.
    Building, troubleshooting and testing of these amplifiers should only be
    performed by someone who is thoroughly familiar with
    the safety precautions around high voltages.

Leak Stereo 20 questions

Status
This old topic is closed. If you want to reopen this topic, contact a moderator using the "Report Post" button.
Administrator
Joined 2004
Paid Member
Accurate vacuum-tube voltmeters, and, for those who could afford them, good scopes from outfits like HP, Tek, Fairchild, etc. I'm sure the UK had at least one comparable instrument manufacturer, and metrology labs with high-accuracy standards to calibrate them. It wasn't totally the stone age, ya know... :D
 
Ex-Moderator
Joined 2003
It's those doggone transformers...

The problem was not so much measuring to 0.1dB - there were test sets with scales graduated to 0.1dB resolution - but that such instruments were designed for professional (balanced) audio, and they inevitably used a transformer for the input balancing. Despite various myths, 600R was not universal, so these test sets also needed to be able to present a high-impedance load (typically 50k), and that's where the problem arises. An input transformer able to present a 50k balanced load was generally not flat to 20kHz +/-0.1dB. It was only when op-amps enabled "electronically balanced" inputs with decent CMRR that test sets magically became flat to 300kHz +/-0.05dB or better.
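As a back-of-envelope aside (my own arithmetic, not from the posts above), the significance of the 50k bridging load is easy to see from the simple voltage-divider formed by the source's output impedance and the test set's input impedance. A sketch:

```python
import math

def loading_loss_db(z_source: float, z_load: float) -> float:
    """Level drop (in dB) when a source with output impedance z_source
    drives a load of input impedance z_load (resistive voltage divider)."""
    return 20 * math.log10(z_load / (z_load + z_source))

# A 600 ohm source bridged by a 50k input loses only about 0.1 dB,
# whereas terminating the same source in 600R loses a full 6 dB:
print(round(loading_loss_db(600, 50_000), 3))  # ~ -0.104
print(round(loading_loss_db(600, 600), 3))     # ~ -6.021
```

Which is exactly why an instrument resolving 0.1dB had to get its 50k input right: the bridging loss alone is already on the order of the scale's resolution.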
 
My old Heathkit AC VTVM ran straight in, single-ended with a scope probe. It's been some decades now since I lost it, but I'm pretty sure it had 0.1dB resolution. It was designed for service people and amplifier clinics, not broadcast/studio use. I'd guess there were equivalent units from Eico, HP, and the rest of the usual suspects.
 
EC8010,

From what I can remember we had General Radio equipment at varsity and early on at the CSIR, all with 600 ohm output impedance, although I am not saying that was universal. The Leak TL12-type circuit with its pentode input would not have suffered much until the source output impedance rose above 10 kΩ. It is these triode-input stages that would need a source impedance spec for tests and amplifier output specs.
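To put a rough number on why a triode input is more sensitive to source impedance than a pentode one: the Miller effect multiplies the grid-anode capacitance by the stage gain, and the resulting input capacitance forms a low-pass filter with the source's output impedance. A sketch using ballpark figures I've assumed for illustration (not Leak's published data):

```python
import math

def miller_input_capacitance(c_gk: float, c_ga: float, gain: float) -> float:
    """Effective input capacitance (farads) of a common-cathode stage:
    grid-cathode capacitance plus grid-anode capacitance scaled by
    (1 + stage gain), per the Miller effect."""
    return c_gk + c_ga * (1 + gain)

def droop_db(f: float, r_source: float, c_in: float) -> float:
    """Response (dB) at frequency f of the single-pole low-pass formed
    by the source impedance and the stage's input capacitance."""
    fc = 1 / (2 * math.pi * r_source * c_in)
    return -10 * math.log10(1 + (f / fc) ** 2)

# Assumed values: a small-signal triode with C_gk = C_ga = 1.6 pF and a
# gain of 60, versus a pentode whose C_ga is ~0.05 pF so Miller barely bites.
c_triode = miller_input_capacitance(1.6e-12, 1.6e-12, 60)    # ~99 pF
c_pentode = miller_input_capacitance(4e-12, 0.05e-12, 120)   # ~10 pF

for name, c in (("triode", c_triode), ("pentode", c_pentode)):
    print(f"{name}: {droop_db(20e3, 10e3, c):+.3f} dB at 20 kHz from a 10k source")
```

With these assumed figures the triode stage is already drooping by several hundredths of a dB at 20 kHz from a 10k source - comparable to a 0.1dB measurement floor - while the pentode's droop is negligible, which is consistent with the point made above about the pentode-input TL12 circuit.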
 