Tube line level?

I've seen "modern" line level audio spec'd at 1 Vpp (about 354 mV rms for a sine wave), but was there a different standard back in the tube days? I recently restored a 1947 Pilot T-601 FM tuner. Its output appears to be about 15 dB higher than my SS tuners. Also, I notice that my SS equipment doesn't have enough drive for my old tube Pilot amplifier.

BTW, is line level spec'd at full output (0 dB), or at some percentage of full output? If one were designing an audio component, would you set its maximum output for 1 Vpp line level?
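For what it's worth, the Vpp/Vrms relationship and the dB difference between levels are simple to check. Here's a small sketch (the function names are just for illustration) comparing a 1 Vpp sine to the old 0.775 V rms reference:

```python
import math

def vpp_to_vrms(vpp):
    """Sine-wave rms from peak-to-peak: Vrms = Vpp / (2 * sqrt(2))."""
    return vpp / (2 * math.sqrt(2))

def voltage_db(v1, v2):
    """Ratio of two voltages expressed in dB: 20 * log10(v1 / v2)."""
    return 20 * math.log10(v1 / v2)

# A 1 Vpp sine is about 0.354 V rms, not 300 mV.
print(round(vpp_to_vrms(1.0), 3))                    # 0.354

# The old 0.775 V rms reference sits about 6.8 dB above a 1 Vpp sine.
print(round(voltage_db(0.775, vpp_to_vrms(1.0)), 1)) # 6.8
```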

Bobby Dipole
 

bob91343

Member
2010-03-11 10:43 pm
Line level has historically been 0 dBm: 1 milliwatt into 600 ohms, i.e. 0.775 Volts rms. Japanese consumer gear seems to have standardized on about 150 mV.

To answer your last question: these are maximum levels before distortion becomes excessive. In the tube days that meant around 10% THD; with the advent of solid state, it's the clipping level, which works out to about the same figure anyway, since distortion rises from negligible to excessive over a narrow range of signal. That said, there is often some headroom, maybe 3 dB or so, as evidenced by the red zone on most VU meters.
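As a quick sanity check on that reference level: the classic 0 dBm figure comes from V = sqrt(P * R), and with the standard 600-ohm termination, 1 mW gives 0.775 V rms. A minimal sketch (function name is just for illustration):

```python
import math

def dbm_reference_voltage(power_w=1e-3, impedance_ohm=600.0):
    """Voltage across a load dissipating a given power: V = sqrt(P * R)."""
    return math.sqrt(power_w * impedance_ohm)

# 1 mW into 600 ohms gives the familiar 0 dBm line level.
print(round(dbm_reference_voltage(), 3))  # 0.775
```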
 
gain change

My 1966-design Dynakit ST120 power amp had its gain set for the PAS2 tube preamp, with a gain-of-150 input transistor after a 5 µF input cap. The seventies-era "TIP" mod decreased the input transistor's gain to 100 or so for more modern equipment like the PAT4. I have to turn the volume on my 1998 CS800S power amp up pretty high to match the gain of the ST120 with most of the TIP mod applied, but still with the gain-of-150 input transistors. I still use the PAS2 for the best signal-to-noise ratio, although the stuck volume pot is a nuisance.
 