• WARNING: Tube/Valve amplifiers use potentially LETHAL HIGH VOLTAGES.
    Building, troubleshooting and testing of these amplifiers should only be
    performed by someone who is thoroughly familiar with
    the safety precautions around high voltages.

Vintage tube amp Aux In To CD In

Status
This old topic is closed. If you want to reopen this topic, contact a moderator using the "Report Post" button.
Hi, newbie question here.

I have an old Pioneer SA-810 tube amp, and after checking the schematic (thanks, ginger, for the schema) I realised that the aux input level is rated at 170 mV. As is usual for today's media, I have connected a CD player to the aux input. There is a difference in volume when switching from phono to aux. Earlier I thought the phono stage was faulty because its volume output is soft compared to the aux input.

My question: what is the best way to adapt the 170 mV aux input to a 1 V CD output without degrading the sound? Also, what are typical output levels for a modern tuner and cassette player? (Yes, I still have a cassette player.) Will the strong signal affect the tone controls?
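For reference, the attenuation being asked about is simple arithmetic: the ratio of the stated 1 V source level to the 170 mV input rating. A minimal sketch, using only the figures from the post:

```python
v_cd = 1.0     # CD output level stated in the question, volts
v_aux = 0.170  # rated aux input sensitivity, volts

ratio = v_cd / v_aux  # required attenuation, roughly 5.9:1
print(f"need about {ratio:.1f}:1 attenuation")
```

Real CD players often put out more than 1 V (2 V RMS is common), which is why the replies below suggest a somewhat larger reduction.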
 
A too-strong signal will cause extra distortion in the preamp section of the amplifier.

Just make a voltage divider out of two metal-film resistors (four for stereo).

For example, 100 kΩ and 10 kΩ will give you a reduction of 11:1. If you use the very small 1/8 W resistors, you can even fit the divider *inside* an RCA plug.
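The divider math above is easy to check. A minimal sketch using the 100 kΩ / 10 kΩ values from this post (the 2 V source level is my assumption of a typical CD output, not a figure from the thread):

```python
def divider_out(v_in, r_series, r_shunt):
    """Unloaded resistive divider: Vout = Vin * Rshunt / (Rseries + Rshunt)."""
    return v_in * r_shunt / (r_series + r_shunt)

# 100k series over 10k shunt: (100k + 10k) / 10k = 11, i.e. an 11:1 reduction
reduction = (100e3 + 10e3) / 10e3
v_out = divider_out(2.0, 100e3, 10e3)  # assumed ~2 V CD output -> ~0.18 V
print(reduction, v_out)
```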
 
Kavermei is correct about using a resistive voltage divider. However, you should take advantage of the fact that commercial CD players can drive the 10 kΩ IHF "standard" load. Doing so allows the voltage divider to be only lightly loaded by the amp's input impedance, which maximizes fidelity.

Construct the voltage divider from parts whose total resistance is at least 10 kΩ, but not much more.

The suggestion to build the dividers into the RCA plugs at the amp end, out of 1/8 W, 1% tolerance metal-film parts, is excellent.
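The "lightly loaded" point above can be illustrated numerically. A sketch under assumed values: a 9.1 kΩ / 1 kΩ divider (my example of a ~10 kΩ total, not specified in the thread) and an assumed amp input impedance of 100 kΩ:

```python
def loaded_divider_out(v_in, r_series, r_shunt, r_load):
    """Divider output when a load (the amp's input impedance) parallels the shunt leg."""
    r_parallel = r_shunt * r_load / (r_shunt + r_load)
    return v_in * r_parallel / (r_series + r_parallel)

# 9.1k series + 1k shunt, ~10k total as suggested above
unloaded = loaded_divider_out(1.0, 9.1e3, 1e3, 1e12)   # effectively no load
loaded = loaded_divider_out(1.0, 9.1e3, 1e3, 100e3)    # assumed 100k amp input
print(unloaded, loaded)  # loading shifts the ratio by well under 1%
```

A much higher-impedance divider (say 1 MΩ total) would be shifted far more by the same load, which is the reason for keeping the total near the 10 kΩ the CD player can comfortably drive.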
 
Well, in that case it would be simple to connect a 10 kΩ pot, vary it until it sounds right to me, then measure it and replace it with fixed resistors. I was thinking of installing it inside the amp.
Btw, I have checked the specs of various modern amps and found that the sensitivity is around 170 mV, the same as my vintage amp. (My phono input stage might be the problem.) Is my vintage amp's aux input considered typical? What is the difference between sensitivity and rated output level?
 
It depends. If, in your particular amp, the volume control comes before the first amplification stage, then you won't have overload if you keep the volume control low enough. But it is still a hassle, because you can't use the full range of the volume control.

However, if in your amp the volume control comes after a first amplification stage, then that stage can easily be overloaded by the CD player. This will result in more harmonic distortion (which, in tube amps, most people find is not necessarily a bad thing).
 
Since you have the schematic, I would think the most practical and sensible option for good sound would be to eliminate the first gain stage. I could help if I had the schematic, but in general there should be a way to jumper the input you intend to use for CD to the top of the volume control. Eliminating a gain stage will increase transparency, while knocking gain down with more pots will only lose transparency along with the signal.
 