Impedance Matching - Preamp and Power Amp

Status
This old topic is closed. If you want to reopen this topic, contact a moderator using the "Report Post" button.
I am designing a simple transistor preamp to drive my power amp. The power amp has an input impedance of 75k. So:

1. What should be the input impedance of the preamp, considering it will be connected to an iPod?

2. What should be the output impedance? I assume 75k, since the power amp has an input impedance of 75k, so I guess that should match.

3. Should I decrease the input impedance of my power amp by connecting a parallel resistor? Would that help in any case?

4. Which transistor configuration would you prefer: CE, CB or CC?

Thanks
 
The input impedance of the power amplifier is limited by noise, so it should be as low as practical (i.e. in the region of a few kOhms).
However, this matters mainly for the resistance of the feedback network, since the actual input sees the preamplifier's output impedance in parallel. As a rule of thumb, the best approach is a very low output impedance for the preamplifier, which should then be able to drive quite a low input impedance at the power amplifier. What I have in mind is the output of an operational amplifier (i.e. less than 1 Ohm) for the preamplifier, driving a 2 kOhm input impedance at the power amplifier.
 
For good signal transfer between our equipment modules we use "voltage" transfer.
For good voltage transfer it is better to have a low source impedance at the output of the source and a high impedance at the input of the receiver.
These two give a good transfer if the ratio of source to receiver impedance is greater than 1:5.
e.g.
Source output impedance: 1k ohms.
Receiver input impedance: 5k ohms.
This pair will transfer ~83% of the source's output voltage to the receiver.
Many will try to improve that transfer efficiency by using a higher impedance ratio of >1:10.
I typically aim for 1:20, which transfers ~95% of the voltage.
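The transfer figures above are just the resistive divider formed by the two impedances; a quick sketch:

```python
# Voltage transfer across an interconnect modelled as a resistive
# divider: the receiver sees Zin / (Zout + Zin) of the source voltage.
def voltage_transfer(z_out, z_in):
    """Fraction of the source voltage reaching the receiver input."""
    return z_in / (z_out + z_in)

# 1:5 ratio (1k source into 5k load): ~83% of the voltage arrives.
print(round(voltage_transfer(1e3, 5e3), 3))   # 0.833
# 1:20 ratio (1k source into 20k load): ~95%.
print(round(voltage_transfer(1e3, 20e3), 3))  # 0.952
```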

If one were interconnecting three pieces of equipment (a source with an output impedance of 1k ohms, a selector + volume pot with an input impedance of 10k ohms, and a power amplifier with an input impedance of 75k ohms), then we need to look at each interconnect, assess the impedances at each end, and decide whether they are optimum or need help to perform well.

Source to selector: 1k0 : 10k - OK for both noise/interference and for voltage transfer.
Selector to power amp: ? ohms : 75k
To achieve 1:5, the volume pot attached to the selector should have an output impedance of <15k ohms. This requires a vol pot of <60k; one can use a 20k or a 50k vol pot.
This should be OK for both noise/interference and voltage transfer efficiency.
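The "vol pot of <60k" figure follows from the pot's worst-case output impedance; a minimal sketch, assuming the pot is driven from a relatively low source impedance:

```python
# Worst-case output impedance of a volume pot occurs with the wiper at
# mid-travel, where the top and bottom halves of the track appear in
# parallel: roughly Rpot / 4 (assuming a low-impedance source driving it).
def pot_worst_case_zout(r_pot):
    return r_pot / 4

for r_pot in (20e3, 50e3, 60e3):
    zout = pot_worst_case_zout(r_pot)
    print(f"{r_pot/1e3:.0f}k pot -> worst-case Zout ~ {zout/1e3:.2f}k")
# A 60k pot is the largest value that keeps Zout at or under 15k.
```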

Parasitic capacitances affect performance.
If we use very low impedances, then the parasitic capacitance does very little filtering and very high frequencies pass easily. The 1k:10k link into the selector would work quite well with a short or medium length of coaxial cable for the interconnect.
However, the higher impedance of the second link, 15k:75k, will be affected by the parasitic capacitance. Ordinary coaxial cable of more than a couple of metres will act as a low-pass filter and start to remove some of the extreme treble content of the signal.
If one can avoid very high impedances, then cable capacitance is less of an issue. This also reduces the relative effect of the noise produced by high resistances. So one gains twice by adopting lower impedances at the ends of our interconnects.
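The treble roll-off described above can be estimated from the first-order RC filter formed by the link's source impedance and the cable capacitance. A rough sketch, assuming an illustrative ~100 pF/m for ordinary coax:

```python
import math

# Corner frequency of the low-pass filter formed by the source
# impedance and the cable's parasitic capacitance: f_c = 1 / (2*pi*R*C).
def corner_freq_hz(z_source_ohm, cable_pf):
    return 1.0 / (2 * math.pi * z_source_ohm * cable_pf * 1e-12)

# Low-impedance link (1k source), 3 m of cable (~300 pF): far above audio.
print(f"{corner_freq_hz(1e3, 300)/1e3:.0f} kHz")   # ~531 kHz
# High-impedance link (15k source), same cable: encroaching on the treble.
print(f"{corner_freq_hz(15e3, 300)/1e3:.0f} kHz")  # ~35 kHz
```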
 
1. Input impedance should normally be a minimum of 10x the source's output impedance. In the case of an iPod, which can drive 32 Ohm headphones, anything more than that will be fine. If you one day want to connect a hi-fi CD player or similar, the DIN 45500 norm states that you should not load the source with less than 47 kOhm.
However, if you aim for anything at or above 10 kOhm it will work well with almost everything.

2. Optimum output impedance should be a maximum of 1/10 of the load, in this case 7.5 kOhm or lower, which is easily achieved.
Aim for something like a few hundred ohms.

3. Not necessarily, but it is often a good way to define a known input impedance, and it can also tie the input to a 0 V bias point.
The theory of perfect high-frequency transmission by impedance-matching the sender to the receiving end has never caught on in hi-fi; normally you just make sure not to overload the previous stage's current capability.

4. If you need gain, then use a number of CE stages with degeneration or global feedback, and add a CC stage as an output buffer.
The CB configuration has no obvious advantage in a line-stage amplifier; nobody uses it for this.

If you need inspiration or proven and tested projects, check out these:

Minimalist Discrete Hi-Fi Preamp
Bride of Zen
JLH circuits by Paul Kemble
....or many other projects presented in this forum.
 

ICG

Jrp27 said:
(the four questions above)

  1. High is good; 50k would be fine.
  2. No, it should always be lower than the input impedance of the amp! A low output impedance means the signal will be more stable and will work fine into amps with a low input impedance or through long cables. Aim for ~5k; 10k is still fine. A lot of Japanese high-end preamps are <1k; the TA-E77ES, for example, is 47 Ohm.
  3. No. Don't expect any benefits from it. If you need to adjust the input gain or impedance, I'd suggest using a potentiometer or a voltage divider.
  4. That depends on the circuit and on your preferences and goals (not ours). Op-amps and FETs are fine too, depending on what you want to achieve.
 
attilaunnoita said:
The input impedance of the power amplifier is limited by noise. So it should be as low as practical (i.e. in the region of a few kOhms).
No. Well, not if the power amp has the usual non-inverting configuration, where the input noise is set by the source resistance, not the input resistance. Only in the inverting configuration is the input noise set by the input impedance, which is why the inverting configuration is not normally used for power amps.

In my mind I have the output of an operational amplifier (i.e. less than 1 Ohm) for the preamplifier, connected to 2 kOhms input impedance for the power amplifier.
2k input impedance is very low, and must be considered poor design. An opamp may have a very low output impedance, but it is usual to add an output stopper resistor of a few hundred ohms to help ensure stability.

So, broadly, you want input impedance of the power amp to be typically in the range 10k-50k (or even higher) and the output impedance of the source to be typically in the range 100-1k.

Jrp27 said:
3. Should I decrease the input impedance of my power amp by connecting a parallel resistor .Would that help in any case ?
No.
Nrik said:
3. Not necessarily, but it is often a good way to define an actual input impedance, and may also tie the input to 0 volt bias.
The theories of perfect high frequency transmission by impedance matching the sender with the receiving end has never caught on in hi-fi, and normally you just seek not to overload the previous stage's current capability.
No. No need to deliberately lower the input impedance.

The reason that "theories of perfect high frequency transmission by impedance matching the sender with the receiving end has never caught on in hi-fi" is that it is irrelevant and false for audio unless the cables are extremely long, and then it actually gets more complicated than for RF. RF and audio are different: for RF cables are dominated by L and C and cables are long so impedance matching may matter; for audio cables are dominated by R and C so impedance matching is almost impossible but fortunately cables are short so this does not matter.
 
No. No need to deliberately lower the input impedance.
You are right; in my answer I misread this as a question about the OP's preamp circuit, not his power amp.

The reason that "theories of perfect high frequency transmission by impedance matching the sender with the receiving end has never caught on in hi-fi" is that it is irrelevant and false for audio unless the cables are extremely long, and then it actually gets more complicated than for RF. RF and audio are different: for RF cables are dominated by L and C and cables are long so impedance matching may matter; for audio cables are dominated by R and C so impedance matching is almost impossible but fortunately cables are short so this does not matter.

I agree. However, this would be a perfect and legitimate reason for a lower input impedance.
Instead the audiophools have replaced this with snake oil and mumbo-jumbo technical explanations.
I would prefer it if they had at least taken a liking to transmission-line impedance-matching techniques.
 
Administrator
That is similar to the circuit you were posting the other day, just with different values. Gain is less than 2 (around 1.55) and input impedance is below 40k.

It won't deliver much more than a couple of volts RMS into a 50k load.

Do you even need a preamp ?
 
(attachment: amp1.jpg)
That's wrong.
I asked a few days ago if you wanted to see how to calculate it, and you remained silent.
 
It won't deliver much more than a couple of volts RMS into a 50k load. Do you even need a preamp?

A couple of volts is all I need to fire up the power amp. The power amp has an input sensitivity of 440 mV, so a couple of volts should drive it properly.

Plus, this preamp would also act as a damping unit when the power amp is turned off. Without the preamp the capacitors discharge slowly.
 
(The schematic image originally posted here is no longer available.)
start with your max Vout and max Iout requirement to drive the load.
Let's suppose you want 2Vac into 2k ohms.
That equates to 1mAac
Peak current is 1.4mApk. Assume another 1mApk is required to drive cable capacitances taking you to 2.4mApk
The simple amplifier starts to develop very high 2nd harmonic distortion as the output current approaches 50% of its maximum.
That gives us a target max Vout of 4Vac and a target max Iout of 4.8mApk, of which we only plan to use 50%.

4Vac requires the +12Vdc supply.
4.8mApk requires a bias of ~5mA.
R3 needs to drop ~half the supply when passing 5mA, giving R3 = 6V/0.005 = 1200r.
Your gain is ~2 times. That sets R4 to ~600r (use 560r or 620r). You definitely don't need E96 values.
Assuming the base current to be negligible, the voltage drop across R4 is 0.005 * 560r = 2.8V.
The emitter of T1 sits at ~2.8V.
The base is required to sit at 2.8V plus Vbe; assume Vbe = 650mV.
That places the base at 3.45V.
3.45V is the required bias voltage from the two series-connected resistors R1 & R2 (this is the bit you got wrong in all your examples).

Bias voltage is 3.45V/12V = 28.75% of the supply voltage.
To get 28.75%, take the reciprocal and subtract 1 to get 2.478 (1/0.2875 -1 = 3.478 -1 = 2.478)
if R2=100k then R1 = 100k * 2.478 = 247k8, use 240k
Then starting from the adopted 240k work back through the calculation in the reverse direction to arrive at a prediction for the bias current and the Collector voltage.
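The walkthrough above can be sketched numerically, using the same assumed values (12V supply, ~5mA bias, Vbe = 650mV, 560r adopted for R4):

```python
# Single-transistor CE bias calculation, following the steps above.
VCC = 12.0      # supply voltage (V)
I_BIAS = 5e-3   # collector bias current (A)
VBE = 0.65      # assumed base-emitter drop (V)

# R3 drops about half the supply at the bias current.
R3 = (VCC / 2) / I_BIAS          # = 1200 ohms
# Gain ~2 sets R4 to ~R3/2; adopt the standard value 560r.
R4 = 560.0
V_emitter = I_BIAS * R4          # = 2.8 V
V_base = V_emitter + VBE         # = 3.45 V

# Base divider: fraction of the supply needed at the base sets R1/R2.
frac = V_base / VCC              # = 0.2875
R2 = 100e3
R1 = R2 * (1 / frac - 1)         # ~247.8k; use 240k

print(f"R3 = {R3:.0f}R, Vbase = {V_base:.2f}V, R1 = {R1/1e3:.1f}k")
```

Working back from the adopted 240k (instead of the exact 247.8k) through the same equations in reverse gives the predicted bias current and collector voltage.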

If you want/need lower distortion, then move to a two or three transistor version with global NFB. (there's lots of examples in website tutorials or copy a JLH)
If you want to use a lower load impedance, then adopt a higher bias current (your 7.6mA bias current allows up to 2mAac to your load, i.e. 2Vac into 1k).
If you want/need lower output impedance, then add on an emitter follower/Buffer.

BTW, this same prediction method could be applied to a headphone driver amplifier, as NOT done in the XRK threads. But you do need to take account of the extra current demanded by a moving-coil driver (maybe an extra 50% for fast transients).
 
Thank you. That was very well explained. A few queries, however...

1. Instead of using input resistors to lower the impedance, wouldn't it make more sense to use a CB stage followed by a CE? Wouldn't that also eliminate the need for a higher biasing current?

2. How do you calculate at what output voltage clipping would occur? Is it 2x the max output voltage?
 
In all four of my transistor texts, I've never seen anything like post 18. Thanks Andrew.
Input resistors R1 and R2, IMHO, are there to set the base operating point at 3.45V, as he said. The divider's parallel impedance, (R1*R2)/(R1+R2), is set by how much gain the transistor has and how much base current the desired collector current requires. 50k input impedance is typical of 70's transistors. The input impedance of an amp stage has to be higher than the output impedance of the previous stage, or it loads down the source. With TO-92 transistors typically having a gain of 300 these days, you can get away with an input divider of 1 Megohm and 2.4 Megohm, like I just did on the AX6 copy. I needed 100k input impedance because my 12AX7 tube preamp can't drive a lower impedance than that. Your cell phone has more like a 2 or 3k output impedance; that is, it distorts or the output device overheats if the load is lower than that. The load it sees includes the cable capacitance in parallel with your preamp's input impedance.
The second-harmonic distortion Andrew mentioned is the beginning of clipping. So if you want 2V out, he said one wants the max collector swing to be 4V.
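The divider's parallel impedance mentioned above is easy to check; a quick sketch using AndrewT's 240k/100k divider and the 1M/2.4M divider as examples:

```python
# The bias divider's contribution to input impedance is R1 and R2 in
# parallel (the transistor's own base impedance then appears in
# parallel with that as well, lowering the total further).
def parallel(r1, r2):
    return r1 * r2 / (r1 + r2)

# AndrewT's example divider: 240k over 100k -> ~70.6k.
print(f"{parallel(240e3, 100e3)/1e3:.1f}k")
# The 1M / 2.4M divider mentioned above -> ~706k.
print(f"{parallel(1e6, 2.4e6)/1e3:.0f}k")
```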
 