Input and output impedance

I know it's a silly question, but I still want to get the right answer.

Why should we design the output impedance of the source (e.g. a DAC) to be as low as possible, while the input impedance of the amplifier should be as high as possible? I know it's about the problem of distortion, but can someone explain the theory behind it?

Also, why is the input impedance of a speaker commonly designed to be 8 ohms? Why not design a high input impedance speaker (300 ohm/600 ohm), like headphones?
 
If the source has a lower impedance than the input it drives, there is less chance of a drive problem: the source can deliver the current needed to keep, say, the edges of a square wave nice and clean. Load the source too heavily and those edges become rounded, because the source struggles to supply the current required to keep the waveform sharp.
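
To put rough numbers on that edge-rounding, here is a minimal Python sketch (all component values are assumptions, not figures from this thread): the source output impedance and the load/cable capacitance form an RC network, and the 10-90% rise time of a single-pole RC step response is about 2.2*R*C.

```python
# Sketch: how source output impedance plus load capacitance slows square-wave edges.
# 10-90 % rise time of a single-pole RC is roughly 2.2 * R * C (assumed values below).

def rise_time_10_90(r_source_ohms: float, c_load_farads: float) -> float:
    """10-90 % rise time of the RC formed by Zout and the load capacitance."""
    return 2.2 * r_source_ohms * c_load_farads

c_load = 1e-9  # assume ~1 nF of cable plus input capacitance

for r_out in (100, 1_000, 10_000):  # example source output impedances, ohms
    t_us = rise_time_10_90(r_out, c_load) * 1e6
    print(f"Zout = {r_out:>6} ohm -> square-wave edges rise in ~{t_us:6.2f} us")
```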
There are speakers with impedances up to 32R. Headphones of 600R are not uncommon, but a loudspeaker is another matter: producing 32 W into a 300R loudspeaker takes roughly 100 V of drive at about 330 mA, and 100 W would require voltages that are dangerous on speaker wiring. Not practical, and no longer Class 2 wiring because of the voltages involved.
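
For reference, a quick sketch of the arithmetic behind those figures, assuming sine-wave RMS values:

```python
# Drive requirements for a given power into a given speaker impedance:
# V = sqrt(P * R), I = V / R (sine-wave RMS values assumed).
import math

def drive_requirements(power_w: float, z_ohms: float):
    v_rms = math.sqrt(power_w * z_ohms)
    i_rms = v_rms / z_ohms
    return v_rms, i_rms

for power, z in ((32, 8), (32, 300), (100, 300)):
    v, i = drive_requirements(power, z)
    print(f"{power:>3} W into {z:>3} ohm: {v:6.1f} V RMS at {i * 1000:5.0f} mA")
```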
 
AX tech editor
Joined 2002
Paid Member
I think you will find that the field coil was 800R and used as a choke. The speech coil was 15R and driven via an output transformer in the models I worked on in the distant past. I may have missed the odd special, though.

Yes, you missed it ;-) The 9710AM had an 800 ohm voice coil, and there was also a 400 ohm version. My very first amp was a 2 x 807 circlotron driving 800 ohm speakers.

Jan
 
AX tech editor
Joined 2002
Paid Member
I know it's a silly question, but I still want to get the right answer.

Why should we design the output impedance of the source (e.g. a DAC) to be as low as possible, while the input impedance of the amplifier should be as high as possible? I know it's about the problem of distortion, but can someone explain the theory behind it?

It is not specifically about distortion; distortion doesn't really come into it here. It is about driving long cables with high capacitance, or inputs with high capacitance, which can cause premature high-frequency roll-off if the Zout is too high.
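
As a rough illustration of that roll-off, here is a small sketch; the cable capacitance and output impedances are assumed example values, not from any particular product:

```python
# The source output impedance and the cable/input capacitance form a
# first-order low-pass with f(-3 dB) = 1 / (2 * pi * R * C).
import math

def corner_frequency_hz(z_out_ohms: float, c_farads: float) -> float:
    return 1.0 / (2.0 * math.pi * z_out_ohms * c_farads)

c_cable = 5 * 100e-12  # e.g. 5 m of cable at roughly 100 pF/m (assumption)

for z_out in (100, 1_000, 10_000):
    f_khz = corner_frequency_hz(z_out, c_cable) / 1000.0
    print(f"Zout = {z_out:>6} ohm -> -3 dB point near {f_khz:8.1f} kHz")
```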

Jan
 
The output impedance of the source and the input impedance of the destination form a potential divider, which attenuates the signal. Minimising the source impedance and maximising the load impedance keeps that attenuation as small as possible. As Jan says, there is also the matter of driving capacitance.

In addition, it is often the case that the impedances of active circuits are somewhat nonlinear. Making them very large or very small may be easier than making them more linear, and it is just as effective at reducing distortion.
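
A minimal numeric sketch of that potential divider, with assumed example impedances:

```python
# Fraction of the source voltage that actually reaches the amplifier input:
# V_in / V_source = Zin / (Zin + Zout).

def fraction_delivered(z_out_ohms: float, z_in_ohms: float) -> float:
    return z_in_ohms / (z_in_ohms + z_out_ohms)

z_out = 100.0  # assumed source output impedance, ohms
for z_in in (600.0, 10_000.0, 100_000.0):
    pct = 100.0 * fraction_delivered(z_out, z_in)
    print(f"Zin = {z_in:>9.0f} ohm: {pct:5.1f} % of the source voltage arrives")
```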
 
When it comes to getting signals from A to B, there are three particularly useful ways of matching source and input in common use; a small numeric comparison follows the list:
Voltage transfer. Output approximates a short, input approximates open. This puts any cable capacitance in parallel with a short and cable inductance in series with an open, minimizing the effect of both parasitics.
Current transfer. Much like the above but in reverse. Output approximates an open (current source), input is a short (current input).
Impedance matching. Z_in = Z_out*. Mostly relevant at RF (or whenever wavelengths get too small to warrant a lumped component model approximation, which the above two are based on) as characteristic impedance of transmission lines becomes a major issue there, and mismatch causes unwanted reflections. Yields maximum power transfer.
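
Here is the promised comparison, a purely resistive sketch with assumed values: power into the load peaks when R_load equals R_source (the matched case), while the delivered voltage fraction simply keeps rising as the load impedance grows (the voltage-transfer case).

```python
# Compare voltage transfer and power transfer for a resistive source/load pair.
V_SRC = 1.0   # assumed 1 V source
R_SRC = 50.0  # assumed 50 ohm source impedance

for r_load in (5.0, 50.0, 500.0, 50_000.0):
    v_load = V_SRC * r_load / (r_load + R_SRC)   # potential divider: voltage at the load
    p_load = v_load ** 2 / r_load                # power dissipated in the load
    print(f"R_load = {r_load:>8.0f} ohm: V_load = {v_load:.3f} V, P_load = {p_load * 1000:5.2f} mW")
```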
 
Why should we design the output impedance of the source (e.g. a DAC) to be as low as possible, while the input impedance of the amplifier should be as high as possible?
It's all to do with maximum voltage transfer.

If connected to an amp with low input impedance, sufficient current will be drawn from the source to result in a significant proportion of the generated voltage being ‘lost’ across the internal impedance of the source. The result is that only a small proportion of the total voltage generated by the source will be available at the amplifier input.

(Compare this to when the voltage across the terminals of a 9V battery drops as current is drawn from it by a load. Some voltage is ‘lost’ in driving current through the internal resistance of the battery, leaving less than 9V available to the external load.)

To minimise this ‘lost volts’ effect, an amplifier input should draw as little current as possible from the source. This explains why the amplifier input impedance should be much higher than the source impedance.
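
A quick numeric sketch of that 'lost volts' picture, with an assumed internal resistance:

```python
# Terminal voltage of a battery (or any source) under load:
# I = EMF / (R_internal + R_load), V_terminal = EMF - I * R_internal.
EMF = 9.0    # nominal battery voltage, volts
R_INT = 2.0  # assumed internal resistance, ohms

for r_load in (4.0, 20.0, 200.0):
    i = EMF / (R_INT + r_load)
    v_term = EMF - i * R_INT
    print(f"R_load = {r_load:>5.0f} ohm: draws {i * 1000:6.1f} mA, terminal voltage = {v_term:.2f} V")
```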
 

TNT

Member
Joined 2003
Paid Member
It's all to do with maximum voltage transfer.

If connected to an amp with low input impedance, sufficient current will be drawn from the source to result in a significant proportion of the generated voltage being ‘lost’ across the internal impedance of the source. The result is that only a small proportion of the total voltage generated by the source will be available at the amplifier input.

(Compare this to when the voltage across the terminals of a 9V battery drops as current is drawn from it by a load. Some voltage is ‘lost’ in driving current through the internal resistance of the battery, leaving less than 9V available to the external load.)

To minimise this ‘lost volts’ effect, an amplifier input should draw as little current as possible from the source. This explains why the amplifier input impedance should be much higher than the source impedance.

Really... wouldn't you trade a little level for less distortion?

//
 