Input impedance of PRE vs sensitivity to disturbance

When you have a very high input impedance on your pre-amp, it is an easy load for the source.

Unfortunately such signals (with hardly any current flowing) can be sensitive to external disturbances (radiated or conducted), which generate noise on the voltage signal at the interlinks.

Current signals, however, are less sensitive to such disturbances.
I have a source with a low output impedance that is able to drive some load, so would it be beneficial to lower the input impedance of the PRE to get some current flowing?

Practical info:
My DAC out = 100 ohm
My PRE in = 500 Kohm
I guess even 10 K input impedance would be fine.

The input stage has a 1 Mohm parallel resistor, so I guess the input buffer itself also has an impedance of about 1 Mohm, which makes the load 500 K.
The parallel resistor is easy to swap for something smaller.
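A minimal sketch of those numbers in Python (the 1 Mohm buffer impedance is my guess above, and the 10 K candidate shunt is just an example):

```python
# Parallel-impedance numbers from this post. The 1 Mohm buffer input
# impedance is assumed (guessed above), not measured.
def parallel(r1, r2):
    return r1 * r2 / (r1 + r2)

r_buffer = 1e6       # assumed input impedance of the buffer
r_shunt = 1e6        # existing parallel resistor
print(parallel(r_shunt, r_buffer))       # 500000.0 -> the 500 K load seen by the DAC

r_shunt_new = 10e3   # example smaller resistor
print(parallel(r_shunt_new, r_buffer))   # ~9901 -> roughly 10 K, the shunt dominates
```

So with a 10 K shunt the buffer impedance hardly matters any more; the shunt sets the load.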

Am I looking for a solution to a non-existing problem, or does it make some sense?
 
There are two ideal ways to transfer an audio signal:
voltage: low impedance source, high impedance load,
current: high impedance source, low impedance load.
The first is what we all do now. The second was, I believe, the old German DIN standard which never really caught on. People in Europe eventually used DIN connectors with voltage signals, before RCA/phono took over the world.

Attempting to use an intermediate arrangement, with source and load impedance of similar magnitude, makes the system very sensitive to the linearity of those impedances, so it is normally best avoided.
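A small sketch of that sensitivity argument, with example impedances I picked for illustration (not values from the thread):

```python
# Voltage divider between source output impedance and load input impedance.
# Compare how much the delivered level moves when the load value changes
# by +/-10 % (standing in for a non-linear or tolerance-dependent load).
def divider_gain(r_source, r_load):
    return r_load / (r_source + r_load)

for r_source, r_load in [(100.0, 10e3), (1e3, 1e3)]:
    nominal = divider_gain(r_source, r_load)
    low = divider_gain(r_source, 0.9 * r_load)
    high = divider_gain(r_source, 1.1 * r_load)
    print(r_source, r_load, round(nominal, 3), round((high - low) / nominal, 4))
# 100 ohm into 10 k: gain ~0.99, moves ~0.2 % for a +/-10 % load change
# 1 k into 1 k:      gain  0.5,  moves ~10 %  for the same load change
```

With a large impedance ratio the transfer barely depends on the exact load value, which is why any non-linearity in that load stays largely invisible.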
 
I know the input impedance of the PRE must be much higher than the output impedance from my DAC.

500 Kohm vs 100 ohm = a ratio of 5000

My suggestion to lower the input impedance to 10 K (as an example) still gives a voltage-driven system, with an impedance ratio of 100, which is actually not uncommon in today's audio setups.

Promoting a little more current flow (2 V RMS over 10 K vs. 500 K) might make it a little more robust or insensitive to external disturbances.
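A rough Python sketch of those numbers, using the 100 ohm DAC output and 2 V RMS quoted earlier:

```python
# Signal current and level change for a 100 ohm source into 500 k vs 10 k.
import math

v_source = 2.0     # V RMS, DAC output level quoted earlier in the thread
r_source = 100.0   # ohm, DAC output impedance

for r_load in (500e3, 10e3):
    i_signal = v_source / (r_source + r_load)                  # current in the interlink
    level_db = 20 * math.log10(r_load / (r_source + r_load))   # divider loss
    print(f"{r_load / 1e3:.0f} k load: {i_signal * 1e6:.0f} uA, level change {level_db:.3f} dB")
# 500 k load: ~4 uA,   about -0.002 dB
#  10 k load: ~198 uA, about -0.086 dB  -> roughly 50x more current, negligible level loss
```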

Who can comment on this ?
 
I know the input impedance of the PRE must be much higher than the output impedance from my DAC.

Maybe, but it is not strictly necessary to respect this. It depends on the type of signal in the wire: current, voltage, or power, as DF96 said. In RF, power is what matters, so equal line, source, and load impedances are sought (75 or 50 R lines).

If your DAC outputs a current, then a lower impedance at the receiver (with respect to the output of the DAC) will be of benefit.
 
That is exactly my question ...

I guess the output circuit of the DAC also has some influence on how well it damps signals not generated by itself. Isn't it more complex than just trusting that the low output impedance will take care of it?

My impression is that the higher the signal current, the less influence any radiated/magnetic disturbance will have on the signal.

If a 1 uA error current has some effect on a 4 uA signal current (2 V RMS / 500 K), the same 1 uA will have much less influence on a 200 uA signal current (2 V RMS / 10 K).
This does not take into account the ability of the DAC to correct the error current.
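The same comparison worked out in Python (the 1 uA error current is the assumed figure from above; the DAC's correcting ability is ignored, as noted):

```python
# Induced error current as a fraction of the signal current, for the two loads.
v_signal = 2.0    # V RMS
i_error = 1e-6    # assumed induced error current (from the example above)

for r_load in (500e3, 10e3):
    i_signal = v_signal / r_load
    print(f"{r_load / 1e3:.0f} k load: error is {100 * i_error / i_signal:.1f} % of the signal current")
# 500 k load: 25.0 %
#  10 k load:  0.5 %
```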

The more I think about it, the more complex it seems to get ;-)
 
I guess the strength of the noise generated by the music signal itself is minimal compared to the externally generated noise from antennas, power cords, etc.

So a preliminary conclusion could be that a higher load current will have more benefits than drawbacks?
 
It would be interesting to modulate a well-filtered high DC current and then carry the audio + DC over a long distance in a cable. The DC then provides a low impedance at high currents (something like the well-known current loop in old digital stuff, say 20 mA).

Keep in mind that the pot's wiper sits at a variable impedance driving the fixed capacitance of the wires, so the bandwidth will vary with the pot setting.
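A rough sketch of what that variation looks like, with an assumed 10 K pot and an assumed 200 pF of cable capacitance (neither value comes from the thread):

```python
# Wiper Thevenin resistance vs pot setting, and the resulting RC corner
# frequency into the cable capacitance. Assumes the pot is fed from a low
# impedance and the wiper is otherwise unloaded.
import math

r_pot = 10e3       # assumed 10 k volume pot
c_cable = 200e-12  # assumed 200 pF of cable capacitance after the pot

for setting in (0.1, 0.25, 0.5, 0.75, 0.9):
    r_wiper = r_pot * setting * (1 - setting)   # worst case R_pot/4 at mid-travel
    f_3db = 1 / (2 * math.pi * r_wiper * c_cable)
    print(f"setting {setting:.2f}: R_wiper {r_wiper / 1e3:.2f} k, -3 dB at {f_3db / 1e3:.0f} kHz")
# Even the mid-travel worst case (2.5 k here) is ~318 kHz with these values,
# but the corner clearly moves around with the setting.
```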
 
Some equipment is sensitive to the interlinks used and some is not, as is sometimes mentioned in equipment reviews.

Is this also related to input/output impedance matching and leakage currents (or the inability to handle these leakage currents), and is that what makes the equipment more or less sensitive?
 
Input sensitivity refers mainly to two conditions:

1) the voltage gain and the limit of the output stage's capability, and
2) the noise floor of the input stage.

Normally in analog audio an impedance mismatch is no problem, whereas it is in RF and video signalling. Reflections in the cabling caused by an impedance mismatch lead to overheating of the driver in RF and to echoes (ghosts) in video.
 
By sensitive I meant in relation to the interlinks ... not signal (gain) sensitivity ...

Is a component sensitive to the interlinks used (sound quality)? Some equipment requires critical matching, while with other equipment the interlinks used have little effect ... think of high-end and cable matching ...
 
In a high-end setup, interlinks will influence the sound.

Some equipment benefits more from changing interlinks, while with other equipment changing interlinks has a smaller effect.

Could impedance matching and/or leakage currents be responsible for that difference in impact?
 
I'm not sure about the question, but the driver's output impedance and the receiver's input impedance form a voltage divider, so the voltage at the receiving end is always less than the open-circuit voltage. Together, by Thevenin's theorem, they give a single resistance in the circuit. The cable capacitance shunts this resistance, giving a susceptance in parallel with the input conductance (the inverse of the input resistance). This forms a frequency-dependent voltage divider: a low-pass filter and a phase delay in the wire, independent of the connectors at the ends, assuming they are well connected.
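A small sketch of that low-pass, reusing the 100 ohm source from earlier and an assumed 200 pF of cable capacitance:

```python
# Corner frequency of the RC low-pass formed by the source/load Thevenin
# resistance and the cable's shunt capacitance.
import math

r_source = 100.0    # ohm, DAC output impedance quoted earlier
c_cable = 200e-12   # assumed cable capacitance

for r_load in (500e3, 10e3):
    r_thevenin = r_source * r_load / (r_source + r_load)   # Rs || Rload
    f_3db = 1 / (2 * math.pi * r_thevenin * c_cable)
    print(f"{r_load / 1e3:.0f} k load: -3 dB at {f_3db / 1e6:.1f} MHz")
# With a 100 ohm source the corner sits near 8 MHz for either load, so this
# particular low-pass is set almost entirely by the source impedance.
```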

Sorry if my English isn't too good.
 