How to protect a device from contamination through its output

Hi All,

Just wondering how you guys protect your "line level" devices from interference, noise, etc. entering the box through the output connectors/cables?

What I know so far is:
- buffer it
- add a resistor in series to decouple the output stage from cable capacitance

I've got a box which incorporates an active filter for the mids/highs and an amplifier (analog class AB) for the mids. The output of the high-pass (HP) section of the filter goes to another amplifier (digital) without any protection, so it's an HP Sallen-Key stage connected directly to an RCA jack.
No wonder I can hear some noise (a "weeping" tone) in the mid channel when connecting the external digital amp. There are no ground loops, and I strongly suspect that some RF is coming into my main box from the digital amp and getting demodulated somewhere inside. (The digital amp has proper RF rejection circuitry at its own input.)

Is there a correct solution I can implement? I expect that adding a buffer plus a 1k series resistor might help, but maybe there is a more elegant way?

Thanks.
 
Hi All,

Just wondering how you guys protect your "line level" devices from interference, noise, etc. entering the box through the output connectors/cables?

What I know so far is:
- buffer it
- add a resistor in series to decouple the output stage from cable capacitance...

That about does it. Use a low output impedance buffer and a 600 ohm series resistor.
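To put rough numbers on the series resistor, here's a quick back-of-envelope calc (all values assumed: ordinary coax interconnect runs somewhere around 100 pF per metre). The corner of the low-pass formed by the series resistor and the cable capacitance lands well above the audio band, so the resistor doesn't cost you any top end, and together with a small shunt cap right at the connector it also makes a useful stopper for RF coming back in.

```python
# Sketch only: corner frequency of the low-pass formed by a series output
# resistor and the cable capacitance it drives. All values are assumptions,
# not measurements (coax interconnect is roughly 100 pF per metre).
import math

def corner_freq_hz(r_ohm: float, c_farad: float) -> float:
    """First-order RC corner: f_c = 1 / (2*pi*R*C)."""
    return 1.0 / (2.0 * math.pi * r_ohm * c_farad)

cable_c = 2.0 * 100e-12                # ~2 m of cable at ~100 pF/m (assumed)
for r in (100.0, 600.0, 1000.0):       # candidate series resistor values
    f_c = corner_freq_hz(r, cable_c)
    print(f"{r:6.0f} ohm into {cable_c * 1e12:.0f} pF -> corner ~ {f_c / 1e6:.1f} MHz")
```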

But I've always wondered why people are even using single-ended interconnects. I can understand this in consumer "mid-fi" equipment that is cost sensitive. But if you care more about the sound, and especially if you are building your own gear, why not use balanced interconnects at +4dBu? This has been the "standard" in professional equipment for a long time. It pretty much eliminates the kind of thing you are talking about if it is common mode. After all, if the signal is not referenced to ground, then noise on the ground does not matter.

So the better answer is to (1) drive the output signal at a higher level and (2) drive two wires in differential mode. Now you find that (1) the signal-to-noise ratio is better simply because the signal is bigger, and (2) noise coupled onto the signal line mostly cancels.
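If a toy numeric illustration of the cancellation helps (the values are made up and don't model any real cable or gear): put the same hum on both legs, take the difference at the receiving end, and the hum drops out while the wanted signal doubles.

```python
# Toy common-mode rejection demo for a balanced pair. Made-up values only.
import math

def rms(xs):
    return math.sqrt(sum(x * x for x in xs) / len(xs))

n = 10_000
t = [i / n for i in range(n)]                                   # one second at 10 kHz
signal = [1.0 * math.sin(2 * math.pi * 440.0 * x) for x in t]   # wanted tone
hum    = [0.5 * math.sin(2 * math.pi * 50.0 * x) for x in t]    # hum hitting both wires equally

hot  = [+s + h for s, h in zip(signal, hum)]     # non-inverted leg plus hum
cold = [-s + h for s, h in zip(signal, hum)]     # inverted leg plus the same hum
diff = [a - b for a, b in zip(hot, cold)]        # receiver takes hot minus cold

residual = [d - 2.0 * s for d, s in zip(diff, signal)]    # whatever is left besides 2x signal
print(f"hum on each leg      : {rms(hum):.3f} V rms")
print(f"hum after difference : {rms(residual):.2e} V rms")   # essentially zero
```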
 
But that just moves the problem into the power amp.

In PA duty, where amps accept +4dBu as nominal input and multiply it up by factors of 20 to 40 times for high audience SPL, there is no problem.

But take that same +4dBu, use the typical 20 to 30 times multiplication, and you will find that the amplifier is clipping on almost all programme material and the speakers will have blown up long ago.
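The arithmetic behind that, with assumed "typical domestic" gains and an 8 ohm load (nobody's actual amp): +4 dBu is about 1.23 V rms, and multiplied by an ordinary power-amp voltage gain it already demands tens to hundreds of watts at the nominal level, before any programme peaks.

```python
# Back-of-envelope gain-structure check. The gains and the 8 ohm load are
# assumed typical domestic values, not taken from any particular amplifier.
DBU_REF_VRMS = 0.7746            # 0 dBu is defined as 0.775 V rms

def dbu_to_vrms(dbu: float) -> float:
    return DBU_REF_VRMS * 10 ** (dbu / 20.0)

v_nom = dbu_to_vrms(4.0)         # +4 dBu pro nominal level, about 1.23 V rms
for gain in (20, 30, 40):        # typical domestic power-amp voltage gains
    v_out = v_nom * gain
    p_out = v_out ** 2 / 8.0     # continuous power that would be needed into 8 ohm
    print(f"gain x{gain}: {v_out:5.1f} V rms at nominal -> {p_out:4.0f} W into 8 ohm")
```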

Power amplifiers generally work best with a low input voltage and sufficient gain to get the output signal to the right level for domestic speakers.

Reducing the gain of a power amplifier generally reduces its audio performance.

Then look back at 100mVac to 500mVac nominal input signals. We find that manufacturers can design single-ended inputs and outputs that are sufficiently robust against interference that noise is generally not a problem. The domestic situation is very different from PA.
 
But that just moves the problem into the power amp.

In PA duty, where amps accept +4dBu as nominal input and multiply it up by factors of 20 to 40 times for high audience SPL, there is no problem.

But take that same +4dBu, use the typical 20 to 30 times multiplication, and you will find that the amplifier is clipping on almost all programme material and the speakers will have blown up long ago.

You have to design the amp for the expected input signal. Of course, if you put the +4dBu signal into a conventional home stereo amp it will be overdriven.

The typical solid-state amp has an absolutely huge open-loop gain of many thousands, and it gets to the desired gain by using negative feedback. Tube amps are a little different, as they use much less feedback. Tube amps will sound better with a lower-gain driver stage, and SS amps can simply use even more NFB. Both techniques will, in theory, improve the sound slightly.
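For anyone who wants the textbook version of that last point, here is the idealised single-number feedback arithmetic (the 50,000 open-loop gain is an assumption, just "many thousands"): closed-loop gain is A/(1+Aβ), and the loop gain Aβ is roughly the factor by which distortion is suppressed, so asking for a lower closed-loop gain leaves more loop gain in hand.

```python
# Idealised negative-feedback arithmetic: a single open-loop gain figure,
# no frequency dependence. The 50,000 open-loop gain is an assumption.
def closed_loop_gain(a_open: float, beta: float) -> float:
    """A_cl = A / (1 + A*beta) for a negative-feedback amplifier."""
    return a_open / (1.0 + a_open * beta)

a_open = 50_000.0
for target in (30.0, 10.0):       # two desired closed-loop gains
    beta = 1.0 / target           # feedback fraction chosen for that gain (ideal case)
    a_cl = closed_loop_gain(a_open, beta)
    loop_gain = a_open * beta     # roughly the distortion-suppression factor
    print(f"target x{target:.0f}: actual x{a_cl:.2f}, loop gain ~{loop_gain:,.0f}")
```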

The proof that balanced lines "work" is that (almost) all of your recorded music, be it vinyl, CD or MP3, was created on equipment that uses this kind of signal. Almost certainly the monitors the recording engineer used were fed this kind of signal.

I don't understand how the lower-gain power amp would have higher distortion; you will have to explain. Likely the math is too complex for a post here, so maybe you have a link?

Thanks for the warning about blown speakers, I'm going to record some vocals today and I'd hate to blow my monitor speakers. They have worked well for years. I must have been lucky.
 
Balanced connections require more complicated circuits, which ought to scare the golden ears who can hear a single resistor, let alone an additional op-amp. Transformers are simpler, but wide range audio transformers aren't cheap. Balanced lines are valuable in pro applications where you may be dealing with long cable runs, multiple power sources and grounds, and high-power interference sources like lighting dimmers.
 
You know I can't recall where I read info like that.

Sometimes I can't recall what I said yesterday.

But one that sticks in my mind is JLH. But I have no idea in which, or how many, article/s he states it.

I think Self & Cordell have also alluded to it. Or, maybe it was more blatant.
 
High input voltages can create more problems with common-mode distortion, so there is no point in increasing the input voltage beyond the point where acceptable S/N is achieved. For domestic purposes with shortish cable runs, something in the hundreds-of-mV region is fine. Something like 500mV means that no line stage is necessary.
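A quick check of the 500mV point, using an assumed (typical) domestic power-amp voltage gain of about 29x and an 8 ohm load: half a volt already drives such an amp to well over 20 W, which is plenty for domestic listening, so there is genuinely nothing left for a line stage to do.

```python
# Quick sanity check of the "no line stage needed" point. The 29x gain and
# the 8 ohm load are assumed typical values, not from any specific amplifier.
def output_power_w(v_in_rms: float, gain: float, load_ohm: float = 8.0) -> float:
    v_out = v_in_rms * gain
    return v_out ** 2 / load_ohm

for mv in (100, 200, 500):                        # nominal source levels in mV
    p = output_power_w(mv / 1000.0, 29.0)
    print(f"{mv:3d} mV in -> about {p:4.1f} W into 8 ohm at full drive")
```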

For stage work with long cables, high audio power, lighting controllers creating buzz etc. and somewhat relaxed fidelity requirements, a higher voltage with balanced connections makes sense. Horses for courses.
 
You know I can't recall where I read info like that.

Sometimes I can't recall what I said yesterday.

But one that sticks in my mind is JLH. But I have no idea in which, or how many, article/s he states it.

I think Self & Cordell have also alluded to it. Or, maybe it was more blatant.

Could be old, from before digital media was invented, and as such outdated by newer technology. Back then you had low-voltage signals to deal with.

Yeah, I remember how it was: fighting for every single dB, and all the noise that came with every step up the ladder :eek:
 