Is there a reason to use RCA over BNC connectors to transport analog audio?

If you're designing your system custom from scratch (you're designing the preamp, power amp, etc), and you prefer the locking mechanism of BNC and don't care about the extra cost, is there a technical/quality reason to use RCA instead?

There is the characteristic impedance issue, but are reflections a big deal at audio frequencies if the cable length is made short? I can impedance match at the termination point but at the cost of slightly higher power dissipation, so even that can be taken care of.
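For a sanity check, here's a rough sketch (Python, with an assumed one-metre cable and a typical velocity factor, not anyone's actual setup) of how electrically short an audio interconnect really is:

```python
# Rough electrical-length check: reflections only matter once the cable is an
# appreciable fraction of a wavelength.  Length and velocity factor below are
# illustrative assumptions.
C = 3.0e8                 # speed of light, m/s
velocity_factor = 0.66    # typical solid-PE coax (assumption)
cable_length = 1.0        # metres (assumption)

for freq in (20, 1_000, 20_000):                 # audio band, Hz
    wavelength = velocity_factor * C / freq      # metres
    fraction = cable_length / wavelength
    print(f"{freq:>6} Hz: wavelength ~ {wavelength/1000:.0f} km, "
          f"cable is {fraction:.1e} of a wavelength")
```

Even at 20 kHz a one-metre lead is around a ten-thousandth of a wavelength, so it never behaves as a transmission line; any mismatch-related reflection settles within nanoseconds, far too fast to matter at audio.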

Any other issues I'm not thinking of?
 
because the live pin makes contact before ground does.

Not with the Neutrik Profi RCA connectors. They have a retractable sleeve that makes ground contact before the pin, and breaks it after the pin when unplugged.
I find that even at $10 each they beat a lot of so-called high-end RCAs.

But I do agree that BNCs are better, until you want to mix and match with your friend's system ....

jan
 
I second the notion that if designing from scratch, go balanced.

Impedance matching is a non-issue at audio (unless you are the phone company); pretty much everything audio these days is voltage driven, not power-transfer driven.
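To put a number on that, here is a quick sketch (assumed textbook impedances, nothing measured) comparing old telephone-style 600 ohm matching with modern voltage bridging:

```python
# Voltage delivered to the load for matched vs. bridging connections.
# Impedance values are typical textbook assumptions, not a specific product.
import math

def load_voltage(v_source, z_out, z_in):
    """Simple voltage divider: source EMF into output impedance, then load."""
    return v_source * z_in / (z_out + z_in)

v = 1.0                                        # volts, arbitrary source EMF
matched  = load_voltage(v, 600.0, 600.0)       # old 600R/600R matching
bridging = load_voltage(v, 100.0, 10_000.0)    # low-Z out into high-Z in

print(f"matched 600R/600R : {matched:.3f} V ({20*math.log10(matched/v):.1f} dB)")
print(f"bridging 100R/10k : {bridging:.3f} V ({20*math.log10(bridging/v):.1f} dB)")
```

Matching throws away 6 dB and loads the driver hard; bridging loses well under 0.1 dB, which is why modern line-level gear is voltage driven.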

Note that balanced does not necessarily require much extra at the sending end (a different connector and one extra resistor in the simplest case; impedance-balanced IS a perfectly valid implementation). The receiver can usefully be one of the bootstrapped parts from THAT Corp, or a transformer, or the traditional instrumentation-amp type affair; depends on your taste really.
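As a rough illustration of why the impedance balance is what counts, here is a sketch using the common approximation that the interface CMRR is set by the ratio of the receiver's common-mode input impedance to the imbalance between the two legs (both values below are assumptions):

```python
# Rough interface CMRR from impedance imbalance between the two legs.
# Approximation: CMRR ~ 20*log10(Z_cm / delta_Z), where Z_cm is the receiver's
# common-mode input impedance per leg and delta_Z is the source-side imbalance.
# All values are illustrative assumptions.
import math

def interface_cmrr_db(z_cm, delta_z):
    return 20 * math.log10(z_cm / delta_z)

z_cm = 20_000.0  # ohms per leg, typical active receiver (assumption)
print(interface_cmrr_db(z_cm, 10.0))    # legs matched within 10 ohms -> ~66 dB
print(interface_cmrr_db(z_cm, 600.0))   # one leg left very unbalanced -> ~30 dB
```

That is the whole point of the single build-out resistor: it only has to match the driven leg's source impedance, it does not have to carry an inverted copy of the signal.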

Regards, Dan.
 
It's not only about the connectors. I prefer a balanced amplification topology over an unbalanced one. In every section / module / stage. From the source to the speaker. Well, you need sources with balanced outs to really benefit from such a system 😉
 
Actually that (fully balanced topology) is problematic in itself.

Consider that you need good CMRR at the input; the best way to get that is to unbalance right next to the connector, thereby removing the common mode induced in the external wiring, then re-balance if your following stages need it.

Going literally fully balanced is asking for relatively poor CMRR because the balance of everything ends up mattering; it is much easier to get 60+ dB of CMRR if the input stage is the only part that needs the 0.1% parts.
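For a feel of the numbers, a quick sketch using the usual worst-case expression for a four-resistor differential receiver, CMRR ≈ (1 + G)/(4t) with gain G and fractional resistor tolerance t (an approximation; typical units do better than worst case):

```python
# Worst-case CMRR of a simple four-resistor differential receiver, limited by
# resistor tolerance: CMRR_min ~ (1 + G) / (4 * t).  Values are illustrative.
import math

def worst_case_cmrr_db(gain, tolerance):
    return 20 * math.log10((1 + gain) / (4 * tolerance))

print(worst_case_cmrr_db(1, 0.001))   # unity gain, 0.1% parts -> ~54 dB
print(worst_case_cmrr_db(1, 0.01))    # unity gain, 1% parts   -> ~34 dB
```

Cascade several nominally balanced stages and all of those tolerances stack up; unbalance once at the input and only one stage needs the good parts.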

Regards, Dan.
 
Not sure I really agree. If we allow that nominal line level is a couple of volts or so at clipping, and we are typically running some 40 dB or more below clipping at the feed to the power stage, then our normal sort of listening levels (when not cranking it) are as little as 20 mV or so on the line between the preamp and the power stage.

Now if I want induced noise to be at least 60 dB below my nominal listening level, then after taking the CMRR into account I need to be looking at well under 10 uV residual; CMRR still matters.

Now listening to a classical CD of the not-totally-loudness-war-victim sort, a pp passage may be at least 40 dB down on full scale even before the volume control gets involved; turn the volume down a bit and 60 dB down is not impossible, and we still should not hear hum.
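Putting that budget into numbers (same assumed levels as above):

```python
# Back-of-envelope noise budget for the preamp -> power-amp link, using the
# assumed levels from the post above (2 V at clipping, ~40 dB headroom in use).
def db_to_ratio(db):
    return 10 ** (db / 20)

clip_level   = 2.0                              # volts at clipping (assumed)
listen_level = clip_level / db_to_ratio(40)     # typical listening level
noise_target = listen_level / db_to_ratio(60)   # hum/noise 60 dB below that

print(f"listening level on the line : {listen_level * 1e3:.0f} mV")   # 20 mV
print(f"induced noise must be below : {noise_target * 1e6:.0f} uV")   # 20 uV

# A quiet (pp) passage sitting a further 40 dB down is only ~200 uV on the
# line, so the acceptable hum floor drops further still.
quiet_passage = listen_level / db_to_ratio(40)
print(f"pp passage level            : {quiet_passage * 1e6:.0f} uV")  # 200 uV
```

Allow for quiet program material and the volume control position and the acceptable residual quickly drops well under 10 uV, hence CMRR still matters.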

Now maybe it is that I hang around theatres where there are **MANY** kW of phase-angle-controlled lighting, but if you want a big rig to be quiet then CMRR really does matter.

Regards, Dan.
 
I built a couple of phono stages with BNC in the early 90s. The biggest problem I found was getting a good earth connection with standard audio cables, as BNCs are designed for specific coax. I never quite worked out which size would be right and was too embarrassed to ask the techs to do it for me. Serves me right for not buying the right cable, but all the RF coax at work was steel-cored and stiff, and at the time I couldn't find what I needed in small quantities. If I were doing it again I would probably run everything on XLR from the phono stage up, even if single-ended.
 
Not sure I really agree. If we allow that nominal line level is a couple of volts or so at clipping, and we are typically running some 40 dB or more below clipping at the feed to the power stage, then our normal sort of listening levels (when not cranking it) are as little as 20 mV or so on the line between the preamp and the power stage.

Now if I want induced noise to be at least 60 dB below my nominal listening level, then after taking the CMRR into account I need to be looking at well under 10 uV residual; CMRR still matters.

Now listening to a classical CD of the not-totally-loudness-war-victim sort, a pp passage may be at least 40 dB down on full scale even before the volume control gets involved; turn the volume down a bit and 60 dB down is not impossible, and we still should not hear hum.

Now maybe it is that I hang around theatres where there are **MANY** kW of phase-angle-controlled lighting, but if you want a big rig to be quiet then CMRR really does matter.

Regards, Dan.

About 3 years ago I had a setup with a Cambridge Audio DAC Magic (balanced low-impedance line out), connected to a balanced pre-amp (unity gain - subsonic filter, volume control), connected to a balanced power amp (29 dB). Never ever had CMRR issues. Dead silent. Great sounding. No many-kW phase-angle-controlled lighting here, though 🙂
 
You guys are missing an important detail here.

If you design with BNCs, then nobody else can try or use your equipment, nor can you use anyone else's. Not without the imposition of an RCA-to-BNC adapter.

And, if you're building THAT much for yourself, why use connectors at all? Solder directly!

_-_-
 
You guys are missing an important detail here.

If you design with BNCs, then nobody else can try or use your equipment, nor can you use anyone else's. Not without the imposition of an RCA-to-BNC adapter.

And, if you're building THAT much for yourself, why use connectors at all? Solder directly!

_-_-

Not entirely true. He can come over to my place and try it out on my stuff, and vice versa. 😀

This is the same old adage of Betamax not working in VHS. Sure, RCA is the most common connector out there, but a BNC can easily be soldered into the same hole, or a new hole can be made specifically for it.

If we're going to improve things in audio then why not start with BNC? It's a physically superior connector (not everyone thinks this, though), capable of accepting multiple types of coax, and is just about as cheap as RCA connectors.

If we really wanted to be ritzy we would use N connectors, but that is such a wide connector I'm not sure it would satisfy everyone's needs the way RCA does. At least with N connectors and such huge plugs you can use RG-8, RG-165 or RG-213 (seen the size of RG-213 lately?). The only problem is that coax that thick isn't exactly the most common thing around worldwide, so I settled on high-quality RG-6 quad.

It's not like I'm going to take a stroll through a supernova anytime soon, so that level of shielding isn't really necessary, but RG-6 gave me a benchmark and the satisfaction of knowing that I'm doing the best I can for the signal.

What's sillier to me is going balanced XLR for everything, because only one piece of equipment that I own has it and I know of no one around here who does either. I also think that converting from RCA to BNC is useful if you have someone's CD player and they want to try it out on your system; I doubt you can convert from RCA to balanced XLR without a lot more fiddling than simply plugging something in.

I use copper-cored RG-6 quad with BNC for everything, and I'm sure I'm not the only one.
 
I second the notion that if designing from scratch, go balanced.

Depends what you're doing really; for most purposes balanced is utterly pointless, and phono is perfectly fine.

Impedance matching is a non-issue at audio (unless you are the phone company); pretty much everything audio these days is voltage driven, not power-transfer driven.

Certainly a 'non-issue', as you almost always don't want impedance matching; it's a 'bad' thing 😀 (apart from RF, of course).
 
Thanks for bringing us back to "earth", bear. The OP has not raised the issue of balanced lines anyway, and from his previous posts I don't think the high-end commercial sources you need to use with them are a likely consideration.

I hear more guys admiring the techno look of BNC connectors than any benefit they confer, but as far as problems with hot-swapping RCA connectors go, you won't get much improvement by using BNCs, which can still break the shield before the centre pin when you disconnect.

As posted earlier, BNC, or rather BNC cable, impedances apply to RF circuits, and the mismatch you might have at line-level audio frequencies is of no consequence. RCA connectors with ordinary shielded leads have a characteristic impedance that could be an issue too, but since we don't discuss it or see it obsessed over by audio nuts, it isn't a problem - unless we want more O/T, of course. 😉
 
Certainly a 'non-issue', as you almost always don't want impedance matching; it's a 'bad' thing 😀 (apart from RF, of course).

I'm not entirely sure that is true for digital signals.

If there is RF generated in the receiver chip or in the source, then wouldn't we want it coupled properly to the next circuit, where it can be carefully shunted to ground by a capacitor?

I'm thinking of using three independent RP-SMA cables with RG-316 for short (10 cm or less) interconnections between receiver chips and DAC chips to carry I2S...
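As a rough check on whether termination matters over 10 cm, here's a sketch using the usual rule of thumb that a line needs treating as a transmission line once the one-way delay approaches a fraction (say a sixth) of the edge time; clock rate, edge rate and velocity factor below are assumptions:

```python
# Rule-of-thumb check on whether a 10 cm I2S coax link needs proper termination.
# All figures are assumptions for illustration (CD-rate I2S, CMOS-ish edges).
C = 3.0e8                    # speed of light, m/s
velocity_factor = 0.70       # RG-316-ish (assumption)
length = 0.10                # metres

one_way_delay = length / (velocity_factor * C)   # ~0.48 ns

bit_clock = 64 * 44_100      # I2S bit clock, 2 x 32-bit slots at 44.1 kHz, Hz
rise_time = 2e-9             # seconds, assumed driver edge rate

print(f"one-way delay : {one_way_delay * 1e9:.2f} ns")
print(f"bit period    : {1e9 / bit_clock:.0f} ns")
print(f"delay vs. rise_time/6 threshold: "
      f"{one_way_delay:.2e} vs {rise_time / 6:.2e} s")
```

With nanosecond logic edges a 10 cm run is right on the borderline, so source-terminating into properly matched coax is cheap insurance, even though the bit period itself is hundreds of nanoseconds.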

I've yet to finish it (still waiting on parts), but I have settled on using the female RP-SMA edge connector for the receiver side and the male RP-SMA edge connector for the DAC side.

Not hard to tell that I like Coax.
 