Clarifying Professional Line Level (+4 dBu): Is 1.23 Vrms the Differential Voltage Across a Balanced Connection?

Hi everyone,

I have a quick question about professional line level standards, and I was hoping someone could help clarify this for me.

I understand that +4 dBu is the standard for professional line level, which corresponds to 1.23 Vrms. However, I'm a bit confused about how this applies to balanced connections.

Specifically:
  • Is the 1.23 Vrms the differential voltage between the hot and cold lines in a balanced connection?
  • Or does it mean that each line (hot and cold) carries 1.23 Vrms individually, making the differential voltage 2.46 Vrms?
I’ve also heard that in some cases, the voltage might be split, with 0.615 Vrms on the hot line and 0.615 Vrms on the cold line, but I’m not sure if this is accurate.

Could someone explain how the 1.23 Vrms relates to balanced connections? Is it per line or the total differential voltage?

Thanks in advance for your help!
 
It's the differential voltage. Typically the hot and cold wires carry the same signal in opposite polarity, with the amplitude of each being half the differential. However, "balanced" really refers to both signal wires having equal source impedance, which means it is also possible to have no signal on cold and the full signal on hot.
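
To make that concrete, here's a minimal Python sketch (the 1 kHz tone, the sample rate, and the 1.228 Vrms figure are just assumed values for illustration) comparing the two drive schemes; a differential receiver sees the same level either way:

import numpy as np

# Assumed values: +4 dBu nominal level = 1.228 Vrms differential, 1 kHz sine.
FS = 48_000                      # sample rate in Hz (arbitrary for this sketch)
t = np.arange(FS) / FS           # one second of samples
v_diff_rms = 1.228               # target differential level, Vrms
peak = v_diff_rms * np.sqrt(2)   # peak of the differential sine

# Scheme 1: symmetric drive -- each leg carries half the signal, opposite polarity.
hot_sym  = +0.5 * peak * np.sin(2 * np.pi * 1000 * t)
cold_sym = -0.5 * peak * np.sin(2 * np.pi * 1000 * t)

# Scheme 2: impedance-balanced drive -- full signal on hot, cold held at 0 V
# (only the source impedances are matched, not the voltages).
hot_imp  = peak * np.sin(2 * np.pi * 1000 * t)
cold_imp = np.zeros_like(t)

def diff_rms(hot, cold):
    # RMS of the differential (hot minus cold) signal
    d = hot - cold
    return np.sqrt(np.mean(d ** 2))

print(f"symmetric drive:          {diff_rms(hot_sym, cold_sym):.3f} Vrms differential")
print(f"impedance-balanced drive: {diff_rms(hot_imp, cold_imp):.3f} Vrms differential")

Both cases print roughly 1.228 Vrms, because the receiver only looks at hot minus cold.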
Simon
 
As mentioned above, in a balanced line circuit it's the differential voltage between the two wires. If the signals are ground referenced, each wire could carry half that voltage, but a balanced circuit doesn't depend on ground-referenced voltages. The source could be a transformer with no ground reference and the circuit still works; if you then tried to measure the voltage on one wire relative to ground, you'd see zero. The circuit also works in the "balanced impedance" scheme, where one wire carries the full voltage and the other zero volts, but both have identical source impedances.
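As a rough illustration of why the ground reference doesn't matter, here's a small Python sketch where identical hum (an assumed 60 Hz, 0.5 V pickup, purely for illustration) is coupled onto both legs; the differential input subtracts it away and only the program signal survives:

import numpy as np

FS = 48_000
t = np.arange(FS) / FS
signal = 1.228 * np.sqrt(2) * np.sin(2 * np.pi * 1000 * t)   # 1 kHz program at +4 dBu
hum    = 0.5 * np.sin(2 * np.pi * 60 * t)                    # assumed common-mode pickup

# Symmetric drive plus identical hum on both legs
hot  = +0.5 * signal + hum
cold = -0.5 * signal + hum

received = hot - cold              # what a differential input stage computes
residual_hum = received - signal   # whatever hum is left after subtraction

print(f"received level: {np.sqrt(np.mean(received ** 2)):.3f} Vrms")        # ~1.228
print(f"residual hum:   {np.sqrt(np.mean(residual_hum ** 2)):.2e} Vrms")    # ~0

The subtraction is the whole point of a balanced input: anything common to both wires, including the choice of ground reference, drops out.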

+4 dBu comes from the old +4 dBm standard, which is +4 dB relative to 1 mW into 600 ohms. But since a VU meter is a voltmeter, not a power meter, it's really reading the voltage that would be required to produce 1 mW in a 600 ohm load. To complicate matters, the VU meter itself is a 3900 ohm impedance with a 3600 ohm build-out resistor, for a combined impedance of 7500 ohms, a "bridging load" on a 600 ohm line. So while the meter is calibrated for 0 dBm, the impedance and build-out resistor result in a 4 dB reduction in sensitivity, and the end result is 0 VU = the voltage required for +4 dB over 1 mW into 600 ohms, 1.23 Vrms, which now means +4 dBu. One note: it's not a true RMS meter; it's an average-responding meter calibrated to the RMS voltage of a sine wave.

+4 dBm was an accepted standard in the recording industry, but broadcast systems, in particular those driving circuits leased from telephone companies, used +8 dBm. That +8 dBm standard is long gone, as are leased broadcast lines.
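
For anyone who wants to check the arithmetic, here's a short Python sketch of that conversion (the 600 ohm load and 1 mW reference come straight from the dBm definition; everything else is just the math):

import math

R_LOAD = 600.0          # ohms, the classic telephone-line termination
P_REF  = 1e-3           # watts, the 0 dBm reference power

# Voltage that dissipates 1 mW in 600 ohms: the 0 dBu reference
v_ref = math.sqrt(P_REF * R_LOAD)          # ~0.7746 V
# +4 dB above that reference voltage
v_plus4 = v_ref * 10 ** (4 / 20)           # ~1.228 V

print(f"0 dBu reference: {v_ref:.4f} Vrms")
print(f"+4 dBu:          {v_plus4:.4f} Vrms")

# dBu drops the 600 ohm load requirement and keeps only the 0.775 V voltage
# reference, which is why modern bridging inputs still read levels against it.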