line level

Can someone give me a definitive definition of line level? I need to nail it down, but I can't find any agreement amongst the various web sites. For example, one place says that consumer audio devices have a line level of -10 dBV (0.316 Vrms), while professional audio is around +4 dBu (1.228 Vrms). Yet another place quotes a range from -10 dBu to +30 dBu, which seems totally excessive.
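Just to check my own arithmetic on those two figures, here's a quick Python sketch (the function names are just my own, nothing standard):

```python
# dB-to-volts sanity check: 0 dBV is referenced to 1 Vrms,
# 0 dBu to 0.775 Vrms; both use 20*log10 for voltage ratios.

def dbv_to_vrms(dbv):
    """Convert dBV (re 1 Vrms) to volts RMS."""
    return 1.0 * 10 ** (dbv / 20)

def dbu_to_vrms(dbu):
    """Convert dBu (re 0.775 Vrms) to volts RMS."""
    return 0.775 * 10 ** (dbu / 20)

print(dbv_to_vrms(-10))  # ~0.316 Vrms (the consumer figure)
print(dbu_to_vrms(4))    # ~1.228 Vrms (the professional figure)
```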

Any gurus of the line level out there who can straighten me out?

gene :smash:
 
That is right:
0 dB = 0.775 Vrms.
When it comes to microphone amps and recording and sound studios,
this reference might be in use.

When it comes to CD players and power amplifier inputs,
it is not used much.

The CD output standard is 2.0 Vrms at maximum output.
Most power amplifiers need around 1.0 Vrms to produce maximum power output.

For sound sources that cannot produce an output level of 1.0 Vrms,
we use preamplifiers with a gain of around 4-10 (+12 dB to +20 dB)
to match them to power amplifiers.
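As a rough sketch of that matching arithmetic (Python, illustrative values only):

```python
import math

def gain_db(v_out, v_in):
    """Voltage gain between two RMS levels, in dB (20*log10)."""
    return 20 * math.log10(v_out / v_in)

# A source delivering only 0.25 Vrms into a power amplifier that
# needs 1.0 Vrms for full output wants a voltage gain of 4:
print(gain_db(1.0, 0.25))  # ~ +12 dB, the low end of the 4-10 range
print(gain_db(1.0, 0.10))  # ~ +20 dB, the high end
```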
 
Actually, 0.775 V (into any impedance) is called 0 dBu in order to distinguish it from the older standard of 0 dBm, which claimed to be related to 1 mW into 600 Ohms (hence the "m"). I say "claimed" because genuinely measuring power is quite tricky; what was actually done was to measure the voltage across a 600 Ohm resistor (at the time, sources were 600 Ohms and loads were 600 Ohms, wasting 6 dB of signal).

0 dBu is used by European broadcasters etc. as a reference level, and it is agreed that programme peaks will be controlled to reach a maximum of +8 dBu (Germany allows for +9 dBu). Interestingly, +8 dBu ≈ 2 Vrms.
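You can verify the coincidence quickly enough (a one-liner in Python):

```python
# +8 dBu expressed as a voltage: 0.775 Vrms * 10^(8/20)
print(0.775 * 10 ** (8 / 20))  # ~1.947 Vrms, i.e. very nearly 2 Vrms
```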
 
Hi,

it has been a general consensus for audio equipment that line level = 150 mV.

This is the sort of input sensitivity you'd get on a power amplifier built for
a passive pre-amp, or on the power-amp section of an integrated amplifier with a passive pre-amp.

150 mV is also pretty much the standard input sensitivity of pre-amplifiers.

However, this has been creeping up with the advent of CD, and many
tuners, recorders, etc. produce around 400 mV to 500 mV to bring their
apparent volume nearer to CD levels.

There is no definitive line level - but for consumer audio, -10 dBV is as good as any.
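For reference, here is how 150 mV sits against the two common references (a quick Python sketch, my own function names):

```python
import math

def vrms_to_dbv(v):
    """Volts RMS to dBV (re 1 Vrms)."""
    return 20 * math.log10(v / 1.0)

def vrms_to_dbu(v):
    """Volts RMS to dBu (re 0.775 Vrms)."""
    return 20 * math.log10(v / 0.775)

print(vrms_to_dbv(0.150))  # ~ -16.5 dBV
print(vrms_to_dbu(0.150))  # ~ -14.3 dBu
print(vrms_to_dbv(0.316))  # ~ -10 dBV, the consumer figure quoted above
```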

:) /sreten.
 
moamps said:
All professional equipment at my radio station has a +4 dBm nominal level.

Interesting. My information comes from when I was in the BBC's Communications Department and dealt with analogue music circuits from within the UK and abroad via the Eurovision network. Of course, now that analogue music circuits no longer exist, dBu etc. is less important than dBFS.
 
Oh boy :xeye: - well, just as in my research, you guys are all over the map too. But how would one design around such a loose spec? OK, maybe I need to do some more research and find out what levels various types of equipment output at "line level". Then I can compile a list and see where it stands.

Do any of you have some real data to help start the list? Maybe CD outputs, DVD, tuners (tube and solid state), etc. Lineup stated most CD players output 2 Vrms, so that's a start!

gene
 
That's right, a "Red Book" CD player can produce an absolute maximum output of 2 Vrms, and that's what I design for. Tuners have a nasty habit of being much lower level - 150 mV is likely. Decide what sources you are going to use and design to match them...
 
EC8010 said:
That's right, a "Red Book" CD player can produce an absolute maximum output of 2 Vrms, and that's what I design for. Tuners have a nasty habit of being much lower level - 150 mV is likely. Decide what sources you are going to use and design to match them...


What's "red book" - I'm not familiar with that. Thanks for the info though! Yes, I agree about designing to approprate sources. That said, do all tuners run in that same range? Or perhaps some are higher?
 
EC8010 said:
Actually, 0.775 V (into any impedance) is called 0 dBu in order to distinguish it from the older standard of 0 dBm, which claimed to be related to 1 mW into 600 Ohms (hence the "m"). I say "claimed" because genuinely measuring power is quite tricky; what was actually done was to measure the voltage across a 600 Ohm resistor (at the time, sources were 600 Ohms and loads were 600 Ohms, wasting 6 dB of signal).

Maybe I have misunderstood the dBm unit, but AFAIK the 1 mW into 600 Ohms was only used to derive the defining voltage of 775 mV. The unit dBm was thus actually defined as a voltage, so 775 mV is 0 dBm regardless of the impedance and power.
 
gearheadgene said:
...By appropriately setting the gain from the line to the A/D input, the maximum dynamic range of the A/D is used...

Sound thinking. I would allow for a swing of 5 V, and incorporate a clipping LED or level meter, and an input buffer with gain control. You will never get optimal performance with differing sources otherwise.
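A back-of-envelope sketch of that gain arithmetic (Python; I'm assuming the 5 V swing is peak-to-peak, and the source levels are the ones quoted in this thread):

```python
import math

# Treat the converter's usable swing as 5 V peak-to-peak; for a sine
# wave that is 5 / (2 * sqrt(2)) Vrms at full scale.
ADC_FULL_SCALE_VRMS = 5.0 / (2 * math.sqrt(2))

# Maximum output levels quoted in this thread (illustrative):
sources = {"CD player": 2.0, "tuner": 0.150}

for name, v_max in sources.items():
    gain = ADC_FULL_SCALE_VRMS / v_max
    print(f"{name}: gain x{gain:.2f} ({20 * math.log10(gain):+.1f} dB)")
# CD player: gain x0.88 (-1.1 dB) - slight attenuation
# tuner: gain x11.79 (+21.4 dB) - hence the need for a gain control
```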
 
Christer said:
Maybe I have misunderstood the dBm unit, but AFAIK the 1 mW into 600 Ohms was only used to derive the defining voltage of 775 mV. The unit dBm was thus actually defined as a voltage, so 775 mV is 0 dBm regardless of the impedance and power.

No. dBm specifies 1 mW into 600 Ohms (for audio). The fact that actually measuring 1 mW is difficult was not the problem of the standard. If you use audio dBm, you are explicitly stating 600 Ohms. If you want to use the 0.775 V voltage and explicitly ignore impedance, you use dBu.
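To make the distinction concrete, a quick Python sketch of my own (not from any book):

```python
import math

def dbm_600_to_vrms(dbm):
    """Audio dBm: power re 1 mW into 600 Ohms, expressed as the
    voltage that power develops across 600 Ohms."""
    power_w = 1e-3 * 10 ** (dbm / 10)  # power ratios use 10*log10
    return math.sqrt(power_w * 600)    # V = sqrt(P * R)

def dbu_to_vrms(dbu):
    """dBu: voltage re 0.775 Vrms, impedance explicitly ignored."""
    return 0.775 * 10 ** (dbu / 20)

print(dbm_600_to_vrms(0))  # ~0.775 V - the same number...
print(dbu_to_vrms(0))      # ...but only because R happens to be 600 Ohms
```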

I'm sorry to be pedantic about this, but I've had to suffer test gear labelled in dBm when it should have been dBu, and conversely I've had to teach students how to correctly use test equipment that genuinely could source and terminate in 600 Ohms. The "Audio Measurement Handbook" by Bob Metzler (of Audio Precision) goes into further detail.
 