Input sensitivity

Status
Not open for further replies.
I've been investigating input sensitivity levels on amplifiers (mostly integrated amps). Talk about being all over the map! I can tell you what line-level is defined as (sort of). Then I can show you some examples that just don't seem to make sense.

Look at these input sensitivities:
Creek Audio EVO integrated amp: 415 mV (rms or peak? specs don't say)
Creek Audio 5350 integrated amp: 559 mV
Creek Audio Destiny integrated amp: 450 mV
McIntosh MA6300: 250 mV
Rotel RA-1062: 160 mV
Denon PMA2000: 135 mV

The manufacturers seem to indicate these values are for full-scale output. Not even close to one another, are they? Further adding to the confusion, they don't indicate RMS or peak. I suspect it's mixed - at least that would make them somewhat comparable (a sine wave's peak is only a factor of √2, about 3 dB, above its rms - although that's not very comforting).

Next, I looked at a number of DVD players, and they typically specified an analog output of 2.0 Vrms. Well, that doesn't match very well to the input specs of the integrated amps, does it?
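Taking the listed sensitivities at face value (and assuming everything is rms, which the specs don't confirm), a quick sketch shows how much a 2.0 Vrms source overdrives each amp's rated-output input level - i.e., how much attenuation the volume control has to provide:

```python
import math

# Sensitivities quoted above (volts rms, assumed) vs. a typical 2.0 Vrms player output.
source_vrms = 2.0
sensitivities = {
    "Creek EVO":       0.415,
    "Creek 5350":      0.559,
    "Creek Destiny":   0.450,
    "McIntosh MA6300": 0.250,
    "Rotel RA-1062":   0.160,
    "Denon PMA2000":   0.135,
}

for name, sens in sensitivities.items():
    # dB by which a full-scale source exceeds the input needed for rated output.
    excess_db = 20 * math.log10(source_vrms / sens)
    print(f"{name:16s} {sens * 1000:4.0f} mV -> {excess_db:4.1f} dB excess drive")
```

The spread works out to roughly 11 to 23 dB of "excess" drive across these six amps, which is exactly the inconsistency being complained about.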



One A/V receiver I found has a user-selectable reference for the analog input source (1V, 2V or 4V - again, not clear if it's rms or peak). Clearly, they solved this problem by adjusting the system for maximum dynamic range by input type. Nice idea.

BTW, stand-alone amps also seem to have varying input sensitivities.

Can anyone add some insight? Seems to be a pretty universal problem. Why do all these manufacturers have such divergent specs? Is there no standard? Other than the adjustable-sensitivity method, how could anyone design equipment that is anything other than a compromise, under these conditions?

gene
 
If the power rating is defined in terms of rms, then the sensitivity most likely is as well. Integrated amplifiers have an additional volume-control/line stage; depending on the gain of that stage, you will have different sensitivities. Most common is an additional gain of 10; some might have less.
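The point above can be sketched numerically. Assuming an 8-ohm load and rms values throughout (both assumptions, since the specs don't say), sensitivity is just the rms output voltage at rated power divided by the total voltage gain:

```python
import math

def sensitivity_vrms(rated_power_w: float, load_ohms: float, gain_db: float) -> float:
    """Input (Vrms) needed to reach rated power, given total voltage gain in dB."""
    v_out = math.sqrt(rated_power_w * load_ohms)  # rms output volts at rated power
    return v_out / (10 ** (gain_db / 20))

# Example: 80 W into 8 ohms. A power amp with 26 dB gain plus an extra
# line-stage gain of 10x (20 dB) ends up with a ~127 mV sensitivity:
print(sensitivity_vrms(80, 8, 26 + 20))
```

With that extra 10x line stage, the result lands near the old ~150 mV quasi-standard mentioned below; without it, the same amp would need about 1.27 V to reach rated output.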
 
Hi,

Input sensitivities on hi-fi amplifiers had almost standardised at 150mV.

Then CD players came along and outputs of tape players and tuners
and the like started to increase to match CD, something like 500mV.

Generally not a major problem if you attenuate the CD players output.

Note a lot of preamps are now passive, the power amplifier itself
having high sensitivity, negating the need for any line stage gain.

🙂/sreten.
 
sreten said:
Hi,

Input sensitivities on hi-fi amplifiers had almost standardised at 150mV.

Then CD players came along and outputs of tape players and tuners
and the like started to increase to match CD, something like 500mV.

🙂/sreten.

Go back and read my notes on input sensitivity. I found 'real-world' integrated amps with sensitivities that are very inconsistent with one another.

Where is the attenuation you talk about?
 
I think that is what sreten is saying; at one time the amps came close to having a standard, 150mV. But things never really settled down.

In the real world it "doesn't matter" because you can always turn down the volume - that's the attenuation.

Of course you usually want an amp to have more gain than you need, so that weak signals can be played loud. But that comes with trade-offs, e.g. added noise. Most amps have far more gain than most of us ever need - that is also true in the pro audio world.

When I build an amp I try to keep the gain as low as practical - so often end up around 1V rms for full output. I like my normal listening to be at about 1:00 or 2:00 o'clock on the volume knob.
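The "as low as practical" choice above can be put into numbers. Assuming a hypothetical 50 W / 8 ohm amp (those figures are illustrative, not from the post), a 1 Vrms full-output sensitivity implies about 26 dB of voltage gain:

```python
import math

def gain_db_for_sensitivity(rated_power_w: float, load_ohms: float,
                            sens_vrms: float) -> float:
    """Voltage gain (dB) that makes sens_vrms drive the amp to rated power."""
    v_out = math.sqrt(rated_power_w * load_ohms)  # rms output volts at rated power
    return 20 * math.log10(v_out / sens_vrms)

# 50 W into 8 ohms is 20 Vrms out; 1 Vrms in -> 20x, i.e. ~26 dB of gain.
print(round(gain_db_for_sensitivity(50, 8, 1.0), 1))
```

Compare that with the ~46 dB total a "sensitive" 150 mV integrated needs: the low-gain design throws away roughly 20 dB of unneeded gain, and with it some noise.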
 
I got some feedback from a couple of manufacturers. Also, I looked deeper into the McIntosh data sheet. Here's how it works:

1. They spec input sensitivity as the minimum voltage level (rms) to produce the rated output power. This would have to be with the gain (volume knob) at maximum.
2. They spec the maximum input before overload. This must be the largest input level, rms, that still produces the rated output power without distortion. This would have to be with the gain set to minimum (0 dB?). I think the amp would have to also have -gain as well, so the volume can be reduced😎

Yeah, I've often read 1Vrms input - seems like a decent rule of thumb. Yet, many of the integrated amps I listed earlier spec the overload at 5Vrms or more! Pretty big range.
 
There are some other considerations for input sensitivity. Say you want to bi-amp. Then you want all amps to have the same gain. Thus a 1000W amp would have a different sensitivity from a 100W amp.
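A quick sketch of that point, assuming 8-ohm loads and the 26 dB gain figure mentioned just below (both assumptions for illustration): at equal gain, the bigger amp necessarily needs a larger input to reach rated power.

```python
import math

GAIN_DB = 26.0               # same voltage gain for every amp in the chain
gain = 10 ** (GAIN_DB / 20)  # ~20x

for power_w in (100, 1000):
    v_out = math.sqrt(power_w * 8)  # rms output volts at rated power into 8 ohms
    sens = v_out / gain             # input (Vrms) for rated power
    print(f"{power_w:5d} W amp: {v_out:5.1f} V out, sensitivity {sens:.2f} V")
```

So two amps with identical gain (which is what matters for matched levels in a bi-amped or active-crossover system) end up with sensitivities of roughly 1.4 V and 4.5 V - the spec sheets would look "inconsistent" even though the amps are perfectly matched.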

Even when a mfg advertises 26dB gain, the actual gain may be different, for commercial reasons.

I'd say since the Tower of Babel, standardization is tough. Why can't the Chinese speak English so we understand what they say? In my area schools are offering Chinese classes because they're four times more numerous than the USA.
 
soongsc said:
I always thought power amp sensitivities were more or less 1V.
😕 Some kits that I looked at were around 1~1.5V.

Hi,

If you want to build a power amplifier that can be used with a passive
pre-amplifier (i.e. miss out the line stage gain in a pre-amplifier) you
build a power amplifier with around 100mV to 200mV input sensitivity.

🙂/sreten.
 
gearheadgene said:
2. They spec the maximum input before overload. This must be the largest input level, rms, that still produces the rated output power without distortion. This would have to be with the gain set to minimum (0 dB?). I think the amp would have to also have -gain as well, so the volume can be reduced

OK, let's look at it this way. An amp has voltage gain. If you input 1 volt and you get 2 volts out, that's a gain of 2, otherwise known as 6dB. Usually it's more than that. So let's take a gain of 10X, or 20dB.

If your amp had no volume control, then 1 volt in would = 10 volts out. 2 volts in = 20 volts out, etc. Now let us imagine that the amp can supply 25V rms before clipping. About 78 watts. We will call that 25V our maximum power.

What is the amp's sensitivity? Right, 2.5V - for our maximum before clipping. 2.5V in, 25V out. 10X gain.
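The worked numbers above, as a sketch (the 8-ohm load is implied by "about 78 watts" from 25 Vrms):

```python
gain = 10.0    # 10x voltage gain (20 dB), no volume control
v_clip = 25.0  # rms output volts at clipping
load = 8.0     # ohms

power = v_clip ** 2 / load    # 25^2 / 8 = 78.125 W, "about 78 watts"
sensitivity = v_clip / gain   # input that just drives the amp to clipping
print(power, sensitivity)     # 78.125, 2.5
```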

Now let's put a volume control in front of the amp. Probably a simple potentiometer. That's just a voltage divider. We can cut the input voltage in half, or to 1/3, or whatever we want. Thus it will take more input voltage to drive the amp to clipping. But does that really change the amp? No. All it changes is the input signal. But it's the idea of the volume control that confuses us.

So let's say you have cut the input signal in half with the volume control. It would now take 5V to drive the amp to clipping, or to look at it another way, 1 volt in now gives you 5V out - not 10. But the amp has not changed.
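The divider-in-front-of-a-fixed-amp picture can be sketched in a few lines, continuing the 10x / 25 V example:

```python
def output(v_source: float, pot_fraction: float, gain: float = 10.0) -> float:
    """Pot is a voltage divider (0..1) ahead of a fixed-gain amp."""
    return v_source * pot_fraction * gain

print(output(1.0, 1.0))  # full volume: 1 V in -> 10 V out
print(output(1.0, 0.5))  # pot at half: 1 V in -> 5 V out, amp itself unchanged
print(output(5.0, 0.5))  # now it takes 5 V at the source to hit 25 V (clipping)
```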

This must be the largest input level, rms, that still produces the rated output power without distortion. This would have to be with the gain set to minimum (0 dB?).

No, because what good would that do you? Think about it. With the right volume control you could have a 400V input and not clip the amp - only because you've divided the voltage down to below 2.5V. That's how it works.

What you really want to know is how much signal do you need to get the maximum clean output without a volume control? Any signal larger than that you can always attenuate - you can turn it down. Right?

An amplifier is rated at a certain input voltage (rms) for the maximum clean output. That's with the volume control wide open. Any higher input voltage can always be turned down. You start from max, and then go down.


koolkid731 said:
I'd say since the Tower of Babel, standardization is tough.

Funny, I was just reading Genesis chapter 11.... Tower of Babel.

Point well taken on the multi amp setup and gains. If you are running an active filter system, it's nice to be able to set your amps volumes all the same.
 
panomaniac said:

An amplifier is rated at a certain input voltage (rms) for the maximum clean output. That's with the volume control wide open. Any higher input voltage can always be turned down. You start from max, and then go down.

That's exactly what I said, from #1. We agree on the low-end input.

So what do you object to in #2? Maybe it's how the maximum input is handled? A volume control knob varies the gain from some max gain, to 0. That fact alone, doesn't imply any design methodology. For example, I could use an resistor divider at the input. I could also use an active, variable gain circuit. So when does the machine overload? A resistor divider should never overload it's next stage since I can, theoretically adjust down to meet the next stage input levels. Yet all the manufacturers seem to spec a maximum input level. That implies, to me, an active front-end.
 
Hey Gene,
sorry, I guess I didn't understand your post. I thought you were talking about negative gain or measuring max signal at max attenuation or something.

Amp sensitivity is usually measured at full volume, if there is a volume control, because you can always turn it down (as you noted). So with the volume control at maximum, the amp becomes equal to a simple power amp with no volume control.

I'm not sure what you're getting at with the active front end. Yes, there is often a buffer stage on the input. Is that what you mean?

I have seen amps where you can overload the front end before the output stage, but they aren't too common.

Please let me know if I haven't understood.
 
Hi,

The maximum input level causes clipping of the output, not the input stage.

A passive pre-amplifier has theoretically very high headroom, but is a pain to
use at high attenuation levels, as it's very coarse in the pot's initial rotation range.

🙂/sreten.
 
As this thread progressed, I got the bigger picture. That's why it's good to talk this stuff out.

What I was saying about the active front-end is a guess. Because, as spec'd by these guys, the maximum input is X. If the front-end is just a resistor divider, it'll never overload the active stages that follow - just increase the divider ratio, as we've already covered. Something in the design is overloading somewhere, otherwise these guys wouldn't specify a maximum. My guess is the front-end saturates - that's all.

Sreten,
You may be right. But unless we have intimate design knowledge of the unit, it's impossible to say where the distortion comes from. All we have to go on are the 'guzintas' and 'gozoutas' :smash:
 
It's the output stage, as sreten says. Almost always.

There are some amps that will clip in stages before the output does, but they aren't common - and I wouldn't call a design like that "good."

So for practical reasons we're talking about the output stage. How many volts at the input before the output clips/distorts? The other stages have more headroom, so we don't worry much about them.
 