Power Amplifier Input Sensitivity

I haven't found any clear statement of an audio industry standard input sensitivity for power amplifiers. My survey of various amps turns up a range of sensitivities, such as 1.35V rms for Douglas Self's gain-of-21 "Blameless" power amplifier (p. 311, "Audio Power Amplifier Design", 6th Ed.), which can produce 100W into 8 ohms.


Does the audio industry have any standards on input sensitivity for maximum power output?


Obviously, if there is one standard, then amplifier gain will depend on it. For example, Self's "Blameless" amplifier has a voltage gain of 21, whereas a 200W design would require a gain of nearly 30 for the 1.35V rms input.
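To spell out the arithmetic: at rated power P into load R, the output voltage is Vout(rms) = sqrt(P*R), and the required voltage gain is just Vout/Vin for a given input sensitivity Vin. A quick sketch (the function name is only for illustration; the numbers are the ones quoted above):

```python
import math

def required_gain(p_rated_w, load_ohms, v_in_rms):
    """Voltage gain needed to reach rated power from the given input sensitivity."""
    v_out_rms = math.sqrt(p_rated_w * load_ohms)  # rms output voltage at rated power
    return v_out_rms / v_in_rms

print(required_gain(100, 8, 1.35))  # ~21  (Self's Blameless: 100 W into 8 ohms)
print(required_gain(200, 8, 1.35))  # ~30  (a 200 W design at the same 1.35 Vrms)
```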


I've found pre-amp output voltages up to 4V rms, which perhaps implies that power amp input sensitivities don't vary all that much from Self's 1.35V number.
 
There is not an audio industry standard. Most amps fall between 1 and 2 volts RMS of input sensitivity. This is dictated by a few design choices, one being noise: higher gain increases noise, so you don't want to set the gain too high.


Thank you.


What tradeoffs do people generally make beyond reducing noise?


I understand the tradeoff between gain and stability well, but assuming good stability is achievable (i.e., an adequately stable feedback loop), what other things do people worry about?
 

I believe that either 0.707Vrms or 1Vrms is the consumer "standard". The problem is that manufacturers do not really hold to it. I have an Adcom amp from the 1990s with an input sensitivity of 1.75Vrms when in bridge mode.

With an analog preamp this was rarely a problem in the past, because most of them could put out 5 Vrms or so. These days I have DAC outputs connected directly to power amplifiers, and I discovered that some of them have only 1 Vrms of output capability! I now choose my DAC a little more carefully in terms of its maximum output voltage (Vout,rms).
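As a rough sketch of the check I now do when pairing a source with an amp (the 1.75 Vrms sensitivity is the Adcom figure above; the 200 W rating and the 1 Vrms DAC output are just illustrative assumptions):

```python
def max_power_from_source(v_source_rms, v_sensitivity_rms, p_rated_w):
    """Power actually reachable when the source's maximum output is below the
    amplifier's input sensitivity. Below full sensitivity, power scales with
    the square of the drive voltage."""
    if v_source_rms >= v_sensitivity_rms:
        return p_rated_w  # source can drive the amp all the way to rated power
    return p_rated_w * (v_source_rms / v_sensitivity_rms) ** 2

# A 1 Vrms DAC into an amp that needs 1.75 Vrms for (say) 200 W rated:
print(max_power_from_source(1.0, 1.75, 200))  # ~65 W, well short of rated power
```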

If you reduce the gain of the amplifier, and therefore need a higher input voltage to reach the same output power, you have also reduced the noise gain, so the amplifier's own input-referred noise is amplified less. My guess is that this explains the upward creep in input sensitivity.
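A minimal sketch of that reasoning (the 2 uV equivalent input noise is a made-up illustrative value, not a measurement): with the same input-referred noise, the output noise scales with the closed-loop (noise) gain, so the lower-gain, higher-sensitivity amp has the quieter output.

```python
def output_noise_uv(e_in_uv, closed_loop_gain):
    """Output noise for a given equivalent input noise; for a non-inverting
    power amp the noise gain is essentially the closed-loop gain."""
    return e_in_uv * closed_loop_gain

e_in_uv = 2.0  # assumed equivalent input noise, in microvolts rms
for gain, sensitivity in [(21, 1.35), (14, 2.0)]:  # both roughly 100 W into 8 ohms
    print(f"gain {gain:>2} (sensitivity {sensitivity} Vrms): "
          f"{output_noise_uv(e_in_uv, gain):.0f} uV rms at the output")
```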
 