I haven't found any clear statement of an audio-industry standard input sensitivity for power amplifiers. My survey of various amps turns up a range of sensitivities, such as 1.35 V RMS for Douglas Self's gain-of-21 "Blameless" power amplifier (p. 311, "Audio Power Amplifier Design", 6th Ed.), which can produce 100 W into 8 ohms.
Does the audio industry have any standards on input sensitivity for maximum power output?
Obviously, if there is a standard, then amplifier gain will depend on it. For example, Self's "Blameless" amplifier has a voltage gain of 21, whereas a 200 W design would require a gain of nearly 30 for the same 1.35 V RMS input.
I've found preamp output voltages of up to 4 V RMS, which perhaps implies that power-amp input sensitivities don't vary all that much from Self's 1.35 V number.
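For reference, here's the arithmetic relating load, power, and sensitivity to the required voltage gain (a quick Python sketch of the numbers above):

```python
import math

def required_gain(power_w, load_ohms, sensitivity_vrms):
    """Voltage gain needed to reach full rated power from a given input sensitivity."""
    v_out = math.sqrt(power_w * load_ohms)  # RMS output voltage at full power
    return v_out / sensitivity_vrms

# Self's "Blameless": 100 W into 8 ohms from a 1.35 V RMS input
print(required_gain(100, 8, 1.35))  # ~20.9, i.e. the stated gain of 21

# Same 1.35 V RMS sensitivity, but a 200 W design
print(required_gain(200, 8, 1.35))  # ~29.6, i.e. "nearly 30"
```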
There is not an audio industry standard. Most amps fall between 1 and 2 volts RMS for input sensitivity. This is dictated by a few design choices, one being noise: higher gain increases output noise, so you don't want to set the gain too high.
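To put rough numbers on that: the amp's input-referred noise appears at the output multiplied by the closed-loop gain, so a more sensitive (higher-gain) amp has a higher output noise floor for the same input stage. A minimal sketch, with a 5 µV RMS input-referred noise figure assumed purely for illustration:

```python
def output_noise_uV(input_noise_uV, gain):
    """Output noise floor = input-referred noise times closed-loop gain."""
    return input_noise_uV * gain

for gain in (21, 30):
    print(f"gain {gain}: {output_noise_uV(5.0, gain):.0f} uV RMS at the output")
# gain 21: 105 uV RMS; gain 30: 150 uV RMS -- same front end, ~3 dB worse noise floor
```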
Thank you.
What tradeoffs do people generally make beyond reducing noise?
I well understand the tradeoff between gain and stability, but assuming that good stability is achievable (i.e., stable loop gain), what other things do people worry about?
Commercial designs can be 0.7 or even 0.5 V RMS for full output.
Common input levels to obtain full power in "home" equipment generally range from 1 volt, as stated, up to 2.5 V RMS for some amps I've seen.
The usual standard is 0db, around 1 volt RMS.
Yes, but that was a long time ago, before CD players... now it's all over the place, 0.5 to 2 or more volts.
That's why audio mixers go into the red at 0db.
It's an industry standard.
Most preamps of the past or near past amplify between x10 and x20; for today's sources, a good balance between drive ("juice") and noise is about x6.
I use x5 in my mixer designs for amplifying line level.
For stage use it's good to have the mixer show red at 0 dB, or some expensive speakers are going to get fried.
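To compare the gain figures quoted here, it helps to convert the ratios to dB and see what they do to a nominal consumer line level (the -10 dBV ≈ 0.316 V RMS source level is my assumption, not from the posts above):

```python
import math

def ratio_to_db(gain):
    """Convert a voltage-gain ratio to dB."""
    return 20 * math.log10(gain)

source_vrms = 0.316  # nominal -10 dBV consumer line level (assumed)
for gain in (5, 6, 10, 20):
    print(f"x{gain} = {ratio_to_db(gain):.1f} dB -> {source_vrms * gain:.2f} V RMS out")
# x5 = 14.0 dB, x6 = 15.6 dB, x10 = 20.0 dB, x20 = 26.0 dB
# a x6 preamp turns 0.316 V into ~1.9 V -- enough for a 1-2 V sensitivity power amp
```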
I believe that either 0.707 V RMS or 1 V RMS is the consumer "standard". The problem is that manufacturers do not really hold to it. I have an Adcom amp from the 1990s with an input sensitivity of 1.75 V RMS in bridged mode.
With an analog preamp this was never really a problem in the past, because most of them could put out 5 V RMS or so. These days I have DAC outputs connected directly to amplifiers, and I discovered that some of these only have 1 V RMS output capability! I now choose my DAC a little more carefully in terms of its maximum RMS output voltage.
If you reduce the gain of the amplifier, so that a higher voltage is required to reach the same output power, you have also reduced the noise gain. My guess is that this explains the upward creep in input sensitivity.
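To see the creep argument in numbers, hold the output power fixed and vary the gain: the sensitivity figure rises as the gain drops, while the output noise floor falls in proportion. A sketch, again assuming a 5 µV RMS input-referred noise purely for illustration:

```python
import math

v_out = math.sqrt(100 * 8)  # 100 W into 8 ohms -> ~28.3 V RMS at full power
input_noise_uV = 5.0        # assumed input-referred noise

for gain in (30, 21, 16):
    sensitivity = v_out / gain
    noise_out = input_noise_uV * gain
    print(f"gain {gain}: sensitivity {sensitivity:.2f} V RMS, output noise {noise_out:.0f} uV")
# lower gain -> a higher (less sensitive) input-voltage spec, but a quieter output
```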
The usual standard is 0db, around 1 volt RMS.
0 dB what? dBm, dBu, dBVU, dB SPL? Sorry, pet peeve. dB is just a ratio; it almost always needs a stated reference.
I should have put 0 dBV. It's a long time since I learned about dB (38 years!)
What is dBV?
A logarithmic voltage ratio with a reference voltage of V0 = 1.000 volt ≡ 0 dBV; that is, level in dBV = 20·log10(V / 1 V).
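Tying that definition back to the voltages mentioned in this thread (a trivial Python check):

```python
import math

def dbv(v_rms):
    """Level in dBV = 20*log10(V / 1 V); 1.000 V RMS is 0 dBV by definition."""
    return 20 * math.log10(v_rms)

print(f"{dbv(1.0):+.1f} dBV")    # +0.0 dBV
print(f"{dbv(1.35):+.1f} dBV")   # +2.6 dBV (Self's 1.35 V sensitivity)
print(f"{dbv(0.707):+.1f} dBV")  # -3.0 dBV
```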