Typical gain for a power amp
Having just finished an amp, I was wondering: what is the maximum allowable gain? My feedback resistors are 39k and 1k. That's a gain of 39, right?
A.k.a. 32 dB.
The required amount of gain will obviously depend on required output power and signal level delivered to the power amp. Or seen the other way round, it determines input sensitivity.
As a rule of thumb, the more gain one stage has, the higher its contributions to noise and distortion will be. Many hi-fi amps have as much as 45 dB of gain after the volume pot, and they're not easy to get dead quiet.
I've tried shedding a bit of light on that in a little blog entry.
Around 30dB is about right.
It does vary with the input sensitivity and the output required.
Let's take a fairly standard 100 W power amp.
In order to produce 100 W into 8 Ω it will need to output approximately 28 V RMS (√(100 × 8) ≈ 28.3 V).
If it is fed from a CD input (typically 2 V RMS), it will need a gain of 20×log(28/2) ≈ 23 dB.
If it is fed from a line source of 100 mV RMS, it will need a gain of 20×log(28/0.1) ≈ 49 dB.
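Those two figures are easy to check numerically. A minimal Python sketch (the function name `required_gain_db` is mine, not anything standard):

```python
import math

def required_gain_db(p_watts, r_ohms, v_in_rms):
    """Gain in dB needed to reach p_watts into r_ohms from a v_in_rms source."""
    v_out = math.sqrt(p_watts * r_ohms)          # required RMS output voltage
    return 20 * math.log10(v_out / v_in_rms)     # voltage ratio in dB

print(round(required_gain_db(100, 8, 2.0)))      # CD input (2 V RMS): 23 dB
print(round(required_gain_db(100, 8, 0.1)))      # 100 mV line input: 49 dB
```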
You have a linear gain of 39, which is equivalent to 20×log(39) ≈ 32 dB.
With a CD input, assuming the amp has the guts to produce it, you will get approximately 50 W into 8 ohms.
It doesn't matter if you have a 2 kW amplifier with a gain of 32 dB; it will still only output 50 W with a 2 V RMS input signal.
KatieandDad, your reasoning is A-OK, but I think there is a typo at the end. A gain of 39× is too high for a 2 V RMS CD input; the amp will clip heavily (unless there's a volume pot, of course).
The usual configuration for a power amp is non-inverting:
The gain is G = 1 + Rf/Rg, so with your 39k and 1k it works out to 40×, or 32 dB.
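For the non-inverting configuration the textbook gain is 1 + Rf/Rg; a quick check with the resistor values from the original question:

```python
import math

r_f, r_g = 39_000, 1_000              # feedback and ground-leg resistors (ohms)
gain = 1 + r_f / r_g                  # non-inverting gain = 1 + Rf/Rg
gain_db = 20 * math.log10(gain)
print(gain, round(gain_db, 1))        # 40.0 32.0
```

So strictly the closed-loop gain is 40×, not 39×, though the difference is only about 0.2 dB.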
As I said. If the amp has the guts to go that high.
If you are talking Pre-Amps then YES you are right.
Power Amps need much more gain in order to achieve their target outputs.
Hopefully my earlier post makes that clear.
For a pre-amp the reasoning is slightly different.
The maximum output voltage is marginally less than +Vcc to -Vcc.
Let's take a standard op-amp that is powered by ±15 V.
Max Vout RMS is 10.6V RMS, probably slightly less.
In which case, with a CD (2 V RMS) signal, the maximum gain before clipping is about 14 dB, or 5×.
For a 100 mV line input it's about 40 dB, or 100×.
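The same headroom arithmetic in Python (assuming the ideal 15/√2 swing; a real op-amp swings a volt or two less):

```python
import math

v_out_max = 15 / math.sqrt(2)         # ~10.6 V RMS, ideal rail-to-rail swing

def max_gain_db(v_in_rms):
    """Largest gain before a v_in_rms source clips the output."""
    return 20 * math.log10(v_out_max / v_in_rms)

print(round(max_gain_db(2.0), 1))     # CD input: 14.5 dB (about 5x)
print(round(max_gain_db(0.1), 1))     # 100 mV line: 40.5 dB (about 100x)
```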
No, it doesn't; it contradicts itself. It states:
A) 100Wrms into 8 ohms with 2Vrms input needs a gain of 23dB
B) 2Vrms input with a gain of 32dB can only produce 50Wrms into 8ohms
Clear as mud ......
Power amplifier input sensitivity is usually around 1 V RMS for maximum output.
That gives you the "standard" gain of 28 (100 W into 8 Ω with 1 V in).
But some are about 3× higher (roughly 10 dB more), needing only 300 mV in for full power.
This allows the use of a passive volume control for line inputs.
B) 32dB, can be considered a case of the above.
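Sketching those two sensitivity figures (1 V and 300 mV in for 100 W into 8 Ω):

```python
import math

v_out = math.sqrt(100 * 8)            # ~28.3 V RMS for 100 W into 8 ohms
for v_in in (1.0, 0.3):               # the two sensitivities mentioned above
    g = v_out / v_in
    print(round(g), round(20 * math.log10(g), 1))
# -> 28 29.0  (the "standard" gain of 28)
# -> 94 39.5  (about 10 dB more)
```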
Sorry, I went the wrong way with the anti-log function.
100 W RMS into 8 Ω does indeed need 23 dB of gain with a 2 V RMS signal.
A 2 V RMS input into an amp with 32 dB of gain would give roughly 790 W RMS, if the amp had the guts to produce it.
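Going back through the anti-log correctly, in Python (the helper name is mine):

```python
def output_power(v_in_rms, gain_db, r_ohms):
    """Output power given input level, gain in dB, and load impedance."""
    v_out = v_in_rms * 10 ** (gain_db / 20)   # anti-log: dB back to a ratio
    return v_out ** 2 / r_ohms

print(round(output_power(2.0, 32, 8)))        # ~792 W (if the amp could do it)
print(round(output_power(2.0, 23, 8)))        # ~100 W
```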
I believe there is an industry standard of about 26 dB for audio power amps. It's a sweet spot that balances output power, loop feedback (to lower distortion), and stability (no oscillation). For example, the gain for the Threshold S/150 is +26.6 dB (no loop feedback), and for the Adcom GFA-535 it is +26.8 dB (with loop feedback). The plus sign signifies no inversion of the signal phase.
Copyright ©1999-2015 diyAudio