Understanding input sensitivity/gain/output power

Hello,

I just learned what input sensitivity and gain are, and it only confused me more. I get that gain is the ratio between input sensitivity and output voltage. Let me try to understand with an example. Let's say we have a computer that can output 0.5 V RMS on its line out with an output impedance of 250 Ω, and an 8 Ω speaker. Now say I get two amplifiers, both with 26 dB of gain, but one rated for 500 watts and the other for 2000 watts. Would they both provide the speaker with the same voltage/wattage? (I am using such high wattage numbers so that we do not become current limited, because I do understand that even if the output voltage is the same in both of them with no load, when you connect them to a load one of them may not have the current to supply that voltage.)

Some calculations I made:
Voltage gain (dB) = 20 × log10(audio output voltage / audio input voltage)
26 = 20 × log10(Vout / 0.5) => Vout ≈ 9.976 V

I = V / R = 9.976 / 8 ≈ 1.247 A
P = V × I ≈ 12.44 W
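
The same arithmetic as a quick Python sketch (assuming an ideal amplifier and a purely resistive 8 Ω load, which a real speaker is not):

Code:
GAIN_DB = 26.0   # amplifier voltage gain, dB
V_IN = 0.5       # line-out level, V RMS
R_LOAD = 8.0     # idealized resistive speaker load, ohms

v_out = V_IN * 10 ** (GAIN_DB / 20)   # 26 dB -> linear ratio of ~19.95
i_out = v_out / R_LOAD                # Ohm's law: I = V / R
p_out = v_out * i_out                 # P = V * I (equivalently V^2 / R)

print(f"Vout = {v_out:.3f} V, I = {i_out:.3f} A, P = {p_out:.2f} W")
# Vout = 9.976 V, I = 1.247 A, P = 12.44 W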

So ~12 W of output from both amps (if one amp was rated for 10 watts, I guess that's when we would get clipping). Why am I getting more powerful amps, then? I should be getting better preamps so that I can feed a higher input voltage to my amp!!11!

I think I am missing something here.
 
Higher power amplifiers often have higher gain and/or sensitivity, for this reason.
They have to be able to be driven to full output by typical sources that would be used with them.

Many preamps and sources will put out at least 2Vrms these days, so it isn't a big problem.
Some weaker sources may indeed need a line stage preamp with gain, and 12dB to 20dB is typical.

Most really high power amplifiers are either bridged or balanced, so they only need
half the input voltage (per phase) for full output, compared to what they otherwise would.
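
To put a rough number on that: bridging doubles the output swing for the same per-phase input, which works out to about 6 dB of extra gain, so the input needed for full output is halved. A small sketch with made-up ratings, not any particular amplifier:

Code:
import math

def input_for_full_output(p_rated, r_load, gain_db):
    # V RMS at the input needed to just reach rated power (ideal amp assumed)
    v_out = math.sqrt(p_rated * r_load)   # V = sqrt(P * R)
    return v_out / 10 ** (gain_db / 20)

# hypothetical 2000 W / 8 ohm amplifier with 26 dB gain
print(input_for_full_output(2000, 8, 26))      # ~6.34 V RMS, single-ended
print(input_for_full_output(2000, 8, 26 + 6))  # ~3.18 V RMS per phase, bridged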
 
Voltage gain (dB) = 20 × log10(audio output voltage / audio input voltage)
I think I am missing something here.
This calculation is right, for voltage gain only.
I can't accept your computer as a reference source, nor the power examples as shown; voltage gain and power amplification are related, yet different in result.
To tidy up: the input voltage is amplified to the output voltage (per the proper equation above). This output voltage is then applied to a load, say an 8 Ω resistance, and the output power is the product of that voltage and the current the load draws. For power ratios, the "20 × log" rule is exchanged for the "10 × log" rule (power versus voltage or current only). There is no general equation relating input voltage to output power without knowing the load.

What's more, loudspeakers do not behave as 8 Ω resistors but as very complex impedances; in extreme conditions (depending on the programme material) they even generate power themselves through electromagnetically induced current, which the power amplifier is expected to 'absorb', since amplifiers are considered to be 'perfect voltage sources'. Not so, alas.
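
A small sketch of the 20 × log versus 10 × log point (assuming, purely for illustration, identical resistive impedances on both sides, the only case where the two rules give the same number):

Code:
import math

v_in, v_out, r = 0.5, 9.976, 8.0   # figures from the calculation quoted above

voltage_gain_db = 20 * math.log10(v_out / v_in)                      # 20*log for voltage ratios
power_gain_db = 10 * math.log10((v_out ** 2 / r) / (v_in ** 2 / r))  # 10*log for power ratios

print(voltage_gain_db, power_gain_db)   # both ~26.0 dB, only because r cancels out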

I'm quite sure some fruitful references will be posted here for you as a follow-up, or you can dive into the various resources available on this platform and beyond.
 
Now say I get two amplifiers, both with 26 dB of gain, but one rated for 500 watts and the other for 2000 watts. Would they both provide the speaker with the same voltage/wattage? (I am using such high wattage numbers so that we do not become current limited, because I do understand that even if the output voltage is the same in both of them with no load, when you connect them to a load one of them may not have the current to supply that voltage.)

I think I am missing something here.
What you are missing is that 500 W amps and 2000 W amps don't have the same gain, typically.
I own a PV-4c that does 200 W/ch into 4 ohms at 0.1% HD, with 1 V full-power sensitivity and a gain of 29 dB.
I own a PV-1.3k from the same era that does 650 W/ch into 4 ohms at 0.1% HD, with 1 V full-power sensitivity and 34 dB gain.
So for the same input voltage, the bigger heavier amp puts out more volts & wattage than the smaller one.
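
For what it's worth, those published specs hang together. A quick sanity check (ideal amplifier and a resistive 4 ohm load assumed):

Code:
def rated_power(sensitivity_v, gain_db, r_load):
    # power delivered when the input is driven to full-power sensitivity
    v_out = sensitivity_v * 10 ** (gain_db / 20)
    return v_out ** 2 / r_load

print(rated_power(1.0, 29, 4))   # PV-4c:   ~199 W (rated 200 W/ch)
print(rated_power(1.0, 34, 4))   # PV-1.3k: ~628 W (rated 650 W/ch)
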
The situation has been muddied since then by watt ratings becoming mythical: numbers that can be sustained for mere milliseconds. Heat sinks and fans no longer count, power supply delivery no longer counts; only volts and amps for a few milliseconds into a resistor, not a speaker, count anymore for the advertised number. The USA FTC used to specify the watt-testing procedure, but they don't care anymore.
These amps I own were produced in the day (mid '90s) when Peavey watt ratings were RMS, 24 hours a day, 7 days a week.
That is why I buy & repair them, instead of buying some class D unicorn that will last weeks instead of decades.
Read the PA forum for some of the pro sound consultants' opinions of current watt ratings. Even top-of-the-line QSCs have been reported to run extremely hot and last only a year or two when run at rated watts nowadays.
 
Seems like I had completely misunderstood the whole thing. I don't know much about amps or electronics in general, and what I do know is fragmented bits of info from one subject to another. I should have guessed by now how much the input signal matters. I just thought that high gain was something to be avoided as it brought more distortion, hence the various mods people do on many crappy class D amps to lower the gain and get better quality.

I am getting a bit off subject now, but I have a follow-up question regarding the input signal and its attenuation.
If all a volume knob does is simply attenuate the voltage of the signal, then I get that an active preamp would also boost the signal a bit. In that case, do I have to consider whether my preamp's output is too strong for my amp to take without burning? If I have a power amp with a volume knob and a preamp, how should I use the two knobs if I want the best quality at a given (as in loud) volume level?
This question derives from my preconception that I should always set the amp's volume knob to max and control the volume with my source's knob. But in practice this isn't always the case, and I find that I get a lot of noise when I do that.
 
First, reduce the amplifier's volume knob until the noise is low enough. Then make sure you can still get enough level using the preamp knob without clipping the preamp output. If not, you will have to raise the amplifier knob some more to reach a compromise.
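
As a rough numerical picture of that compromise (the 2 V preamp clip level and 1 V amp sensitivity below are made-up illustration figures, not specs of any real gear):

Code:
PREAMP_MAX_OUT = 2.0   # V RMS the preamp can deliver before clipping (assumed)
AMP_SENSITIVITY = 1.0  # V RMS for full output with the amp knob wide open (assumed)
amp_knob_cut_db = 6.0  # attenuation dialed in at the amp to tame the noise

# with the amp knob turned down, the preamp must deliver more voltage
needed = AMP_SENSITIVITY * 10 ** (amp_knob_cut_db / 20)   # ~2.0 V RMS
print("OK" if needed <= PREAMP_MAX_OUT else "raise the amp knob a bit")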
 
Sounds logical and simple.

Could you explain what this does, in terms of the whole input voltage thing? If the amp has a fixed gain and a good power supply, why would a volume knob control the noise level even when there isn't any signal coming through? Would it be a bad amp design with a lot of internal noise, or something similar?
 
The preamp line stage is after the preamp volume control, and can make noise, so only the power amp control would reduce it. Sometimes system ground loops will also cause hum/noise that the amplifier's control can reduce.
 
I am getting a bit off subject now, but I have a follow-up question regarding the input signal and its attenuation.
If all a volume knob does is simply attenuate the voltage of the signal, then I get that an active preamp would also boost the signal a bit. In that case, do I have to consider whether my preamp's output is too strong for my amp to take without burning?
First-line quality PA amps like Peavey, Crown, QSC, and Yamaha clamp the input to the power supply with diodes after a resistor. If the input is too hot, the signal is clipped top and bottom at the supply voltage (typically ±15 V) and sounds bad for that reason. If the input is connected to a 75 W guitar amp's speaker output or something, the resistor before the clamp diodes burns up. A $0.003 part.
Consumer amps usually lack this input protection feature.
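
A rough sketch of what such a clamp does to an overdriven input (the values are illustrative; real clamps sit about a diode drop above the rails):

Code:
import numpy as np

RAIL = 15.0                                     # clamp diodes tied to the +/-15 V rails
t = np.linspace(0, 1e-3, 1000)
hot_input = 40 * np.sin(2 * np.pi * 1000 * t)   # speaker-level signal, ~40 V peak

clamped = np.clip(hot_input, -RAIL, RAIL)       # flat-topped at the rails: ugly sound,
                                                # but the input stage survives
print(clamped.min(), clamped.max())             # -15.0 15.0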