power ratings

Status
This old topic is closed. If you want to reopen this topic, contact a moderator using the "Report Post" button.
If an amplifier is quoted to supply 100W into 8ohm, 200W into 4ohm and 300W into 2ohm, how would the power change if it supplies 50W into 8ohm? Does it follow that it will double again into 4ohm so that's 100W, and possibly double again into 2ohm to give 200W since we start with a lower wattage?

And how much power do we really need for an average size bookshelf speaker in an average size living room, listening to modern pop music at normal levels? Some hifi reviewers say we need hundreds of watts depending on the speaker load, while others say 50 watts is plenty.

Is this doubling of power output for a halving of the load, only necessary if the speaker load varies widely in value? What about speakers whose loads rise up to 30ohms somewhere?
 
Due to real-world losses and inefficiencies, most amplifiers won't double their power into a halving of the load below 8ohm, especially down at the 2ohm level. At lower power levels the supplied voltages are lower, so the currents are also smaller. That makes it a little easier for this theoretical doubling of power to occur, but unless the amplifier has a ridiculously robust design and construction it isn't likely to happen.
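Just to put rough numbers on it: the shortfall can be sketched by modeling the amplifier as an ideal voltage source behind a small lumped source impedance (supply sag, wiring and output stage losses rolled into one). The voltage and impedance values below are made-up illustrations, not specs for any real amplifier:

```python
# Sketch: why power doesn't fully double when the load halves.
# Model: ideal source V_src behind an effective source impedance Z_src.
# The values are illustrative assumptions, not real amplifier specs.

def power_into(load_ohms, v_src=30.0, z_src=0.5):
    """Power delivered to a resistive load by a sagging source."""
    v_load = v_src * load_ohms / (load_ohms + z_src)
    return v_load ** 2 / load_ohms

for load in (8.0, 4.0, 2.0):
    print(f"{load:3.0f} ohm: {power_into(load):6.1f} W")
```

With these assumed values the result comes out roughly 100W / 178W / 288W into 8, 4 and 2 ohms: close to the "100/200/300" shape of a typical spec sheet, but each halving of the load falls further short of doubling.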

You list many subjective things in your question about required power. How efficient are the speakers? How big is an average living room (I'm willing to bet the typical living room varies quite a bit by region)? How loud is 'normal'? Too many variables for a straight answer.
 
It depends on what the manufacturer's specification is revealing.
Is it a guarantee that when powered with the nominal Mains supply voltage that every model gives at least the quoted power under the operating conditions stated?

Or does it mean that for all mains supply variations, every model will give at least that power output?

Or does it mean that the quoted power is typical for the model at that set of stated operating conditions?

Now to the various output powers into different loads.

8ohms and 8r0 are different in my book.
An amplifier that is capable of giving out 20Vac into an 8r0 load resulting in 2.5Aac of output current is a 50W amplifier into 8r0.

In my opinion the specification must also specify what typical speaker impedance that this same amplifier can drive.

Now to 4r0 and 4ohms.
That same 50W amplifier could give 19Vac into 4r0, and may also be said to be suitable for driving 4ohm speakers. That works out to 90W into 4r0, which is 180% of the power into the 8r0 load. This is a good guide that this amplifier can comfortably drive a 4ohm load.

Finally to 2r0, but not including 2ohms speaker capability.
It might give 17Vac into 2r0. That is equivalent to 144W into 2r0.
This time the power output change from 4r0 to 2r0 is only 160%.
I view that as a good guide that this amplifier is not comfortable in driving a 2ohms speaker.

Note, in the above comparisons I have used Vac into the specified loads.
20Vac, 19Vac, 17Vac.
I find this tells a lot more about the capabilities of an amplifier than quoting power.

Had the numbers come out at 22Vac, 21.3Vac, 20.1Vac and 18Vac into 8r, 4r, 2r and 1r3, that would reveal a quite different story for that different amplifier.
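The arithmetic behind those comparisons is just P = V²/R. A quick check of the figures quoted above (20Vac, 19Vac, 17Vac into 8r0, 4r0, 2r0), with the ratio of each power to the power into the next-higher load:

```python
# Check of the quoted figures: P = V^2 / R, plus the ratio of each
# power to the power into the next-higher load.
figures = [(20.0, 8.0), (19.0, 4.0), (17.0, 2.0)]  # (Vac, ohms)

prev_power = None
for v, r in figures:
    p = v ** 2 / r
    ratio = f"{100 * p / prev_power:.1f}%" if prev_power else "-"
    print(f"{v:4.1f} Vac into {r:3.1f} ohm -> {p:6.2f} W ({ratio})")
    prev_power = p
```

This reproduces the 50W / 90W / 144W figures and the roughly 180% and 160% steps in the posts above.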
 
So are you saying that at lower power levels, doubling of power as the load halves is actually achieved less often than at higher power? But if the amp delivers 100W at 8ohm and 150W at 4ohm, why can't it deliver 50W at 8ohm and double that to 100W at 4ohm, when it can happily deliver 150W?

Yes, there are many variables, so let me put it this way: under what circumstances would it be justified to use a rating of 150-300 watts into 8ohm?

And my third question was: is this doubling of power output for a halving of the load only necessary if the speaker load varies widely in value? What about speakers whose impedance rises to 30ohms somewhere?
 
Are we talking about different amps or the same amp?

If an amp is rated - and let us assume honestly - at 100 watts into 8 ohms, then 50 watts into 8 ohms just means you are only pushing it half as hard. Like a 100 horsepower car engine: it can put out 100hp, but if you don't push the gas pedal down as far, you get only 50hp.

You cannot assume ratings will multiply or divide in the real world. But understand that it is the load drawing the power out; the amp doesn't push the power onto the load. In solid state amps, anyway, there is an output signal voltage. With no load, no power is produced. The lower the impedance of the load you hang onto it, the more power is produced.


If you have an amp of any rating that is putting 100 watts into a 4 ohm load, then swap the load for an 8 ohm one; if nothing else changes, your amp would only be producing 50 watts. Like your car pulling a trailer: if it takes 100hp to pull a load down the road, removing half the load from the trailer will then require less power.

And the other way, if you have an amp producing 100 watts into an 8 ohm load, and leaving everything else the same, you then put a 4 ohm load there instead, the load will TRY to draw 200 watts from the amp. Whether the amp can handle that extra demand is a separate issue.

I am ignoring complicating factors like changing semiconductor voltage drops changing with current and things like that.
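The point that the load draws the power can be shown with the 100W-into-8-ohm example above. At a fixed output voltage, the power demanded scales inversely with the load impedance:

```python
# At a fixed output voltage the load determines the power drawn:
# P = V^2 / R. An amp putting 100 W into 8 ohm is swinging
# sqrt(100 * 8) ~= 28.3 Vac; the same voltage into 4 ohm demands 200 W.
import math

v = math.sqrt(100 * 8)  # ~28.28 Vac for 100 W into 8 ohm
for load in (8.0, 4.0):
    print(f"{load:.0f} ohm demands {v ** 2 / load:.0f} W")
```

Whether the amp can actually supply that doubled demand is, as the post says, a separate issue.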
 
If you have an amp of any rating that is putting 100 watts into a 4 ohm load, then swap the load for an 8 ohm one, if nothing else changes, then your amp would only be producing 50 watts.
No.
The 8r0 load will draw more than half the power drawn by the 4r0 load.

It's that output voltage again.

If every amplifier were specified in AC Volts, it would make things much simpler. Just like line level can be stated in volts, but sometimes in dB with reference to some defined voltage.
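That "more than half" claim follows from the output voltage sagging less into the lighter load. A minimal sketch, again assuming an ideal source behind a small illustrative output impedance (not real amplifier figures):

```python
# Why the 8r0 load draws MORE than half the 4r0 power: the output
# voltage sags less into the lighter load. Model: ideal source V_src
# behind a small impedance Z (illustrative values only).

def delivered(load, v_src=20.0, z=0.5):
    v = v_src * load / (load + z)  # voltage actually at the load
    return v ** 2 / load

p4, p8 = delivered(4.0), delivered(8.0)
print(f"4 ohm: {p4:.1f} W, 8 ohm: {p8:.1f} W, half of 4-ohm: {p4 / 2:.1f} W")
```

With these assumed values the 8 ohm load draws about 44W, comfortably more than half of the roughly 79W drawn by the 4 ohm load.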
 