Compare power @ 1kHz vs. wideband?

Do you have an example of the exact terminology used in each case?

The industry standard should be to state the continuous RMS power output of a single sine wave (e.g. a 1kHz sine wave). Going one step further, the manufacturer may also state the total harmonic distortion of the sine wave during the measurement, e.g. 80Wrms @ 0.1% THD or 150Wrms @ 10% THD. The more distortion (clipping) allowed, the higher the power output achieved. Therefore, if one manufacturer only states output at gross distortion levels (e.g. 10% THD), you won't be able to make a direct comparison with another manufacturer who only states output at low distortion (<1% THD).
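As a sanity check on what such a rating implies, here's a minimal Python sketch of the arithmetic behind a continuous sine-wave rating (the 80W/8-ohm figures are just the example values from above, and the load is assumed purely resistive):

```python
import math

# Continuous (RMS) power of a sine wave into a resistive load:
#   V_rms = V_peak / sqrt(2),  P = V_rms**2 / R
def sine_rms_power(v_peak, load_ohms):
    v_rms = v_peak / math.sqrt(2)
    return v_rms ** 2 / load_ohms

# Voltage swing implied by an 80 Wrms rating into 8 ohms:
v_rms = math.sqrt(80.0 * 8.0)        # ~25.3 V RMS
v_peak = v_rms * math.sqrt(2)        # ~35.8 V peak
print(f"{v_peak:.1f} V peak -> {sine_rms_power(v_peak, 8.0):.0f} W into 8 ohms")
```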

Any other test method of measuring power, such as using multiple tones, wideband noise, bursts, etc., would be done to be intentionally deceptive, by producing a power output figure significantly higher than the continuous RMS sine wave method yields.

This is in contrast to loudspeaker drivers, which are typically rated for power using wideband noise: they do not clip suddenly as amplifiers do, and their power handling may vary grossly with frequency (it diminishes at low frequencies and at mechanical resonances, where mechanical limits are reached), whereas a solid state amplifier's typically does not.
 
They're probably the same measurement in reality. The first spec doesn't specify the test frequency. The second does.

The fact that the test frequency is not specified should not be interpreted as "this applies across the audio band". Maybe they test at 20 kHz. If so, it would say nothing about whether the power supply in the amp sags on the peaks at 20 Hz.

A THD+N vs. frequency graph would tell you more.

Tom
 
I thought that most of the time OEMs specify "20 to 20k Hz," but upon looking closely before posting here, I did not quickly find that metric.

Let's suppose one OEM specifies 20-20k and the other specifies "@ 1k Hz." How would you compare those two if both were 75W?
 
Those are two independent performance metrics. A manufacturer may provide one, both or neither of those metrics in their specification sheet.

The former (20Hz-20kHz) specifies the frequency response of the amplifier - the range of frequencies over which the amplification is constant (or near constant). At low frequencies the amplifier's response may roll off due to an AC-coupling capacitor that prevents DC components from the source being amplified. At high frequencies the response rolls off, often deliberately, to prevent the amplifier becoming unstable, since the circuit uses feedback. These roll-offs are typically outside the audio band (below 20Hz, above 20kHz) and are therefore inaudible.
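To make the low-frequency roll-off concrete, a small Python sketch (the 10 kohm input impedance and 1 uF coupling capacitor here are hypothetical values, not taken from any spec above):

```python
import math

# -3 dB corner of a first-order RC high-pass (AC-coupling) network:
#   f_c = 1 / (2 * pi * R * C)
def highpass_corner_hz(r_ohms, c_farads):
    return 1.0 / (2.0 * math.pi * r_ohms * c_farads)

# Hypothetical input stage: 1 uF coupling cap into a 10 kohm input impedance
print(f"{highpass_corner_hz(10e3, 1e-6):.1f} Hz")  # ~15.9 Hz, just below the audio band
```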

The latter (1kHz) specifies the test frequency at which they performed a power and/or distortion measurement.

It's uncommon for a solid state amplifier not to provide full output power (or close to it) over its entire frequency response.

If a spec sheet says:
Frequency Response: 20Hz-20kHz
Power Output: 80Wrms @ 1kHz, 0.1% THD

You'd expect the amplifier to output 80Wrms (or close to it) at any frequency between 20Hz and 20kHz. At 1kHz the distortion will be 0.1% THD when outputting 80Wrms, and probably lower at lower output levels. At, say, 50Hz or 10kHz we don't know whether the distortion differs from the stated 0.1% @ 1kHz, since they didn't provide extra information about distortion at lower and higher frequencies.
 
Standard audio amps provide basically the same maximum power (limited essentially by voltage clipping) across the audio band (although some may not be able to handle full power for very long at 20kHz due to the Zobel network overheating).


The power output is defined by the load impedance the amp can drive (normally assumed to be either 8, 4 or 2 ohms resistive) and the voltage swing at clipping - frequency isn't usually relevant.
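A minimal Python sketch of that relationship (the +/- 20 V clipping point is a hypothetical figure, not from any spec in this thread):

```python
# Maximum continuous sine power before clipping, for an amp that swings
# +/- v_clip_peak volts into a resistive load:  P = V_peak**2 / (2 * R)
def max_sine_power(v_clip_peak, load_ohms):
    return v_clip_peak ** 2 / (2.0 * load_ohms)

# Hypothetical amp clipping at +/- 20 V peak:
for r in (8, 4, 2):
    print(f"{r} ohms: {max_sine_power(20.0, r):.0f} W")
# 8 ohms: 25 W, 4 ohms: 50 W, 2 ohms: 100 W -- power doubles each time the
# load halves, provided the power supply can deliver the extra current.
```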
 
Most OEMs state wideband power. One or more OEMs from Europe specify power @ 1kHz only.

For SS amps, is there a ratio that accurately compares the two specs?
No.

But there should not be a great difference: probably 10 to 20%, which is inaudible.
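For scale, a 10 to 20% power difference works out to well under 1 dB (a quick Python check):

```python
import math

# A 20% power difference expressed as a level change in dB:
print(f"{10 * math.log10(1.2):.2f} dB")  # ~0.79 dB, below the ~1 dB
# change commonly cited as the threshold of audibility for level differences
```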

"25 Watts rms @ 8 ohms, 50 Watts rms @ 4 ohms"

vs.

Power Output (Stereo) : 75 Watts per channel at 1KHz into 8 Ohms
The second one is noticeably louder than the first one.
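How much louder can be put in dB terms; a minimal Python sketch (both ratings assumed into the same 8-ohm load):

```python
import math

# Level difference between two power ratings into the same load:
#   dB = 10 * log10(P2 / P1)
def power_ratio_db(p1_watts, p2_watts):
    return 10.0 * math.log10(p2_watts / p1_watts)

print(f"{power_ratio_db(25, 75):+.1f} dB")  # +4.8 dB: clearly audible, but
# much less dramatic than "3x the power" might suggest
```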

Let's suppose one OEM specifies 20-20k and the other specifies "@ 1k Hz." How would you compare those two if both were 75W?
The first one may be *a little* louder than the second one.