Amplifier Testing Procedures: Bandwidth and THD

Hi everyone,

I was wondering what the procedures are for testing amplifiers.

Specifically, bandwidth and THD tests.

For bandwidth testing, do most manufacturers ignore THD when finding their -3 dB or -1 dB points, since distortion increases at the frequency extremes, and simply look at the roll-off of the amplifier?

Or do most hold the THD at some fixed level, rolling off the input as needed to keep it there, and obtain the -3 dB points that way?

Also, what importance do super-low THD measurements have?
 
I have seen many suggest that bandwidth is tested at 1 W of output level: 2 Vac for a 4 ohm rated amplifier and 2.83 Vac for an 8 ohm amplifier.
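For reference, those figures are just V = sqrt(P x R); a quick sanity check in plain Python:

```python
# Quick arithmetic check of the 1 W test levels quoted above: V = sqrt(P * R).
import math

def test_voltage(power_w, load_ohms):
    """RMS voltage needed to put power_w into load_ohms."""
    return math.sqrt(power_w * load_ohms)

for load in (4, 8):
    print(f"{load} ohm load: {test_voltage(1.0, load):.2f} Vrms for 1 W")
# prints: 4 ohm load: 2.00 Vrms for 1 W
#         8 ohm load: 2.83 Vrms for 1 W
```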

Those manufacturers that major on low distortion conveniently ignore any distortion above 1 kHz.
The 5 kHz, 10 kHz and 20 kHz distortion numbers may well be revealing.
A 10 kHz + 20 kHz IMD test should similarly be revealing of problems.
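If you want to look at the higher-frequency THD numbers yourself, the analysis side is straightforward once you have a digitized capture of the amplifier's output. A minimal sketch with numpy (the sample rate and the "captured" waveform below are made up just to keep the example self-contained):

```python
# Minimal sketch: pull THD out of a captured waveform with an FFT.
# The capture here is synthetic; in practice y would come from your ADC/soundcard.
import numpy as np

fs = 192_000            # assumed sample rate
f0 = 10_000             # test frequency, e.g. the 10 kHz point mentioned above
n = fs                  # 1 second capture -> 1 Hz bin spacing

t = np.arange(n) / fs
# synthetic "measurement": fundamental plus a little 2nd and 3rd harmonic
y = (np.sin(2*np.pi*f0*t)
     + 0.001*np.sin(2*np.pi*2*f0*t)
     + 0.0005*np.sin(2*np.pi*3*f0*t))

spectrum = np.abs(np.fft.rfft(y * np.hanning(n)))
fund = spectrum[f0]                                   # bin spacing is 1 Hz here
harmonics = [spectrum[k*f0] for k in range(2, 6) if k*f0 < fs/2]
thd = np.sqrt(sum(h**2 for h in harmonics)) / fund
print(f"THD at {f0/1000:.0f} kHz: {100*thd:.3f} %")
```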
 
There is another spec, the power bandwidth. Max power is taken as the point where the output reaches 1% distortion (3% for older tube amplifiers) at 1 kHz into a dummy load. Then, keeping the input signal level constant, we find the half-power (-3 dB) low and high cutoff frequencies. Distortion usually increases at these extremes, but for this spec we don't care.
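Sketched in code, with a made-up first-order amplifier model standing in for the real measurement so the script runs on its own, the procedure looks roughly like this:

```python
# Rough sketch of the power-bandwidth procedure described above.
# measure_output_vrms() here is a toy stand-in; in practice it would talk
# to your generator/analyzer.
import math

LOAD_OHMS = 8
GAIN = 20.0                        # assumed mid-band voltage gain
F_LOW, F_HIGH = 15.0, 60_000.0     # assumed corners of the toy amplifier

def measure_output_vrms(freq_hz, v_in):
    """Stand-in for a real measurement: flat gain with simple roll-offs."""
    hp = (freq_hz / F_LOW) / math.sqrt(1 + (freq_hz / F_LOW) ** 2)
    lp = 1 / math.sqrt(1 + (freq_hz / F_HIGH) ** 2)
    return GAIN * v_in * hp * lp

V_IN = 1.0                                       # input giving rated power at 1 kHz
p_ref = measure_output_vrms(1_000, V_IN) ** 2 / LOAD_OHMS
p_half = p_ref / 2                               # half-power (-3 dB) target

# keep the input constant and sweep; the power bandwidth is the span where
# output power stays at or above half the 1 kHz reference
for freq in (5, 10, 15, 20, 100, 1_000, 20_000, 60_000, 100_000):
    p = measure_output_vrms(freq, V_IN) ** 2 / LOAD_OHMS
    flag = "in band" if p >= p_half else "out"
    print(f"{freq:>7} Hz: {p:6.1f} W  ({flag})")
```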
 
I think one measures the output distortion and keeps it constant at 0.1% as the reproduced frequency is increased.
When you reach the frequency where power output is down to half the 1 kHz power, both measured at 0.1% distortion, you have found the half-power bandwidth.
The amplifier gain at the half-power bandwidth frequency will be substantially the same as the gain at 1 kHz. Thus the input signal at the half-power bandwidth will be ~71% of the full-power signal at 1 kHz; maybe 73% to 75% if the frequency response has fallen off a bit.
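That ~71% figure is just the square root of the power ratio:

```python
# Half the power into the same load means 1/sqrt(2) of the output voltage,
# and with unchanged gain the same ratio applies to the drive (input) level.
import math

v_ratio = 1 / math.sqrt(2)
print(f"voltage ratio at half power: {v_ratio:.3f}  (~71%)")
print(f"power ratio check: {v_ratio**2:.2f}  (= 0.5)")
```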
 
Trying to clarify this thread a bit: bandwidth can be measured at any constant input level for which the output of your amplifier remains linear (doesn't clip). However, you may get different results at full output voltage than at lower levels, because of things such as slew-rate limitations.

The best way to resolve this is to specify your output level, for example the level at which THD+N reaches a defined value at 1 kHz, and then measure the frequencies at which the output voltage is down by X dB with this same input level. That is the 'full power bandwidth'. As long as the measurement parameters are stated, other input/output levels or attenuation figures at given frequencies can be used. I do wish there were a standard for this, but different manufacturers quote different figures, probably to 'massage' their sales figures... Mik
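As an illustration of the post-processing step Mik describes, turning a measured sweep into "-X dB" bandwidth figures might look like the sketch below (the sweep data is invented; in practice it would come from your measurement):

```python
# Find the -X dB frequencies from a measured response, referenced to 1 kHz.
import numpy as np

freqs = np.array([10, 20, 50, 100, 1_000, 10_000, 20_000, 50_000, 100_000], float)
level_db = np.array([-4.0, -1.2, -0.3, -0.1, 0.0, -0.1, -0.4, -2.1, -6.5])  # dB re 1 kHz

def crossing(threshold_db, side):
    """Frequency where the response crosses threshold_db (log-f interpolation)."""
    idx = np.where(level_db >= threshold_db)[0]
    i = idx[0] if side == "low" else idx[-1]      # first/last point still above threshold
    j = i - 1 if side == "low" else i + 1         # adjacent point below threshold
    if j < 0 or j >= len(freqs):
        return None                               # response never drops below threshold
    logf = np.interp(threshold_db,
                     [level_db[j], level_db[i]],
                     [np.log10(freqs[j]), np.log10(freqs[i])])
    return 10 ** logf

for x in (1.0, 3.0):
    lo, hi = crossing(-x, "low"), crossing(-x, "high")
    print(f"-{x:.0f} dB bandwidth: {lo:.0f} Hz to {hi:.0f} Hz")
```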
 