It seems as though all amplifiers perform best when they are driven near the maximum of their power range. An amp rated at 200 W into 8 ohms, for example, will likely produce its lowest-distortion signal close to that 200 W point, beyond which the distortion spikes as it clips.
Is it possible to establish a kind of stasis where an amplifier constantly operates near its maximum-output, lowest-distortion point regardless of the input signal and volume? The approach I envision is that any excess power that would push the amp into clipping is, in a sense, "drained" away. This is obviously inefficient and a waste of electricity. Given that, perhaps battery operation would be an interesting solution, where the excess power is used regeneratively to charge the battery that powers the amplifier.
The premise that distortion rises at low signal levels is not correct, except for poor designs.
Also, simple measurements include noise, which at low amplifier output levels becomes larger relative to the distortion and can raise the apparent measured distortion.
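A rough numerical sketch of that measurement effect (the noise floor and "true" distortion figures here are assumed purely for illustration, not taken from any particular amplifier):

```python
import math

# Assumed figures for illustration only: a fixed output noise floor and a
# constant "true" harmonic distortion, measured together as THD+N.
NOISE_V_RMS = 100e-6   # 100 uV rms output noise (assumption)
TRUE_THD = 1e-4        # 0.01% true distortion (assumption)
LOAD_OHMS = 8.0

for power_w in (200, 20, 2, 0.2, 0.02, 0.002):
    v_out = math.sqrt(power_w * LOAD_OHMS)     # rms output voltage
    v_dist = TRUE_THD * v_out                  # distortion residual scales with signal
    v_resid = math.hypot(v_dist, NOISE_V_RMS)  # measured residual = distortion + noise
    print(f"{power_w:7.3f} W -> apparent THD+N = {100 * v_resid / v_out:.4f} %")
```

The assumed true distortion is flat at every level; only the apparent THD+N climbs at low output, because the fixed noise floor starts to dominate the residual.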
To have acceptable dynamic range, the average signal output level should be at least 20 dB below clipping.
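As a rough illustration of what that headroom means in power terms: 20 dB is a factor of 10^(20/10) = 100 in power, so an amplifier that clips at 200 W would be averaging only about 2 W, with the rest held in reserve for peaks.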