Procedure for checking Amplifier Damping Factor

This old topic is closed. If you want to reopen this topic, contact a moderator using the "Report Post" button.
First off, thanks to all of you who responded to my previous post (see: info on amplifier damping factor). I can't say I'm totally clear on the subject, but I believe what I actually need to do is design and build the amp, and then measure the DF.
A little more explanation, perhaps. This amp will be part of a bass guitar combo (speaker / amplifier). I'm now thinking the best way to proceed is to build the amp and then adjust the enclosure to take the amp's DF into account. Any suggestions greatly appreciated, as well as the "how to" of testing an amp for its DF. I do have a lab full of test gear, so I should have anything required to test it, I hope! Thanks again for all the help!
Damping factor

Hi Dayveshome,

I'm sure you know that the DF is calculated from the amp's output impedance, so the task reduces to measuring that impedance. You can measure the output resistance by loading the amp with a standard 8 ohms and measuring the output voltage, then placing a second resistor (say another 8 ohms) in parallel with the first and measuring the output voltage again. You now have two values: the additional output current drawn by the extra load (which is of course Vout/8 ohms) and the drop in Vout caused by that extra load current. That drop is caused by the amp's output resistance, so by Ohm's law, output resistance = Vout drop / extra load current.
You can get much fancier by measuring the (complex) output impedance rather than just the resistance, but you get the point.
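The two-load measurement above reduces to a little arithmetic. Here is a minimal sketch in Python; the function names and the 10.00 V / 9.90 V readings are illustrative, not values from this thread:

```python
def output_resistance(v1, v2, r1=8.0, r2=8.0):
    """Estimate the amp's output resistance from two loaded readings.

    v1: output voltage with only r1 (ohms) as the load
    v2: output voltage with r2 placed in parallel with r1
    """
    r_both = r1 * r2 / (r1 + r2)      # parallel combination of the two loads
    delta_i = v2 / r_both - v1 / r1   # extra load current drawn
    delta_v = v1 - v2                 # sag caused by the output resistance
    return delta_v / delta_i          # Ohm's law: R_out = dV / dI

def damping_factor(r_out, nominal_load=8.0):
    """DF is conventionally the nominal load impedance divided by R_out."""
    return nominal_load / r_out

r_out = output_resistance(10.00, 9.90)  # hypothetical readings
print(round(r_out, 4), round(damping_factor(r_out), 1))  # prints: 0.0816 98.0
```

Note that only the *change* in voltage and current matters, so absolute meter calibration is less critical than resolution: with a DF near 100, the voltage drop you are resolving is only about 1% of Vout.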

Cheers, Jan Didden
Hi dayveshome

I generally agree with the procedure suggested by janneman; it is probably the simplest way of measuring the DF of a single amp.
But do not take the first measurement with the output already just below clipping, because you might then include the voltage sag of the PSU when you connect the second resistor (that sag will also make the amp clip at a lower output voltage into the lower load impedance). If the PSU were ideal (or the load on the PSU were very constant, as it is with a true class A amplifier!), this wouldn't matter at all, but with a real-life PSU I would make the measurement at low power only, in order to be sure of measuring exclusively the output resistance of the power amp.

In the case of two amps (wired accordingly, i.e. with their output grounds connected!) or a stereo amp, there is an even simpler way to measure DF: put a load resistor between the outputs of the amplifiers and drive only one of the channels. The ratio of the voltage across the load to the voltage at the output of the UNDRIVEN amplifier is the damping factor. But take care to do this at low power only, since it dissipates a lot of power in the output stage of the amplifier under test (i.e. the undriven one).
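The two-amp trick works because the undriven amp's output resistance and the series load resistor form a voltage divider. A small sketch of that arithmetic, assuming the load resistor equals the nominal impedance (the function name and readings are hypothetical):

```python
def df_from_bridge(v_load, v_undriven, r_load=8.0):
    """Two-amp (or stereo) method: r_load sits between the two outputs
    and one channel is driven. The undriven amp's output resistance R_out
    and r_load form a voltage divider, so

        v_load / v_undriven = r_load / R_out

    which is the DF itself when r_load equals the nominal load.
    Returns (R_out, DF referred to r_load).
    """
    r_out = r_load * v_undriven / v_load
    return r_out, r_load / r_out

r_out, df = df_from_bridge(v_load=8.0, v_undriven=0.08)  # hypothetical readings
print(round(r_out, 3), round(df, 1))  # prints: 0.08 100.0
```

The appeal of this method is that both voltages can be read with an ordinary AC voltmeter and no load switching is needed; the drawback, as noted above, is the power dissipated in the undriven output stage.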

