• WARNING: Tube/Valve amplifiers use potentially LETHAL HIGH VOLTAGES.
    Building, troubleshooting and testing of these amplifiers should only be
    performed by someone who is thoroughly familiar with
    the safety precautions around high voltages.

Signal strength for SE amp testing?

Greetings, friends. I've been testing my single-ended project amps and have run into some questions about how to use the test setup properly. I'm using a Pyle USB ASIO interface, a dummy load made of thick-film resistors wired to present 4- and 8-ohm loads, and an old Lenovo PC running REW.

I have a DSO138 oscilloscope and a Klein TRMS-capable meter to measure directly across the dummy load, and a 22:1 voltage divider inside the plug going into the audio interface to limit that voltage to <100 mV. This is calibrated in REW and verified with the 'scope.
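For reference, here's the arithmetic I'm relying on for the divider (a quick Python sketch; the 22:1 ratio and 100 mV figure are from my setup above, and the example load voltage is arbitrary):

```python
def interface_volts(v_amp_rms, divider_ratio=22.0):
    """Voltage reaching the interface for a given RMS voltage across the dummy load."""
    return v_amp_rms / divider_ratio

def max_amp_volts(divider_ratio=22.0, interface_limit_v=0.100):
    """Largest load voltage (V RMS) that keeps the interface input under the limit."""
    return divider_ratio * interface_limit_v

if __name__ == "__main__":
    print(f"2.0 V RMS at the load -> {1000 * interface_volts(2.0):.0f} mV at the interface")
    print(f"Divider headroom: up to {max_amp_volts():.1f} V RMS at the load stays under 100 mV")
```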

OK. I can run frequency sweeps, measure distortion, and get pretty graphs. But I'm realizing that without a standard, the measurements are arbitrary. Testing an EL84 triode amp at 2 W output will give a much higher distortion profile than testing a similar amp wired in pentode at 2 W, so it's not a particular output power I'm after, it's a standardized input signal that I need.

A friend who is a guitar amp tech recently did an IG post about measuring guitar amp output power, where he mentioned that the standard input signal for measuring guitar amps is 1 kHz @ 500 mV. Is there such a standard for hi-fi amps?

When using REW, the measurement level can be set in volts, dBFS, dBu, or dBV. Running a frequency sweep at 4 V, for example, causes an input signal to be applied to the DUT that results in 4 V at the output. Setting the level in dBFS arrives at a set output level in much the same way. Would dBu or dBV give a set input level instead? Am I barking up the right tree here?
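For my own notes, these are the unit conversions I've been working from (standard definitions: 0 dBV = 1 V RMS, 0 dBu ≈ 0.775 V RMS; this says nothing about whether REW treats them as input or output levels):

```python
import math

DBU_REF = math.sqrt(0.6)   # 0 dBu = 0.7746 V RMS (1 mW into 600 ohms)
DBV_REF = 1.0              # 0 dBV = 1 V RMS

def volts_to_dbv(v_rms):
    return 20 * math.log10(v_rms / DBV_REF)

def volts_to_dbu(v_rms):
    return 20 * math.log10(v_rms / DBU_REF)

if __name__ == "__main__":
    v = 0.5  # the 500 mV guitar-amp test level mentioned above
    print(f"{v} V RMS = {volts_to_dbv(v):.2f} dBV = {volts_to_dbu(v):.2f} dBu")
```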

Thanks

w
 
The standard level for testing amplifiers is usually 1 watt of output; you adjust the input voltage to produce that level. To measure output power, measure the peak-to-peak output voltage, convert to RMS, square the result, and divide by the load impedance. For an 8-ohm load, 1 watt is about 2.83 volts RMS.
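In code form, assuming a sine-wave test signal (so V_rms = V_pp / (2 * sqrt(2))); the example values are just illustrations:

```python
import math

def vpp_to_vrms(v_pp):
    """Peak-to-peak to RMS for a sine wave."""
    return v_pp / (2 * math.sqrt(2))

def output_power_w(v_pp, load_ohms):
    """Output power from the peak-to-peak voltage across the load."""
    return vpp_to_vrms(v_pp) ** 2 / load_ohms

def vrms_for_power(power_w, load_ohms):
    """RMS voltage needed across the load for a target power."""
    return math.sqrt(power_w * load_ohms)

if __name__ == "__main__":
    v1w = vrms_for_power(1, 8)
    print(f"1 W into 8 ohms: {v1w:.2f} V RMS ({v1w * 2 * math.sqrt(2):.1f} V peak-to-peak)")
    print(f"8.0 V pp into 8 ohms: {output_power_w(8.0, 8):.2f} W")
```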

For typical tube amplifiers with normal input sensitivity, 200-400 mV of input signal will produce around 1 watt of output.
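That sensitivity range works out to a voltage gain of roughly 7-14x (about 17-23 dB) to reach 1 watt into 8 ohms. A rough sketch (the 200-400 mV range is the estimate above, not a measured spec):

```python
import math

def gain_for_1w(sensitivity_v, load_ohms=8.0):
    """Voltage gain needed to reach 1 W output from a given input level."""
    v_out = math.sqrt(1.0 * load_ohms)  # about 2.83 V RMS for 8 ohms
    gain = v_out / sensitivity_v
    return gain, 20 * math.log10(gain)

if __name__ == "__main__":
    for mv in (200, 400):  # the input-sensitivity range mentioned above
        gain, db = gain_for_1w(mv / 1000)
        print(f"{mv} mV in -> 1 W out needs a gain of about {gain:.1f}x ({db:.0f} dB)")
```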

For a 2-watt amplifier, you'd probably want to measure at a lower output, maybe 1/2 watt, since at 1 watt your distortion is already going to be high... ;-)
 