Total Harmonic Distortion (THD) Measurement Conditions? Is There a Standard Available?

Hello,
I was hoping to get some advice on THD measurement conditions: at what power levels, frequencies, and input amplitudes; at what temperature; for what duration; into what load, etc. I know 1 kHz seems to be a standard frequency, and 4R/8R loads are normally used.

Is there a standard somewhere that specifies all of these conditions?

Just to illustrate one problem I'm having. The distortion of an amplifier I am attempting to measure just gets lower and lower with increasing temperature. So at what temperature should I take the THD measurement?

If the amplifier has warmed up for 10 minutes, that doesn't really mean much, because as soon as a large-amplitude sine wave is applied for testing, the output stage heats up rapidly and the measurements improve as time passes.
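For reference (and independent of any standard), THD itself can be estimated from an FFT of the captured output: take the RMS sum of the harmonic amplitudes and divide by the fundamental. A minimal Python sketch on a synthetic signal — the function name and parameters are illustrative, not from any test suite:

```python
import numpy as np

def thd(signal, fs, f0, n_harmonics=10):
    """Estimate THD as the ratio of harmonic RMS to fundamental amplitude."""
    window = np.hanning(len(signal))
    spectrum = np.abs(np.fft.rfft(signal * window))
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)

    def peak(f):
        # Take the largest bin near f to tolerate slight frequency error
        idx = int(np.argmin(np.abs(freqs - f)))
        return spectrum[max(idx - 2, 0):idx + 3].max()

    fundamental = peak(f0)
    harmonics = np.sqrt(sum(peak(k * f0) ** 2
                            for k in range(2, n_harmonics + 2)))
    return harmonics / fundamental

fs = 48000
t = np.arange(fs) / fs                                   # one second of samples
x = np.sin(2 * np.pi * 1000 * t) \
    + 0.01 * np.sin(2 * np.pi * 3000 * t)                # 1% third harmonic
print(f"THD = {100 * thd(x, fs, 1000):.2f}%")            # ≈ 1.00%
```

A real measurement would of course use captured ADC samples rather than a synthesized signal, and a notch-based analyzer will have a lower noise floor than a plain FFT.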

Thanks!
 
Moderator
Joined 2011
One hour warm-up, and 1W output into 8 ohms at 1kHz for starters.
Most testing does use 20Hz - 20kHz up to rated power.

If the distortion is decreasing with warm-up time, it needs more warm-up.
Most sources will have 100R to 1000R output impedance, but you can test with very low source z to get the most favorable S/N.
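As a quick sanity check on drive levels for the conditions above: power into a resistive load is V²rms/R, so 1 W into 8 Ω needs about 2.83 V RMS at the output. A trivial helper (names are illustrative):

```python
import math

def drive_voltage(power_w, load_ohm):
    """RMS output voltage needed to deliver power_w into load_ohm (resistive)."""
    return math.sqrt(power_w * load_ohm)

print(f"{drive_voltage(1, 8):.2f} V RMS")    # 1 W into 8 ohms -> 2.83 V RMS
print(f"{drive_voltage(50, 8):.2f} V RMS")   # e.g. 50 W rated power into 8 ohms
```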
 
Neurochrome.com
Joined 2009
Paid Member
The old (1974) FTC standard said to let the amp warm up at 1/3 the rated output power into 8 Ω, then run it for one minute (or was it five minutes?) at the full rated output power. Then test the amp.

The newer (2003ish if I recall correctly, which I might not) FTC standard used 1/8 of max power instead of 1/3.

Tom
 
Neurochrome.com
Joined 2009
Paid Member
Often distortion is not modelled in simulation. The model for the LM3886 currently on TI's website, for example, does not have distortion included. If it is included in the device models, it's often a pretty crappy model. I wouldn't trust distortion numbers from a simulation beyond the "is it fundamentally broken" stage. I definitely would not trust the distortion figures reported by the simulator near clipping.

Tom
 
An amplifier can be designed so that the distortion is dominated by the class AB output stage. I did that for my amplifier. In that case the Ebers-Moll equation applies: the exponential relationship between current and voltage is the main source of non-linearity, and Ebers-Moll models it with extreme accuracy. Everything else is a second-order effect.
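The exponential law Ed refers to is the simplified Ebers-Moll (Shockley) relation Ic = Is·exp(Vbe/VT). A tiny numeric illustration — the saturation current and temperature here are assumed values, not from any specific device:

```python
import math

VT = 0.02585    # thermal voltage kT/q at ~300 K, in volts
IS = 1e-14      # assumed saturation current, in amps

def ic(vbe):
    """Collector current from the simplified Ebers-Moll / Shockley relation."""
    return IS * math.exp(vbe / VT)

# The exponential law means every VT*ln(2) ~ 18 mV increase in Vbe doubles Ic:
print(ic(0.65) / ic(0.65 - VT * math.log(2)))   # -> 2.0
```

It is this steep exponential, moving through the crossover region of a class AB pair, that dominates the residual Ed describes.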

Fireanimal measured the Wolverine's THD and found near-perfect agreement with simulation.

ETA: EF3 distortion is not much affected by Hfe.
Ed
 
One hour warm-up, and 1W output into 8 ohms at 1kHz for starters.
Most testing does use 20Hz - 20kHz up to rated power.

If the distortion is decreasing with warm-up time, it needs more warm-up.
Most sources will have 100R to 1000R output impedance, but you can test with very low source z to get the most favorable S/N.

The old (1974) FTC standard said to let the amp warm up at 1/3 the rated output power into 8 Ω, then run it for one minute (or was it five minutes?) at the full rated output power. Then test the amp.

The newer (2003ish if I recall correctly, which I might not) FTC standard used 1/8 of max power instead of 1/3.

Tom

It sounds like a test that has very little to do with normal use. A typical user who wants to listen to music at a normal volume won't first let the amplifier play at an awfully loud volume for an hour.

I understand that the FTC wants to check whether the amplifier has adequate cooling, but why combine it with a distortion test like this? Or why not do two distortion tests, one cold and one hot?
 
It sounds like a test that has very little to do with normal use. A typical user who wants to listen to music at a normal volume won't first let the amplifier play at an awfully loud volume for an hour.
Certainly the ANSI/CTA-490-B spec specifies 1/8 of the rated power, not 1/3, for the preconditioning step.

1W for the distortion measurement itself.
 
AX tech editor
Joined 2002
Paid Member
I believe the 1/3 condition originated from the fact that it is close to worst-case for dissipation/temp.
Survive that for an hour and you're good to go.
But it may or may not be the worst-case distortion point; it isn't meant to test distortion.
Two different tests:
1 - do I survive one hour worst-case;
2 - What's the THD?
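Jan's point about 1/3 power being near worst case can be checked numerically. For an idealized class B output stage driving a sine, dissipation peaks when the output is 4/π² ≈ 40.5% of full power, which is close to 1/3. A quick sketch (idealized: no bias current, no supply sag, rails sized exactly for full power):

```python
import math

def class_b_dissipation_frac(p_out_frac):
    """Idealized class B output-stage dissipation, as a fraction of max
    output power, when delivering p_out_frac of max power (sine drive).
    Pdiss/Pmax = (4/pi)*a - a^2, where a = Vpk/Vcc = sqrt(Pout/Pmax)."""
    a = math.sqrt(p_out_frac)
    return (4 / math.pi) * a - a * a

for frac in (1 / 8, 1 / 3, 4 / math.pi ** 2, 1.0):
    print(f"Pout = {frac:5.3f} Pmax -> Pdiss = "
          f"{class_b_dissipation_frac(frac):.3f} Pmax")
```

Running this shows dissipation at 1/3 power (~0.40 Pmax) is essentially at the peak of the curve, well above the 1/8-power point (~0.33 Pmax) and above full power (~0.27 Pmax), which supports the "survive the worst case for an hour" reading of the old FTC condition.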

Jan
 
AX tech editor
Joined 2002
Paid Member
The distortion does vary with temperature and can be expected to change and slowly settle after some time after switch-on and use.
So which measurement is the 'correct' one?
You can say: 'measure THD 1 minute after switch on'. Or whatever you agree on.
Since the period you listen to your amp is generally a lot longer than 'just after switch on', it makes sense to measure distortion after 'some time of use'.
One hour at 1/3 power is as good as any other condition.
The exact time and power is not important, but it would be nice if we all use the same time and power. It's an advanced concept called 'a standard' ;-).

There's often a lot of philosophy in audio :cool:

Jan
 