Hello,
I was hoping to get some advice on THD measurement conditions: at which power levels, at what frequencies and input amplitudes, at what temperature, for what duration, into what load, etc. I know 1kHz seems to be a standard frequency, and 4R/8R loads are normally used.
Is there a standard somewhere that specifies all of these conditions?
Just to illustrate one problem I'm having: the distortion of an amplifier I am trying to measure keeps getting lower with increasing temperature. So at what temperature should I take the THD measurement?
If the amplifier has warmed up for 10 minutes, that doesn't really mean much, because as soon as a large-amplitude sine wave is applied for testing, the output stage heats up rapidly and the measurements improve as time passes.
Thanks!
The FTC used to specify thermal pre-conditioning of an amplifier before any measurements were made. Stereophile always followed that principle, at least when John Atkinson was making the measurements: https://www.stereophile.com/content/ftc-proposes-eliminating-its-amplifier-rule
(mandatory) Standards in audio?
Lol
It's more wild west than the internet is.
Any comments on this?
Test Methods of Measurement for Audio Amplifiers (ANSI/CTA-490-B)
https://shop.cta.tech/products/test-methods-of-measurement-for-audio-amplifiers
Doesn't a 1kohm input impedance degrade SNR?
How about the 1W output reference level? Audio Science Review seems to use 5W.
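For what it's worth, the reference level is usually specified as a power into a stated load, and the generator drive you need follows from V = sqrt(P x R). A quick sketch (an 8 ohm resistive load is just the common example):

```python
import math

def output_voltage_rms(power_w: float, load_ohms: float) -> float:
    """RMS output voltage needed to deliver a given power into a resistive load."""
    return math.sqrt(power_w * load_ohms)

# Common reference levels into an 8 ohm resistive load:
print(output_voltage_rms(1.0, 8.0))  # 1 W -> ~2.83 Vrms
print(output_voltage_rms(5.0, 8.0))  # 5 W -> ~6.32 Vrms
```

So the difference between a 1W and a 5W reference is only about 7 dB of drive level, but it can sit on very different parts of the amplifier's crossover region.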
One hour warm-up, and 1W output into 8 ohms at 1kHz for starters.
Most testing does use 20Hz - 20kHz up to rated power.
If the distortion is decreasing with warm-up time, it needs more warm-up.
Most sources will have 100R to 1000R output impedance, but you can test with very low source Z to get the most favorable S/N.
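As an aside on what the analyzer actually computes: THD is the RMS sum of the harmonics relative to the fundamental. A minimal FFT-based sketch (window choice, peak search width, and the 9-harmonic limit are just illustrative assumptions, not any standard's requirement):

```python
import numpy as np

def thd_percent(signal, fs, f0, n_harmonics=9):
    """Estimate THD (%) as the RMS sum of harmonics 2..N+1 over the
    fundamental amplitude, from an FFT of a Hann-windowed record."""
    n = len(signal)
    spec = np.abs(np.fft.rfft(signal * np.hanning(n)))

    def bin_amp(freq):
        k = int(round(freq * n / fs))
        return spec[max(k - 2, 0):k + 3].max()  # peak search around the bin

    fund = bin_amp(f0)
    harmonics = [bin_amp(f0 * h) for h in range(2, 2 + n_harmonics)]
    return 100.0 * np.sqrt(sum(a * a for a in harmonics)) / fund

# Example: 1 kHz fundamental with 1% second harmonic, sampled at 192 kHz
fs, f0, n = 192000.0, 1000.0, 1 << 16
t = np.arange(n) / fs
x = np.sin(2 * np.pi * f0 * t) + 0.01 * np.sin(2 * np.pi * 2 * f0 * t)
print(thd_percent(x, fs, f0))  # ~1.0
```

A real analyzer adds notch filtering, averaging, and (for THD+N) the noise floor, but the ratio itself is this simple.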
The old (1974) FTC standard said to let the amp warm up at 1/3 the rated output power into 8 Ω, then run it for one minute (or was it five minutes?) at the full rated output power. Then test the amp.
The newer (2003ish if I recall correctly, which I might not) FTC standard used 1/8 of max power instead of 1/3.
Tom
IMO, distortion is best left to simulation. The graphs in my Class AB Biasing article show why a real-world measurement is not likely to hit a maximum.
Ed
Err, the customer doesn't buy the simulation, he buys the physical product. The simulation of that can never be perfect.
Often distortion is not modelled in simulation. The model for the LM3886 currently on TI's website, for example, does not have distortion included. If it is included in the device models, it's often a pretty crappy model. I wouldn't trust distortion numbers from a simulation beyond the "is it fundamentally broken" stage. I definitely would not trust the distortion figures reported by the simulator near clipping.
Tom
An amplifier can be designed in a manner in which the distortion is dominated by the class AB output stage. I did that for my amplifier. Then, the Ebers-Moll equation applies. The exponential relationship between current and voltage is the main source of non-linearity, and Ebers-Moll models that with extreme accuracy. Everything else is a second-order effect.
Fireanimal measured the Wolverine's THD and found near-perfect agreement with simulation.
ETA: EF3 distortion is not much affected by Hfe.
Ed
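To make the Ebers-Moll point concrete, here is a tiny sketch of the forward-active relation Ic = Is*exp(Vbe/Vt). The Is value is a hypothetical small-signal figure, not data for any particular device; the exponential law itself is what matters:

```python
import math

# Simplified Ebers-Moll, forward-active region: Ic = Is * exp(Vbe / Vt).
IS = 1e-14      # saturation current, A (illustrative assumption)
VT = 0.02585    # thermal voltage kT/q at ~300 K, V

def ic(vbe: float) -> float:
    """Collector current for a given base-emitter voltage."""
    return IS * math.exp(vbe / VT)

# Each additional Vt*ln(2) of Vbe (about 18 mV) doubles Ic.
# This curvature is the dominant nonlinearity in a class AB output stage.
i1 = ic(0.650)
i2 = ic(0.650 + VT * math.log(2))
print(i2 / i1)  # -> 2.0 (current doubles)
```

Because the law is exponential, the incremental transconductance gm = Ic/Vt tracks the instantaneous current, which is exactly the bias-dependent gain wobble through the crossover region.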
Right, so you put extra constraints on your design just to make it easier for the simulator.
The goal is to make the amplifier insensitive to device parameters. Being easy to analyze is a consequence.
Ed
Do you listen to your amp? Does it sound good to you? Are you looking to see if the sound matches some number that is important? If the numbers does not match your pleasure level, then are you going to trash it?
See the lower two units in my photo. 😉
This thread is now far enough off topic that I should re-iterate my point: that distortion varies widely with temperature is entirely expected, and computers can cover the design space much more thoroughly than a human making measurements.
Good night.
Ed
It sounds like a test that has very little to do with normal use. When a typical user wants to listen to music at a normal volume, he/she/it/they typically won't first let the amplifier play at an awfully loud volume for an hour.
I understand that the FTC wants to check whether the amplifier has adequate cooling, but why combine it with a distortion test like this? Or why not do two distortion tests, one cold and one hot?
Certainly the ANSI/CTA-490-B spec above specifies 1/8th of the rated power, not 1/3, for the preconditioning step.
1W for the distortion measurement itself.
I believe the 1/3 condition originated from the fact that it is close to worst-case for dissipation/temp.
Survive that for an hour and you're good to go.
But it may or may not be the best or worst distortion point, and is not meant for that.
Two different tests:
1 - do I survive one hour worst-case;
2 - What's the THD?
Jan
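The "1/3 is close to worst-case" claim can be checked with the idealized class B result: for a sine into a resistive load, device dissipation peaks when the output swing is 2/pi of the rail, i.e. at about 40% of rated power. A quick numerical sketch (rail and load values are just illustrative):

```python
import numpy as np

# Idealized class B, sine drive, resistive load; bias current and
# driver losses are ignored.
VS = 35.0   # supply rail, V (hypothetical)
RL = 8.0    # load, ohms

vp = np.linspace(1e-3, VS, 10000)        # sweep of peak output voltage
p_out = vp**2 / (2 * RL)                 # power delivered to the load
p_supply = 2 * VS * vp / (np.pi * RL)    # average draw from the rails
p_diss = p_supply - p_out                # heat in the output devices

p_rated = VS**2 / (2 * RL)
worst = vp[np.argmax(p_diss)]
print(worst / VS)                        # ~0.637, i.e. 2/pi of full swing
print(worst**2 / (2 * RL) / p_rated)     # ~0.405, i.e. 4/pi^2 of rated power
```

So 1/3 of rated power is a bit below the analytical worst case of 4/pi^2 (about 40%), but close enough that surviving an hour there is a meaningful thermal test.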
The distortion does vary with temperature and can be expected to change and slowly settle after some time after switch-on and use.
So which measurement is the 'correct' one?
You can say: 'measure THD 1 minute after switch on'. Or whatever you agree on.
Since the period you listen to your amp is generally a lot longer than 'just after switch on', it makes sense to measure distortion after 'some time of use'.
One hour at 1/3 power is as good as any other condition.
The exact time and power is not important, but it would be nice if we all use the same time and power. It's an advanced concept called 'a standard' ;-).
There's often a lot of philosophy in audio 😎
Jan