We're all aware (I hope) that the supply voltage has a large effect on an amplifier's output voltage swing, which translates to output power depending on the load. I experimented with some transformers I had lying around to see their effect on output power as I increased the voltage and VA rating. The first block of numbers is from a single TDA2040 audio power amp IC; power was measured with a continuous sine wave just before clipping into a non-inductive 4 ohm load. The second block is a TDA2050 bridge amplifier into a non-inductive 8 ohm load.
The transformers are center tapped, measured across the full secondary. A full-wave bridge was used to make a split supply, with one 6,800 µF filter cap per rail. DC measurements were made across both rails. No-load measurements were taken with the amplifier at idle (no output voltage), and loaded measurements were made at the maximum sine wave output just before clipping.
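For anyone who wants to estimate the rails from the transformer spec before buying, here's a rough back-of-the-envelope sketch in Python (the ~1 V rectifier drop per rail is my own assumption; the loaded rails will sag below this figure):

```python
import math

def estimate_rail_voltage(secondary_vrms_ct, diode_drop=1.0):
    """Rough no-load DC rail estimate for a center-tapped secondary
    feeding a full-wave bridge into a split supply.

    secondary_vrms_ct: RMS voltage across the whole center-tapped secondary.
    diode_drop: assumed rectifier drop per rail (about 1 V is typical).
    """
    half_winding_vrms = secondary_vrms_ct / 2             # each rail sees half the winding
    return half_winding_vrms * math.sqrt(2) - diode_drop  # peak of the half winding minus the diode drop

# Example: the 25.2 volt center-tapped transformer mentioned above
print(round(estimate_rail_voltage(25.2), 1))  # roughly +/-16.8 V per rail, unloaded
```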
Doubling the supply voltage (very roughly) quadrupled the output power. Not surprising: doubling the supply roughly doubles the maximum output swing before clipping, and since P = (Vout)^2/Rload, power goes up with the square of the output voltage. I can't explain why the output is slightly more than quadrupled; probably something to do with the driver stage in the IC.
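To make the square-law point concrete, here's a quick check with made-up swing figures (not my measured numbers, just illustrative values):

```python
def sine_power(vrms_out, r_load):
    # Average power of a sine wave into a resistive load: P = Vrms^2 / R
    return vrms_out ** 2 / r_load

# Hypothetical example: doubling the output swing into a 4 ohm load
print(sine_power(8.0, 4.0))   # 16 W
print(sine_power(16.0, 4.0))  # 64 W -- four times the power for twice the voltage
```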
Doubling the VA rating of the 25.2 volt transformer gave 3.1 more watts from the single IC. However, the bridged amplifier loads the supply more heavily, and I got 8.7 more watts when doubling the VA rating.
I conclude that a transformer with a VA rating of at least three times the sum of the maximum (non-clipping) output power of every power amplifier on the PSU is a good rule of thumb for sizing the PSU. For example, if I were designing a two-channel amplifier with 40 W per channel, I would use a 240 VA transformer.
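As a sanity check on that rule of thumb, a minimal sketch (the 3x factor is only my estimate from these measurements, not a hard spec):

```python
def required_transformer_va(channel_powers_w, factor=3.0):
    # Rule of thumb from above: transformer VA rating >= factor x the sum of the
    # maximum (non-clipping) output powers of every channel sharing the supply.
    return factor * sum(channel_powers_w)

# Two-channel amplifier at 40 W per channel
print(required_transformer_va([40, 40]))  # 240 VA
```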
[Chart of the measured transformer voltages and output powers - externally hosted image no longer available.]
I agree. If you use a larger transformer than actually required, power-wise, the transformer's regulation is better. I have never found any difference in distortion levels except where the DC supply drops below the level required for the demanded output voltage. This is where the bootstrap capacitor came into its own, adding an extra few volts to the final drive stage, especially with the flyback generator used in older-generation frame output stages in CRT displays.
Thanks for the chart. It shows a noticeable difference in power with higher current-rated devices. (Now off to look for some 4 amp transformers.)