Why does an amp distort even when fed a voltage well below its range?

Hi
I know of two amps: the MAX9744, a Class D 20W amp, and the TPA2016, a Class D 3W amp.

I had a conversation with a guy from woofer.com. He told me that the TPA2016 is not a good amp, as it has a high THD+N of 10%.
That demotivated me, because I thought it would be a good amp to start with, since it also has AGC. But I don't know why he said that, and I have not tested the amp myself.

But when I ordered a MAX9744 from Adafruit, it exhibited distortion at high volumes even though the supply voltage was well below the maximum of 12V. I gave it 8V with a speaker rated at 25W, 4 ohm, and there was still distortion.

I have now had that MAX9744 modified to include a back-to-back diode (BAV99) on both input channels. I have the PCB files but have not tested it, so I am a little skeptical about whether there will still be distortion.

That's why hearing this about the TPA2016 was a bit of a setback, because it has AGC.

So why did that happen with the MAX9744?
And can you suggest some good Class D amps (20W) apart from the Tripath ones?

 
An amp, any amp, distorts when you try to drive the output close to the supply voltage (i.e. into severe clipping). It does not matter what the actual output capability of the amp is. In the example above, if you had given the MAX9744 12V instead of 8V and adjusted the input so that the output power stayed the same, it would not have distorted in the slightest.

The guy from woofer.com is wrong. Amp chips are routinely rated at 10% THD+N because that represents extreme clipping; it tells the designer what the chip can deliver in a stress test. Any even half-decent Class D amp will be well under 0.2% THD+N at 80% of its maximum RMS output.
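
If it helps to see the arithmetic, here is a rough back-of-the-envelope sketch of the clipping limit, assuming an ideal bridge-tied-load Class D output whose peak swing equals the supply voltage; real chips lose a volt or so, so these are illustrative numbers, not datasheet figures:

```
# Rough clipping estimate for a bridge-tied-load (BTL) Class D output.
# Assumes an ideal output stage (peak swing ~= supply voltage, no losses);
# real chips such as the MAX9744 lose a volt or so across the output FETs.

def max_clean_power(v_supply, r_load):
    # Max undistorted sine power: peak swing ~ v_supply, so P = Vpeak^2 / (2 * R)
    return v_supply ** 2 / (2 * r_load)

for v in (8.0, 12.0):
    p = max_clean_power(v, 4.0)
    print(f"{v:4.1f} V supply into 4 ohm: ~{p:4.1f} W before clipping")

# Output:
#  8.0 V supply into 4 ohm: ~ 8.0 W before clipping
# 12.0 V supply into 4 ohm: ~18.0 W before clipping
```

So at 8V into 4 ohm, anything much past roughly 8W per channel is already clipping, even though the chip is "rated" for 20W at 12V.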
 
In the example above, if you had given the MAX9744 12V instead of 8V and adjusted the input so that the output power stayed the same, it would not have distorted in the slightest.

Adjusting the input means that turning it up to full volume would still distort, even at 12V?
This is really confusing!

Please help me clear up this doubt: what can I do to prevent distortion even at high volumes?
I once had a discussion with a Maxim Integrated guy, and he advised me to put a back-to-back diode (BAV99) on both input channels (between INL and AGND, and between INR and AGND). He said it would help. Just to be sure, I asked Adafruit, and they also said that would do it.
So are they all just playing me?

Please forgive my naivety; I am trying to become a pro, but sometimes all this puts me back where I started.

Thanks.
 
A BAV99 just acts as a limiter, clipping the signal before it reaches the amp. It's technically possible and it will work; it's just not a good solution.

Reducing the input so that you don't drive the amp into clipping is the only real solution. It's actually very easy to do: just reduce the feedback resistor value, increase the input resistor value, or use the integrated digital volume control.
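
As a rough illustration of how little arithmetic is involved, here is a sketch assuming a gain set by Rf/Rin (common on analog-input amps) and a nominal 1V-peak source; the values are made up for the example, not taken from the MAX9744 datasheet:

```
import math

# Rough sizing of the input attenuation so the amp stays out of clipping.
# Assumes an ideal BTL output (peak swing ~ supply voltage) and a nominal
# 1 V-peak line-level source; values are illustrative, not MAX9744
# datasheet figures.

v_supply  = 8.0   # V, the supply actually used
v_in_peak = 1.0   # V, assumed peak level of the source signal

v_out_peak_max = v_supply               # ideal BTL swing limit
max_gain = v_out_peak_max / v_in_peak   # V/V before clipping sets in
print(f"Keep the voltage gain at or below ~{max_gain:.1f} V/V "
      f"({20 * math.log10(max_gain):.1f} dB)")
# With an Rf/Rin style gain stage, choose Rf/Rin (or the digital volume
# setting) no higher than this figure.
```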
 

I got an adaptor rated for 12V 500mA, and even then I found distortion at high volumes.
 
500mA is not enough. That's why.

Assuming you have 8 ohm speakers, then:

(12 V)² / 8 Ω = 18 W peak per channel, and 18 W / 12 V = 1.5 A peak current draw per channel.

So you need a power supply capable of about 3A (both channels) to avoid clipping.
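
For a quick check, the same estimate as a few lines of Python; it ignores efficiency losses and simply takes the worst-case peak draw as supply voltage over load impedance, summed over the channels:

```
# Quick supply-current estimate: worst-case peak draw is the full supply
# voltage across each load, summed over the channels. Efficiency losses
# are ignored, so treat the result as a rough lower bound on the rating.

def supply_current(v_supply, r_load, channels=2):
    i_peak_per_channel = v_supply / r_load   # A, peak current into one speaker
    return i_peak_per_channel * channels

print(supply_current(12.0, 8.0))   # ~3.0 A, two 8-ohm channels at 12 V
print(supply_current(12.0, 4.0))   # ~6.0 A, two 4-ohm channels at 12 V
print(supply_current(8.0, 4.0))    # ~4.0 A, two 4-ohm channels at 8 V
```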

That calculation is handy for a quick assessment. Thanks.
But I guess it is not the only thing. Earlier I had an 8V 500mA adapter. According to what you mentioned, for 8 ohm speakers that would need 2A.
But for 4 ohm it would be 4A. So if I plan to make it a portable Bluetooth-powered speaker, it won't even last an hour (assuming a battery rated at 4Ah, which is massive and costly)!

So I guess, as you mentioned earlier, changing the feedback or input resistor values is the way forward. Any similar rule of thumb for a quick assessment?
 
Let me try to explain what's going on in a different way.
Let's say you have 12V from the power supply. Remember that you have some voltage loss in the amplifier, yet what you're asking your amp to do is put out 12V again.
On the output side, 12V minus 1V of loss can't be 12V again. You're missing some of the voltage, and that creates the distortion you hear.
To get the power you need, I suggest these supply voltages (driving normal speakers): 21VDC for 91dB speakers, and above that for party loudness. 36V to 50V usually gives you the loudness you're expecting. Or get speakers with, say, 93dB sensitivity or above.
I don't know the amp you're using, but if it's analogue-in, you could use a voltage divider to reduce the audio signal (voltage) going into your amp. Hope this helps!
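
To put rough numbers on the voltage-divider idea, here is a sketch with illustrative resistor values and an assumed 1V-peak source (neither taken from any datasheet):

```
# Simple resistive divider to knock the source level down before the amp
# input. Resistor values and the 1 V source level are illustrative
# assumptions, not taken from any datasheet.

r_top    = 10_000.0   # ohm, series resistor from source to amp input
r_bottom = 4_700.0    # ohm, resistor from amp input to ground

ratio = r_bottom / (r_top + r_bottom)   # fraction of the signal passed
v_in  = 1.0                             # V peak, assumed source level
print(f"Divider passes {ratio:.2f} of the signal: "
      f"{v_in:.2f} V -> {v_in * ratio:.2f} V at the amp input")
# ~0.32, i.e. roughly 10 dB of attenuation.
```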
 

I am using a MAX9744 Class D amp. The PVCC is between 5V and 12V; I can't go beyond that. I think it might have to do with the current rating of the adapter. I got 12V 500mA.
 
Music consists of dips and peaks. The difference between the peaks and the average level is the 'crest factor'. Power consumption is obviously based on the average, not the peaks. 10-12dB is a normal crest factor in music. Since the average is much lower than the peak value, power consumption will naturally also be much lower.
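
A rough sketch of what that means for the battery question above, using the thread's own figures (4A peak at 8V into two 4 ohm channels, a 4Ah battery) and a 10dB crest factor; efficiency and converter losses are ignored, so treat it as an estimate only:

```
# Very rough battery-life estimate using the thread's own numbers:
# 4 A worst-case peak draw (8 V into two 4-ohm channels) and a 4 Ah pack.
# Power scales with voltage squared, so a 10 dB crest factor means the
# average power (and, roughly, the average current) is peak / 10.
# Amplifier efficiency and converter losses are ignored.

crest_factor_db = 10.0
i_peak = 4.0      # A, worst-case peak draw from the earlier post
battery_ah = 4.0  # Ah, assumed battery capacity

i_avg = i_peak / (10 ** (crest_factor_db / 10))
print(f"Average draw ~{i_avg:.1f} A -> roughly {battery_ah / i_avg:.0f} h "
      f"from a {battery_ah:.0f} Ah battery")
# Average draw ~0.4 A -> roughly 10 h from a 4 Ah battery
```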
 

Hi,

The maximum supply voltage is 14V, i.e. the 13.8V of a car battery.

Maximum power into 4 ohm loads with 12V is 20W+20W, at 10% THD+N.
At 10W+10W, 12V into 4 ohm, THD+N is a very respectable 0.04%.

In reality it does 13W per channel, 12V into 4 ohms (<0.5% THD+N).

As previously stated, that needs a 12V 3A supply.

Maximum power into 4 ohms will drop with supply voltage.

I cannot see your problem if you're using the device correctly,
which is by no means a given; there is a lot to get wrong.

rgds, sreten.
 