Estimating amplifier wattage at lower voltage

Member
Joined 2013
Paid Member
Might be a novice question, but Google has not helped me, so I’m posting for the geniuses here.
Is there a standard practice for de-rating an amplifier for a lower supply voltage?
For instance - an amplifier is rated at 150 W RMS into 8R at +/-56Vdc, but I have a +/-38Vdc PSU. Can I take 150/56 and then multiply by 38 (~100 watts) to get a ballpark estimate? Or is that completely wrong thinking?
Additionally, how much safety margin is usually built in? Say an amplifier specifies +/-50Vdc; would +/-53Vdc still be within the safety margin? Is there a rule of thumb, such as checking a certain part's voltage rating?
Thanks in advance - hopefully a conversation will develop to increase my knowledge.
 
Moderator
Joined 2011
Might be a novice question, but Google has not helped me, so I’m posting for the geniuses here.
Is there a standard practice for de-rating an amplifier for a lower supply voltage?
For instance - an amplifier is rated at 150 W RMS into 8R at +/-56Vdc, but I have a +/-38Vdc PSU.

The peak output voltage is about equal to the positive supply voltage.
The maximum sine-wave power into a load R is approximately Vpeak^2 / (2R),
so power scales with the square of the supply voltage and the scale factor is (Vpeak_low / Vpeak_high)^2.

For your example, the scale factor = (38/56)^2 = 0.46, so 0.46 x 150W = 69W.
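
If you want to plug other numbers into that scaling, here is a minimal Python sketch; the function names are just illustrative, and the "theoretical ceiling" figure assumes an ideal amplifier with no device drop below the rails.

```python
def max_sine_power(v_peak, load_ohms):
    """Theoretical maximum sine-wave power into a resistive load: Vpeak^2 / (2R)."""
    return v_peak ** 2 / (2.0 * load_ohms)

def derated_power(p_rated, v_rail_rated, v_rail_actual):
    """Scale a rated power to a lower supply rail: power goes as the rail voltage squared."""
    return p_rated * (v_rail_actual / v_rail_rated) ** 2

# Numbers from this thread: 150 W into 8R at +/-56 V, run from +/-38 V rails.
print(max_sine_power(56.0, 8.0))          # ~196 W ideal ceiling at +/-56 V
print(derated_power(150.0, 56.0, 38.0))   # ~69 W, the quadratic estimate
print(150.0 / 56.0 * 38.0)                # ~102 W, the linear guess (too optimistic)
```

The gap between the ~196 W ideal figure and the 150 W rating reflects real-world losses (output device drops, rail sag under load), so scaling the manufacturer's rating quadratically, as above, is usually the more realistic estimate.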
 