Best way to implement thermal protection?

Please share your opinions and thoughts on the best way to implement thermal protection for an amplifier - techniques I have seen are:

1) cut AC power
2) turn off the current source loading the input LTP (for example Adcom)
3) turn on a fan
4) trigger a limiter on the input, reducing drive

1 and 2 are drastic but effective. 3 treats the symptom without addressing the underlying cause. 4 has the advantage of not stopping the party, but it could go unnoticed, leading to long-term damage.
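These options aren't mutually exclusive; a supervisor could tier them by temperature. Below is a minimal sketch of that idea for a small microcontroller polling a heatsink sensor. The thresholds, the hysteresis value, and the hardware hooks (read_heatsink_c, set_fan, set_limiter, trip_mains) are all hypothetical placeholders, not anything specified in this thread:

```c
/* Tiered thermal-protection policy combining options 3, 4 and 1,
 * with hysteresis so the stages don't chatter on and off.
 * All thresholds and hardware hooks are illustrative assumptions. */
#include <stdbool.h>

#define FAN_ON_C   55.0f   /* option 3: spin up the fan        */
#define LIMIT_C    70.0f   /* option 4: engage input limiter   */
#define TRIP_C     85.0f   /* option 1: cut AC power           */
#define HYST_C      5.0f   /* release hysteresis, deg C        */

/* Board-specific hooks -- assumed to exist on the target hardware. */
extern float read_heatsink_c(void);
extern void  set_fan(bool on);
extern void  set_limiter(bool on);
extern void  trip_mains(void);   /* latching: requires a power cycle */

void thermal_poll(void)          /* call periodically, e.g. once a second */
{
    static bool fan = false, limit = false;
    float t = read_heatsink_c();

    if (t > FAN_ON_C)               fan = true;
    else if (t < FAN_ON_C - HYST_C) fan = false;

    if (t > LIMIT_C)                limit = true;
    else if (t < LIMIT_C - HYST_C)  limit = false;

    set_fan(fan);
    set_limiter(limit);

    if (t > TRIP_C)                 /* last resort */
        trip_mains();
}
```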

I'm experimenting with shunting current away from the bases of the OPS drivers, similar to the SOA protection circuits discussed by Doug Self, but triggered by over-temperature rather than SOA constraints. It occurred to me that with ThermalTrak transistors it is really easy to monitor junction temp in the output stage.
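For reference, a ThermalTrak sense diode run at constant current gives a forward voltage that falls roughly linearly with die temperature, so the conversion is a one-liner. The tempco and calibration constants below are illustrative only; calibrate against the actual device (e.g. the NJL3281D/NJL1302D datasheet curves):

```c
/* Rough junction-temperature estimate from a ThermalTrak sense diode.
 * VF_CAL_MV, T_CAL_C and TEMPCO_MV_C are assumed example values. */
#include <stdio.h>

#define VF_CAL_MV   620.0f   /* measured Vf at calibration point (assumed) */
#define T_CAL_C      25.0f   /* temperature at calibration, deg C          */
#define TEMPCO_MV_C  -2.0f   /* approx. diode tempco, mV per deg C         */

static float diode_temp_c(float vf_mv)
{
    return T_CAL_C + (vf_mv - VF_CAL_MV) / TEMPCO_MV_C;
}

int main(void)
{
    /* A 520 mV forward drop maps to roughly 75 C at the die. */
    printf("Tj ~= %.0f C\n", diode_temp_c(520.0f));
    return 0;
}
```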

What are your thoughts on the pros / cons of these approaches?
 
Monitor heatsink temperature and either mute the input or disconnect the output.

It's only needed in cases of abuse, though; in normal use the amplifier should not overheat, and if it does, the heatsinks are too small.

The point is to be able to build an amplifier that doesn't need huge heatsinks sized for the worst case with conservative thermal limits. Monitoring heatsink temperature for bias and over-temperature tells you the integral of dissipated power over the last several minutes, not what's happening right now.

Instantaneous junction temp can be far above the heatsink temp. You could have a much smaller heatsink designed for "average" power, and even let it run fairly hot, knowing that by monitoring junction temp you can keep the output stage safe by backing the power off.
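A toy two-node thermal model makes the point: the junction-to-heatsink node settles in seconds while the heatsink-to-ambient node takes minutes, so the heatsink reading is always a lagged average. Every thermal resistance, capacitance, and the 60 W dissipation below are made-up round numbers, not measurements:

```c
/* Toy two-time-constant thermal model: junction sits ~P*Rjh above the
 * heatsink almost immediately, while the heatsink itself creeps up
 * over minutes. All component values are illustrative assumptions. */
#include <stdio.h>

int main(void)
{
    const float dt   = 0.1f;    /* simulation step, s                   */
    const float Rjh  = 1.5f;    /* junction-to-heatsink, C/W (assumed)  */
    const float Cj   = 2.0f;    /* die+case thermal mass, J/C           */
    const float Rha  = 0.8f;    /* heatsink-to-ambient, C/W             */
    const float Ch   = 400.0f;  /* heatsink thermal mass, J/C           */
    const float Tamb = 25.0f;
    const float P    = 60.0f;   /* steady dissipation, W                */
    float Tj = Tamb, Th = Tamb;

    for (int i = 0; i <= 3000; i++) {   /* five minutes of dissipation */
        Tj += dt * (P - (Tj - Th) / Rjh) / Cj;
        Th += dt * ((Tj - Th) / Rjh - (Th - Tamb) / Rha) / Ch;
        if (i % 600 == 0)               /* print once per minute */
            printf("t=%4.0fs  Tj=%5.1fC  Th=%5.1fC\n", i * dt, Tj, Th);
    }
    return 0;
}
```

With these numbers the junction runs about 90 C hotter than the heatsink within a few seconds, while the heatsink itself takes several minutes to approach its final temperature, which is exactly why a heatsink thermistor can't see a fast junction excursion.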
 