For a FET I am using in an output stage, the gate-to-source voltage needs to be about 4 V for proper biasing, so I set the bias a little above 4 V. The FET conducts about 1 A at idle. The power supply is ±40 V. In Class AB there are 0 V at the output at idle, so the FET sees 40 V from drain to source.
Does this mean it will dissipate 40 watts at idle, or do I use the RDSon value to compute this (power dissipated = I^2 * RDSon)?
Do Class AB FET stages normally dissipate this much static power at idle? If so, heat-sinking is an absolute must.
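For reference, here is a quick sketch of the two calculations with the numbers above. The RDSon value is a placeholder I am assuming for illustration, not a figure from any datasheet:

```python
# Idle dissipation sketch for one half of the output stage.
V_DS = 40.0    # volts drain to source at idle (output at 0 V, +40 V rail)
I_BIAS = 1.0   # quiescent bias current, amps
RDS_ON = 0.2   # ohms -- placeholder assumption, not from the post

# At idle the FET is only partially on and drops the full 40 V,
# so dissipation is simply V_DS * I_D:
p_idle = V_DS * I_BIAS            # = 40.0 W per device

# I^2 * RDSon would only apply if the channel were fully enhanced,
# i.e. the device driven hard on, as in a switching application:
p_fully_on = I_BIAS**2 * RDS_ON   # = 0.2 W, not the idle condition here

print(f"Idle dissipation (V_DS * I): {p_idle:.1f} W")
print(f"Fully-on dissipation (I^2 * RDSon): {p_fully_on:.2f} W")
```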
A ±40 VDC supply equals 80 VDC rail to rail.
1 A of bias current passing from rail to rail (with zero output current) amounts to a bias dissipation of 80 W.
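That arithmetic, spelled out (a trivial sketch restating the figures above):

```python
# Total bias dissipation across the +/-40 V rails at idle.
v_rail_to_rail = 80.0    # volts, +40 V down to -40 V
i_bias = 1.0             # amps of quiescent current from rail to rail
p_total = v_rail_to_rail * i_bias   # = 80.0 W for the whole stage
p_per_device = p_total / 2          # = 40.0 W in each output FET
print(f"Total: {p_total:.0f} W, per device: {p_per_device:.0f} W")
```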
I was just considering half the system for clarity's sake. Yes, the entire system would dissipate 80 watts. My question is: am I biasing too far into Class A, or is this normal?