Allowable impedance of mosfet driver stage

This old topic is closed. If you want to reopen this topic, contact a moderator using the "Report Post" button.
A mosfet output stage has an input impedance that, from DC to low frequencies, is perhaps 10,000 megohms or more. Of course, as you start to drive it at higher frequencies and higher slew rates, some effort is required to charge and discharge the internal mosfet capacitances.

I have been messing around with a paraphase driver / phase splitter that simply uses a 470 ohm resistor above and below a smallish mosfet, the whole thing running off a 100 V rail with 25 V dropped across each resistor, so about 50 mA is flowing. As the fet turns on, the drive current is really only limited by the fet itself, but as it turns off the drive current depends on the 470 ohm resistors. The required output is +/- 20 V peak from each output.
The output fets (STY34NB50), running in **source follower** mode, have Ciss = 5900 pF, Coss = 880 pF, Crss = 80 pF. Are the 470 ohm resistors low enough, or is there much to be gained by going lower?
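As a back-of-envelope check on the question, the numbers above can be plugged into a worst-case sketch that treats the full datasheet Ciss as the load on one 470 ohm resistor during turn-off (this deliberately ignores the bootstrapping a follower provides, so the real situation is better):

```python
import math

# Worst-case sketch: the whole datasheet Ciss hangs on the 470 ohm
# pull resistor during turn-off (bootstrapping ignored).
r_pull = 470.0   # ohms, turn-off drive resistor
c_iss = 5.9e-9   # F, STY34NB50 datasheet input capacitance
i_bias = 50e-3   # A, standing current in the paraphase stage
v_peak = 20.0    # V, required swing per output
f_audio = 20e3   # Hz, top of the audio band

tau = r_pull * c_iss                # RC time constant on turn-off
f3db = 1 / (2 * math.pi * tau)      # corresponding -3 dB corner
slew_avail = i_bias / c_iss         # slew the 50 mA can deliver into Ciss
slew_needed = 2 * math.pi * f_audio * v_peak  # peak slew of a 20 V, 20 kHz sine

print(f"tau = {tau * 1e6:.2f} us, f3dB = {f3db / 1e3:.0f} kHz")
print(f"slew: {slew_avail / 1e6:.1f} V/us available vs "
      f"{slew_needed / 1e6:.2f} V/us needed")
```

Even in this pessimistic case the available slew comfortably exceeds what a full-swing 20 kHz sine demands, though the 2.8 µs time constant puts the worst-case corner not far above the audio band.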

Do you use a circuit simulator? I think that might be the best way to try to answer this, because I think that a reasonable analytical answer might have to make a lot of assumptions to give insight.

Also, I think that only the noninverting side will behave asymmetrically as you suggest; the drain of the driver transistor presents a high impedance whether it is turning on or turning off.

If you're running the output stage in class A then you aren't really trying to discharge the whole 5.9 nF, because it is bootstrapped all the time. For class A, I think you mainly have to worry about the Crss to AC ground, plus maybe about a third of the Ciss, to get an idea of the time constant.
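That rule of thumb is easy to put numbers on. A minimal sketch, assuming the suggested Crss-plus-one-third-of-Ciss estimate and the 470 ohm resistors from the first post:

```python
import math

# Rough class-A estimate from the post: Crss to AC ground
# plus about one third of Ciss (bootstrapped follower).
c_iss = 5.9e-9    # F, STY34NB50 datasheet Ciss
c_rss = 80e-12    # F, STY34NB50 datasheet Crss
r_pull = 470.0    # ohms, series/pull resistance at the gate

c_eff = c_rss + c_iss / 3
f3db = 1 / (2 * math.pi * r_pull * c_eff)
print(f"c_eff ~ {c_eff * 1e9:.2f} nF, corner ~ {f3db / 1e3:.0f} kHz")
```

With these assumptions the effective capacitance drops to about 2 nF and the corner lands well above the audio band.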

Unfortunately, regarding circuit simulators and power mosfets: the device models used in simulators were originally developed for lateral integrated-circuit fets. In my opinion the parameters, particularly those involving capacitance and gate charge, are somewhat fudged by the manufacturers who provide spice models, giving partially acceptable but sometimes misleading results -- IRF gate-capacitance parameters sometimes seem rather weird. Still, they should give you an idea.

I'm up too late rambling again; hope what I said makes some sense.
Do it yourself, I did.

Seeing that the lazy man's way didn't work (asking others), I set up a source follower with a 40 V supply and a 10 ohm resistor load, and biased the gate so there was 20 VDC across the load. I capacitively coupled a sine wave into the gate and measured the output voltage across the resistor at 200 kHz, both with zero ohms and with 1 k in series with the gate.

Zero ohms gave 5.77 V rms; 1 k gave 5.19 V rms. I actually shorted out the resistor with a pair of pliers, so the extra capacitance added to the gate from my hand through the insulated plier handles would make the results look slightly worse than reality.
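Those two readings also let you back out an effective input capacitance, if you model the gate as a single-pole RC low-pass (a simplification that ignores the generator's source impedance and any phase effects):

```python
import math

# Infer effective gate capacitance from the two measurements above,
# modelling the 1 kohm + gate as a single-pole RC: |H| = 1/sqrt(1 + (wRC)^2)
v_direct = 5.77    # V rms, gate driven through ~0 ohms
v_series = 5.19    # V rms, gate driven through 1 kohm
r_gate = 1_000.0   # ohms, series gate resistor
f = 200e3          # Hz, test frequency

ratio = v_series / v_direct
w = 2 * math.pi * f
c_eff = math.sqrt(1 / ratio**2 - 1) / (w * r_gate)
print(f"effective input capacitance ~ {c_eff * 1e12:.0f} pF")
```

The result comes out far below the datasheet Ciss, which is consistent with the earlier point that a class-A follower bootstraps most of its input capacitance.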

The verdict? For 470 ohms in series with the gate, the effect is negligible as far as I am concerned. I did not measure the difference in phase shift, though, an important consideration if there were a feedback loop around this setup.
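The unmeasured phase shift can at least be estimated from the amplitude data, again assuming a single-pole RC model of the gate (a hypothetical follow-up calculation, not something that was measured):

```python
import math

# Estimate the phase lag that was not measured, from the amplitude ratio:
# for a single-pole RC, |H| = 1/sqrt(1 + (wRC)^2) and phase = -atan(wRC).
ratio = 5.19 / 5.77                       # attenuation with 1 kohm in the gate
wrc_1k = math.sqrt(1 / ratio**2 - 1)      # w*R*C for R = 1 kohm at 200 kHz

phase_1k = math.degrees(math.atan(wrc_1k))           # lag at 200 kHz, 1 kohm
phase_470 = math.degrees(math.atan(wrc_1k * 0.47))   # scaled to 470 ohms

print(f"phase lag at 200 kHz: {phase_1k:.1f} deg (1k), "
      f"{phase_470:.1f} deg (470R)")
```

Roughly a dozen degrees of lag at 200 kHz with 470 ohms: small within the audio band, but worth knowing about before closing a feedback loop around the stage.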
