Diff amp input cap for improved phase margin

Status
This old topic is closed. If you want to reopen this topic, contact a moderator using the "Report Post" button.
FYI-

Deane Jensen's paper on stabilizing op amps is here:

http://www.jensentransformers.com/an/an001.pdf

On page 3 of this paper, he has a section called "Source Impedance Effects" in which he discusses this topic.

However, he talks about DECREASING the phase margin with lower input impedance, which appears to be the opposite of what has been said in this thread.

I can only presume that he is assuming a source with constant impedance, as opposed to a source whose impedance drops with frequency because of an input capacitance, which boosts the gain only at very high frequencies.
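A rough numeric sketch of that presumption: a shunt capacitance across a source resistance only lowers the effective source impedance (and hence affects the loop) above the RC corner frequency. The R and C values below are assumed purely for illustration:

```python
import math

# A shunt C across a source resistance R forms a low-pass corner;
# below this frequency the source still looks resistive, above it
# the impedance (and any gain change) rolls in. Values illustrative.

def corner_hz(R, C):
    """-3 dB corner of the R-C network formed at the input."""
    return 1.0 / (2 * math.pi * R * C)

for R, C in ((1e3, 100e-12), (10e3, 100e-12), (1e3, 1e-9)):
    print(f"R={R/1e3:g}k, C={C*1e12:g}pF -> corner ~ {corner_hz(R, C)/1e6:.2f} MHz")
```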
 
More of a chicken-or-egg sort of question:

was the Miller comp dominant pole frequency selected with the (very modest) loop gain boost from AC-grounding the positive input in place, or was the AC input short added after tuning the Miller comp?

By tuning the compensation and then changing the amp, it seems a little unfair to call the change "destabilizing".
 
www.hifisonix.com
Jcx,

you raise an interesting point about boosting loop gain by providing some positive feedback via the non-inverting input of the amp. I've seen this technique used on quite a few amp designs. I've never tried it, but does anybody care to comment on its effectiveness at raising Zin and helping to reduce distortion? Is there an optimum percentage or ratio between the -ve feedback and the +ve feedback?
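A minimal sketch of the Zin-raising effect, assuming the simplest bootstrap model (a fraction k of the input signal is fed back, in phase, to the far end of the input resistor R; both values are illustrative, not from any particular design):

```python
# Hedged sketch: if a fraction k of the input voltage appears in phase
# at the bottom of the input resistor R, the current drawn from the
# source is (Vin - k*Vin)/R, so the effective input impedance rises
# to R / (1 - k). As k -> 1, Zin -> infinity (and stability suffers).

def bootstrapped_zin(R, k):
    """Effective input impedance R / (1 - k) for positive-feedback
    fraction 0 <= k < 1."""
    if not 0 <= k < 1:
        raise ValueError("k must be in [0, 1)")
    return R / (1 - k)

R = 10e3  # assumed 10k input resistor
for k in (0.0, 0.5, 0.9, 0.99):
    print(f"k={k:.2f}: Zin = {bootstrapped_zin(R, k)/1e3:.1f} kohm")
```

This also hints at why there would be an optimum ratio: the same mechanism that raises Zin reduces the net feedback, so pushing k too high trades distortion and stability for input impedance.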
 
Good to see this being discussed.

Andrew, you wrote >>
so the rise in gain of the first stage could cause some HF overshoot. <<

I would add: especially when that first stage is driving a VAS-connected C.dom as well as the VAS base, such that the first transistor's current becomes more quadrature-related to the VAS collector voltage at higher frequencies (hence the stage gains are no longer linear with input voltage within the closed 'flat' global NFB loop), whilst the loudspeaker current itself might already have become reactively phase-shifted wrt the input signal voltage.

Where the input transistor base current is obliged to vary in relation to slew and output current, any series input resistance will develop a non-linear voltage error wrt the source potential, especially at HF. So the lower the series base/bias resistance, the lower the NFB-loop-induced error.
This error is very small, but it is additionally exacerbated by class-AB crossover.
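To put an illustrative number on that error (the base-current variation below is an assumed figure, not measured from any amp): a varying base current flowing through a series source/bias resistance Rs appears as a voltage error delta_ib * Rs in series with the input, which is why a lower Rs helps:

```python
# Illustrative arithmetic, assumed numbers: if the input transistor's
# base current varies by delta_ib with signal (slew, crossover), a
# series resistance Rs converts that into an input voltage error.

def input_error_uV(delta_ib, Rs):
    """Voltage error (in microvolts) from base-current variation
    delta_ib (A) through series resistance Rs (ohms)."""
    return delta_ib * Rs * 1e6

delta_ib = 50e-9  # assumed 50 nA base-current variation
for Rs in (10e3, 1e3, 100.0):
    print(f"Rs = {Rs/1e3:g}k -> error = {input_error_uV(delta_ib, Rs):.1f} uV")
```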

The NFB applied to the error-sensing base of the differential pair is normally at a suitably low source impedance for a bipolar transistor, so why not for the input base too?

When an amplifier has a low input/source resistance, say 1k or less, not only will there audibly be less HF distortion, but hiss and hum can become virtually inaudible. Also, if the internal stage voltage gains are flat throughout the AF band (i.e. stability components are not causing damping/phase change within the AF band), then crossover distortion will be much less noticeable via electrodynamic loudspeakers.
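The hiss part of that claim follows directly from Johnson-Nyquist thermal noise, e_n = sqrt(4 k T R): halving or decimating the source resistance lowers the noise floor by the square root of the ratio. A quick check with assumed room temperature:

```python
import math

# Thermal (Johnson-Nyquist) noise voltage density e_n = sqrt(4 k T R).
# Lowering the input/source resistance directly lowers the hiss floor.
k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 290.0           # assumed room temperature, K

def noise_density_nv(R):
    """Thermal noise density of resistance R, in nV/sqrt(Hz)."""
    return math.sqrt(4 * k_B * T * R) * 1e9

for R in (100.0, 1e3, 10e3, 47e3):
    print(f"R = {R/1e3:g}k -> {noise_density_nv(R):.2f} nV/rtHz")
```

A 1k source sits around 4 nV/rtHz, while 10k is already near 13 nV/rtHz, so the audible benefit of "1k and less" is plausible on noise grounds alone.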


Cheers ....... Graham.
 
Hi pooge
You wrote >>
However, he talks about DECREASING the phase margin with lower input impedance, which appears to be opposite to what has been said in this thread. <<

I have not looked into this, but at first sight it does not seem to make sense.

If you have several stages with delays, speeding up one of them should increase the phase margin. Consider three stages of an amplifier, each with equal delays. Putting feedback around this is almost certainly going to make an oscillator. Raising the frequency response of one stage should reduce the overall phase shift and (depending on the feedback) may make it stable, i.e. increase the margin.
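That argument can be checked numerically with an all-pole loop-gain model. The loop gain A0 and pole frequencies below are illustrative assumptions: three coincident poles give a strongly negative phase margin, and moving one pole two decades up brings the margin just positive.

```python
import math

# Phase margin of a loop gain L(s) = A0 / prod(1 + s/wp), found by
# bisecting (in log frequency) for the unity-gain crossover and
# summing the pole phase contributions there. Values illustrative.

def phase_margin_deg(A0, poles_hz):
    def mag(f):
        m = A0
        for fp in poles_hz:
            m /= math.hypot(1.0, f / fp)
        return m
    lo, hi = 1.0, 1e12  # bracket the |L| = 1 crossover
    for _ in range(200):
        mid = math.sqrt(lo * hi)
        if mag(mid) > 1.0:
            lo = mid
        else:
            hi = mid
    fc = math.sqrt(lo * hi)
    phase = -sum(math.degrees(math.atan(fc / fp)) for fp in poles_hz)
    return 180.0 + phase

print(phase_margin_deg(100.0, [1e4, 1e4, 1e4]))  # about -53 deg: oscillator
print(phase_margin_deg(100.0, [1e4, 1e4, 1e6]))  # about +6 deg: just stable
```

With a higher A0 the sped-up case can still end up unstable, which matches the "depending on the feedback" caveat above.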

But I will dig into this some more when I have time.

cheers
John
 
pooge said:
FYI-

Deane Jensen's paper on stabilizing op amps is here:

http://www.jensentransformers.com/an/an001.pdf

On page 3 of this paper, he has a section called "Source Impedance Effects" in which he discusses this topic.

However, he talks about DECREASING the phase margin with lower input impedance, which appears to be opposite to what has been said in this thread.

I can only presume that he is assuming a source with constant impedance, as opposed to a source whose impedance drops with frequency because of an input capacitance, which boosts the gain only at very high frequencies.
Is Jensen referring only to inverting amplifiers in this paragraph, where stray capacitance on the inverting pin does cause a stability problem? Or is he considering the general case, and just referring to the input pins in either configuration?

john_ellis said:
I have not looked into this, but at first sight it does not seem to make sense.

If you have several stages with delays, speeding up one of them should increase the phase margin. Consider three stages of an amplifier, each with equal delays. Putting feedback around this is almost certainly going to make an oscillator. Raising the frequency response of one stage should reduce the overall phase shift and (depending on the feedback) may make it stable, i.e. increase the margin.
 