Something turning around in my head lately:
Suppose I have a power amp with an ol gain that rolls off from 10kHz. Does that imply that any distortion products generated will also be attenuated if they are above 10kHz? I would think that it depends on the topology, i.e. whether the distortion is predominantly generated before the roll-off pole or after it.
If the effect is true, and if I extended the ol bandwidth of that amp to, say, 100kHz, all other things remaining the same, would the higher-order distortion products increase (above 10kHz)?
If that is true, would that be a reason to limit ol bandwidth?
Thanks for your insights,
jd
Above 10kHz there will be less loop gain, so the amp's ability to cancel the distortion products using global negative feedback (GNF) decreases in proportion to the increasing frequency.
So the higher the frequency, the more the distortion.
Limiting the amp's open-loop bandwidth doesn't limit its ability to generate high-order distortion products, quite the contrary...
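A quick numeric illustration of this, as a sketch (idealized single-pole loop gain; the gain, pole and feedback numbers are all invented for the example):

```python
import math

A0 = 1000.0        # open-loop gain below the pole (60 dB) -- invented for illustration
f_pole = 10e3      # open-loop roll-off frequency, Hz
beta = 1.0 / 30.0  # feedback fraction, for a closed-loop gain of about 30x

def loop_gain(f):
    # single-pole open-loop gain magnitude times the feedback fraction
    a = A0 / math.sqrt(1.0 + (f / f_pole) ** 2)
    return a * beta

# distortion generated after the pole (e.g. in the output stage) is
# reduced by roughly (1 + loop gain)
for f in (1e3, 10e3, 20e3, 40e3, 80e3):
    print(f"{f / 1e3:4.0f} kHz: loop gain {loop_gain(f):5.1f}, "
          f"distortion reduction {1.0 + loop_gain(f):5.1f}x")
```

The correction factor falls roughly 6dB per octave above the pole, which is the "less loop gain, more distortion" effect described above.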
Yes, I understand the action of feedback. My question was about the open-loop amp itself. What do you think about my considerations?
BTW, can I ask you to use capital letters and punctuation where necessary to make it easier for us to read your posts? We also do that to make it easier for you to read our posts. Thanks.
jd
All the stages before the point where the roll-off occurs will see their distortion attenuated.
That said, in your example, despite having more distortion at higher frequencies, the amp with 100kHz bandwidth will be far better as soon as the loop is closed.
As an example, let's take the classical LTP + VAS + OS...
If you implement the roll-off in the VAS, the LTP's distortion will be reduced at high frequencies, but that is pointless since this is not the stage with the highest THD.
Hi Jan
Assuming a fair amount of global feedback, the opposite is true: reducing the ol bandwidth will increase high-order distortions.
As you suggested, it works differently depending on whether the distortion is generated before or after the roll-off pole.
In the case where the distortion is generated after the pole (e.g. in the output stage), the roll-off will not reduce the distortions but it will reduce the amount of feedback available to correct the distortion, so the end result is higher distortion above the roll-off frequency.
The case where the distortion is generated before the pole (e.g. in the input stage) is a little more interesting.
In this case, the roll-off will reduce high-frequency distortions by a certain amount, but will also reduce the amount of feedback available to correct the distortion by the same amount.
These effects will tend to cancel out, so changing the pole frequency should have no effect on the final distortion (at the amp's output), provided that the distortion generated by the input stage does not change.
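A small sanity check of that cancellation, as a sketch (idealized single-pole model; the gain, pole, feedback and d0 values are all assumed):

```python
import math

A0, f_pole, beta = 1000.0, 10e3, 1.0 / 30.0  # assumed amp and feedback values
d0 = 0.01  # hypothetical input-stage distortion, flat with frequency in this idealization

def pole(f):
    # attenuation of a single pole at f_pole
    return 1.0 / math.sqrt(1.0 + (f / f_pole) ** 2)

for f in (1e3, 20e3, 100e3):
    d_ol = d0 * pole(f)       # distortion made before the pole is filtered by it
    T = A0 * pole(f) * beta   # ...but the loop gain falls by the same factor
    d_cl = d_ol / (1.0 + T)   # closed-loop residual
    print(f"{f / 1e3:5.0f} kHz: open-loop {d_ol:.5f}, closed-loop {d_cl:.6f}")
```

The closed-loop residual comes out roughly constant with frequency: the pole's attenuation of the distortion and the lost feedback cancel, until the loop gain drops toward unity.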
There's a catch there, though. When the input signal frequency is above the pole frequency, the global feedback will force the input stage to increase its output, and this will cause it to produce higher distortion.
This gets nasty fast.
Let's take as an example a typical long-tail-pair input stage with predominantly third order distortion. The percentage distortion it produces is proportional to the square of the signal amplitude, so the total amplitude of the distortion is proportional to the cube of the signal amplitude.
So...
At the pole frequency, the input stage has to increase its output by 3dB (i.e. about 1.4 times), resulting in about 2.8 times the distortion compared to low frequencies.
At about double that frequency, the input stage has to double its output, resulting in 8 times the distortion obtained at low frequencies.
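Checking that arithmetic with the pure third-order model assumed above:

```python
# third-order-only stage: distortion amplitude scales as (signal amplitude)^3
for boost_db in (0.0, 3.0, 6.0):
    boost = 10 ** (boost_db / 20.0)  # 3 dB is about 1.41x, 6 dB is 2x
    print(f"+{boost_db:.0f} dB drive -> {boost:.2f}x amplitude -> "
          f"{boost ** 3:.1f}x distortion amplitude")
```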
All the above ignores the questions of how much high-frequency content there is in music anyway, and whether we can hear anything above 20kHz.
Those arguments belong in other threads, though. 😀
Regards - Godfrey
edit: oops - I missed a few posts while I was typing, making coffee etc.
Time to catch up ...
[snip]That said, in your example, despite having more distortion at higher frequencies, the amp with 100kHz bandwidth will be far better as soon as the loop is closed.[snip]
But that will only be the case if the 100kHz-bw amp has more gain at, say, 20kHz for the feedback to work with. If the ol gain in the pass band is the same in the case of 20kHz audio bw and 100kHz bw, I don't see any difference in the distortion performance in the audio band, with feedback.
[snip]As an example, let's take the classical LTP + VAS + OS...
If you implement the roll-off in the VAS, the LTP's distortion will be reduced at high frequencies, but that is pointless since this is not the stage with the highest THD.[snip]
So you say that the major part of the distortion will happen at the end - in the output stage?
jd
[snip]There's a catch there, though. When the input signal frequency is above the pole frequency, the global feedback will force the input stage to increase its output, and this will cause it to produce higher distortion.[snip]
OK, let me think about that. But I do get your point that the increase in distortion above, say, 20kHz will not happen unless there is signal above 20kHz. And this again could be a reason to roll off the input signal above 20kHz.
jd
If the effect is true, and if I extended the ol bandwidth of that amp to, say, 100kHz, all other things remaining the same, would the higher-order distortion products increase (above 10kHz)?
jd
According to this sentence, it is assumed that the amp will necessarily have higher gain at 20kHz.
Be careful when stating the initial conditions.
The VAS and output stage are the main contributors of THD.
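A numeric version of that point, as a sketch, assuming a simple single-pole roll-off and the same low-frequency gain in both cases (the gain value is invented):

```python
import math

A0 = 1000.0  # assumed low-frequency open-loop gain, identical for both amps

def gain(f, f_pole):
    # single-pole open-loop gain magnitude
    return A0 / math.sqrt(1.0 + (f / f_pole) ** 2)

for f_pole in (10e3, 100e3):
    print(f"pole at {f_pole / 1e3:5.0f} kHz: gain at 20 kHz = {gain(20e3, f_pole):5.0f}")
```

With the same pass-band gain, the amp whose pole sits at 100kHz necessarily has more gain, and thus more feedback, left at 20kHz.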
I forgot to add that, to some extent, topology matters.
http://www.diyaudio.com/forums/soli...pology-subjective-effects-35.html#post2076926
No, I did not mean that the gain would change with the bw. I want to compare two cases: an ol gain crossover of, say, 10 or 20kHz, and one of 100kHz. In each case, ol gain would be flat from DC to the cutoff point, and be the same in both cases.
What would be the effect of the larger ol bw on the distortion?
It seems that for the part of the distortion generated before the roll-off pole, the lower bw would give lower distortion, correct?
Since the 2nd harmonic of 10kHz is 20kHz (assuming we agree on 20kHz as the highest audio freq), it seems advantageous to limit ol gain to 10kHz. For that part of the distortion generated after the roll-off pole, there would be no difference wrt distortion, so the ol bw, for this reason, would be irrelevant.
jd
I've made some sims on the influence of the frequency roll-off on THD products.
Sims were made using Doug Self's Blameless as the test amplifier.
So far, the (expected) result is that reducing the frequency of the roll-off yields lower levels of high-frequency THD, BUT the low-order THD is increased.
How the amp modifies an input function is closely related to the amp's impulse response.
Say the input function is sin(t).
Since the amp's speed is limited according to the implemented roll-off, the output's maximal error will occur when dv/dt is at its highest value, i.e., at the zero-crossing points.
Symmetrically, the minimal error will occur when the slope of the signal is at its minimum, i.e., when the sine is at its maximum value.
Thus, the large-bandwidth amp, having a faster impulse response, will be less prone to signal reshaping (distortion), even if the frequency is lower than the transition frequency set by the roll-off.
Of course, all this is under open-loop conditions.
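A sketch of that error behaviour, with a first-order low-pass standing in for the open-loop roll-off (the signal is placed at the pole frequency so the effect is clearly visible):

```python
import math

f_sig = 10e3   # test signal placed at the pole frequency
f_pole = 10e3  # open-loop roll-off, modelled as a unity-gain first-order lag
w, wp = 2 * math.pi * f_sig, 2 * math.pi * f_pole

# steady-state response of the lag to sin(w*t): gain g, phase shift ph
g = 1.0 / math.sqrt(1.0 + (w / wp) ** 2)
ph = -math.atan(w / wp)

for deg in range(0, 181, 15):
    t = math.radians(deg) / w
    x = math.sin(w * t)           # input
    y = g * math.sin(w * t + ph)  # output
    print(f"{deg:3d} deg: in {x:+.3f}  out {y:+.3f}  error {x - y:+.3f}")
```

For a first-order lag the instantaneous error equals (1/wp)*dVout/dt, so it peaks where the output crosses zero and vanishes at the output's peaks, consistent with the description above.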
Agreed, but that is only true if the decreased ol crossover freq also means a lower slew rate. And that is not necessarily the case, is it?
jd
Under open-loop conditions, the slew rate is directly proportional to the gain-bandwidth product, and thus to the frequency roll-off.
Really? That's new to me. Hmmm.
I always thought that slew rate is a large-signal effect and bw is a small-signal effect. I can imagine an amp that has gain up to, say, 100kHz for small signals but that cannot put out a large signal at 100kHz due to slew-rate limiting.
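To put numbers on that distinction (the slew rate and frequency here are invented for illustration): for a sine, the slew rate required is 2*pi*f*Vpeak, so:

```python
import math

slew_rate = 5.0e6  # hypothetical amp slew rate: 5 V/us, written in V/s
f = 100e3          # with small-signal bandwidth reaching this far

# largest sine that stays below the slew limit: V_peak = SR / (2*pi*f)
v_peak_max = slew_rate / (2 * math.pi * f)
print(f"At {f / 1e3:.0f} kHz, clean output is limited to about {v_peak_max:.1f} V peak")
# small signals pass fine at 100 kHz, but e.g. a 30 V peak signal would slew-limit
```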
jd
Something turning around in my head lately:
Suppose I have a power amp with an ol gain that rolls off from 10kHz. Does that imply that any distortion products generated will also be attenuated if they are above 10kHz? I would think that it depends on the topology, i.e. whether the distortion is predominantly generated before the roll-off pole or after it.
A feedback loop is a loop and gain is gain, so any error will be corrected anyway by an amount equal to the loop gain.
Yes, but as I have explained earlier, I'm interested in the open-loop properties, without feedback.
jd
Really? That's new to me. Hmmm.
I always thought that slew rate is a large-signal effect and bw is a small-signal effect. I can imagine an amp that has gain up to, say, 100kHz for small signals but that cannot put out a large signal at 100kHz due to slew-rate limiting.
jd
Here is a simulation of the step response under open-loop conditions.
The input signal is set to 10mV, high enough to saturate the amplifier output and low enough to allow the response speeds to be differentiated.
Right, that's the large-signal response (the output goes to the supply rail, I assume), so the slew rate is apparent.
Can you run the same with a signal that gives an output of, say, 1V? BTW, what's the amp's ol gain here? What's the input step signal's rise time?
jd
Setting the signal so the amp stays at a low output amplitude, the figure shows the other side of the equation: the settling times are equal, BUT the more heavily compensated amp runs out of gain, which is logical, since its gain-bandwidth product is lower.
Thus, the limitation will appear not in the rise time but in the maximum amplitude, leading to the same result, i.e., a lower slew rate.
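The linear, small-signal side of that trade can be seen with two hypothetical single-pole amps (the model and all numbers are assumed for illustration):

```python
import math

v_in = 0.01  # 10 mV input step, as in the sim above
# hypothetical single-pole amps: (low-frequency gain, pole frequency)
amps = [(1000.0, 10e3),  # GBW = 10 MHz
        (1000.0, 1e3)]   # GBW = 1 MHz, i.e. more heavily compensated

for a0, fp in amps:
    wp = 2 * math.pi * fp
    # step response v(t) = a0*v_in*(1 - exp(-wp*t)); its slope at t=0 is a0*v_in*wp
    slope = a0 * v_in * wp
    print(f"gain {a0:.0f}, pole {fp / 1e3:4.1f} kHz (GBW {a0 * fp / 1e6:4.1f} MHz): "
          f"initial slope {slope / 1e6:.2f} V/us, final value {a0 * v_in:.1f} V")
```

In this linear model the maximum rate of change scales directly with the gain-bandwidth product, which is the sense in which heavier compensation costs slew rate even before any large-signal limiting.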
I don't think I agree with that; the bw is a small-signal property and should be measured with something like a swept sine wave. An input impulse is always very wideband, so it has components beyond the cutoff freq. That's why I asked for the amp gain and the impulse rise time. If you filtered the impulse, it would be different.
But, we can agree to disagree 😉
jd