Voltage effects on DC offset and bias

I have been wondering about something and hope you have an answer for me.

If I build an amp, let's say an F4, I understand how and why I need to check DC offset and bias. So far no problems.

Let's assume that I have plenty of heatsink and that my caps are rated high enough. What happens when I replace my transformer with one with a higher or lower output voltage?
Will the DC offset still be OK?
I assume that it will be, because the DC offset is (to put it mathematically) a function of the parts and connections in the circuit, with the rail voltage playing only a minor role. Is this correct?

And what happens to the bias?
What happens when the rail voltage is 30% higher?
Or 30% lower?

I assume that both DC offset and bias are not affected (much) by rail voltage, because otherwise the +/- 10-15% voltage fluctuation in the domestic AC supply would be enough to screw up the settings. Confirmation would be nice though :)
 
If the + rail goes up 10% and the - rail does not, DC offset goes up on the plus side.
Transformer secondaries are not always 100% the same, so that can affect DC offset, but not by a lot.

Audiosan, I understand what you mean, but I am not so much interested in the practical problems that can occur. Just the theory.

For the sake of argument, let's say the rail voltages are +24.000V and -24.000V and DC offset is 0V and bias is 1A.
What happens to DC offset and bias when rail voltages are changed to +30.000V and -30.000V?
Or to +18.000V and -18.000V?
 
I'm no EE, but my take so far is that any change in rail voltage will cause a corresponding change in dissipation for any given bias, due to Watt's law. More voltage = more dissipation. More voltage also means stressing the parts more, hence the reason for cascodes. I also surmise that a change in rail voltage will ultimately change the bias as well. DC offset shouldn't change, as others suggest, provided both rails increase or decrease by the same amount. Anecdote: my NAD 2200 has two rails -- 65 V and 95 V, and let me tell you, even when running at idle on the 65 V rail with minimum bias (AB), those sinks get quite warm. It's one of those amps you can smell across the room after it's been on an hour or so. Ah, I love the smell of burning amps in the morning. Smells like victory!
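A quick back-of-envelope check of the dissipation point, with purely illustrative numbers (not from any particular amp):

```python
# Idle dissipation at a fixed bias current: P = V * I (Watt's law).
bias = 1.3                        # amps, assumed held constant by the circuit
for rail in (18.0, 24.0, 30.0):   # volts across the output stage, illustrative
    print(f"{rail:4.1f} V -> {rail * bias:4.1f} W idle dissipation")
```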
 
> I'm no EE, but my take so far is that any change in rail voltage will cause a corresponding change in dissipation for any given bias, due to Watt's law.

We're not talking about dissipation, but bias. E.g. 1.3 A at 18 V will still be 1.3 A at 24 V.
 
I think I understand that theoretically DC offset will not change by raising or lowering rail voltage. I think this is not the case for the bias, though.

In the Son of Zen (http://passdiy.com/pdf/sonofzen.pdf) amp, the bias is a result of the power drawn by resistors R1-R7. I assume that this is a simple application of Ohm's law (V = I x R). If that is indeed the case, this must mean that the bias is directly proportional to the voltage!

Since R is (effectively) constant in this circuit, V/I must be constant and hence, changing the voltage by any percentage must change the current by the same percentage.
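A minimal sketch of that proportionality, with an assumed effective resistance (not taken from the Son of Zen schematic):

```python
# If the bias network is (effectively) a fixed resistance, Ohm's law makes
# the bias current track the rail voltage exactly.
R = 24.0                          # ohms, assumed effective bias resistance
for rail in (18.0, 24.0, 30.0):   # volts
    bias = rail / R               # I = V / R
    print(f"{rail:4.1f} V -> {bias:.2f} A ({rail / 24.0 - 1:+.0%} vs 24 V)")
```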

Two questions:

1. Is the above correct (i.e. in a circuit like the Son of Zen, does bias depend on voltage)?

2. What happens in other circuits, which use different topologies to create a constant current source?
 

PRR

> I assume that this is a simple application of Ohm's law (V = I x R). If that is indeed the case, this must mean that the bias is directly proportional to the voltage!

You have over-simplified "simple application".

The SoZ is incredibly stable against supply change. Many circuits are not so stable. Some are far-far fussier.

In the SoZ, *neglecting the cross-Source resistor*, the bias current is set by the resistors and the voltage from V- to the Source. The Source voltage is the zero volts at the Gate plus the MOSFET's gate-source voltage at that current.

So we have Vgs, which is not precisely defined, and the Vgs/Is variation, which is also not precisely known.

Therefore neither V nor I is constant. (Yes, V/I is constant: it's a resistor.)

If V- is huge compared to Vgs, we can wave this away. If V- were -300V, then Vgs of 2V or 5V would be 302V or 305V, 1% change for extreme MOSFETs. However in this application the V- is more likely 30V.
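A quick numeric sketch of that point, with assumed values:

```python
# Bias with the MOSFET's Vgs included: I = (|V-| - Vgs) / R.
# R and Vgs are assumed here; real Vgs also shifts a little with current.
R, Vgs = 20.0, 4.0                # ohms, volts
for v_minus in (30.0, 39.0):      # a 30V rail, then 30% more
    bias = (v_minus - Vgs) / R
    print(f"|V-| = {v_minus:4.1f} V -> bias {bias:.2f} A")
# Bias rises about 35% for a 30% rail increase: the fixed Vgs matters at a
# 30V rail where it would be negligible at 300V.
```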

In this circuit, if the MOSFETs are from the same lot (same die size and mask and cooking), then if they gave a tolerable match and offset at 25V, they would *probably* be OK at 35V. Is and Vgs change, but by less than V-, and by nearly the same ratio in both devices of the pair.

We can not neglect the cross-Source resistor. If the change in supply and current causes a 0.1V shift in Vgs, one device more than the other, that would be 0.1V across 1 ohm, or 0.1 Amps of offset change, and something less than 0.8V of offset change at the speaker (0.1 A into 8 ohms). This won't melt a speaker, or cone-offset a stiff woofer, but might put a low-excursion high-efficiency driver out of its magnetic center.

The bias current will change nearly in proportion to the voltage. This is correct. We apply more supply voltage to get more speaker voltage. With more speaker voltage the speaker current must increase. So we want the bias current to increase.

What else happens? A 30% increase of supply voltage is also a 30% increase of supply current, and a 69% increase of HEAT. If it was idling at 300 Watts heat before, now it idles at 507 watts heat. If the heatsinking was just-OK before, 69% more is likely to be a hot time melt-down.
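The arithmetic behind the 69%, as a one-line check:

```python
# 30% more volts AND 30% more amps: heat scales as 1.3**2 = 1.69.
p_idle = 300.0                                       # watts, the example figure
print(f"new idle heat: {p_idle * 1.30 ** 2:.0f} W")  # -> 507 W
```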

> What happens in other circuits, which use different topologies to create a constant current source?

Other things. Use Ohm's Law carefully, and think about the whole circuit.
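As one example of those "other things": a current source set by a voltage reference holds bias nearly independent of the rails. A minimal sketch with assumed values, not from any specific design:

```python
# An LED-referenced current source (to first order): the set current comes
# from the reference and a resistor, so the rail never enters the equation.
# All three values are assumed for illustration.
V_REF, V_BE, R_SET = 1.8, 0.65, 1.15   # LED drop (V), B-E drop (V), set resistor (ohms)
for rail in (18.0, 24.0, 30.0):
    bias = (V_REF - V_BE) / R_SET       # ~1.0 A regardless of the rail
    print(f"{rail:4.1f} V rail -> bias {bias:.2f} A")
```

(In a real circuit the reference itself moves slightly with the rail, so the bias is only nearly constant.)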
 
The reason I started this post was that I wondered what happens when supply AC changes.
If DC offset and bias depend on the DC voltage (given the type of power supplies in use), they depend on the AC supply.

As the AC supply can change by quite a bit (+/- 10-15% or so?), this would mean that if you set DC offset and bias when the AC supply is at (or close to) either extreme (i.e. the lowest or highest you ever actually get from the AC company), you could end up with trouble when the AC supply swings towards the other extreme.

I found no reference to checking AC voltage when setting DC offset and bias. I thus concluded* that it could not be such a big deal (even with a potential 20-30% AC voltage difference).

So, while the builds show it should not matter, theory says it might.
Perhaps a more experienced builder or engineer could comment.

Thanks,

Albert


* Given the number of DIY amps built, it is likely that some will have set their DC offset and bias at the extreme supply AC voltages.

In case you were wondering, yes, I have worked with digital systems for the last 30 years.
 
AFAIK, European countries had +10/-15% specs for the supply, and any design should account for such a span.

In theory, Watt's and Ohm's laws apply as part of a complex multidimensional equation for how bias and DC offset change with the supply rails.

The multidimensional nonlinearity of semiconductors dictates that their parameters change with supply voltage, but circuit design may muffle or amplify these effects.

Uneven bias change in each half of a push-pull amp (due to a different supply voltage change for each half) would create DC offset.
 

PRR

This gets back to whether the designer considered voltage variation.

IMHO, voltage variation MUST be considered.

Old-time designers usually left allowance. Back in the days of tubes, the "Design Center" voltage spec might be 300V, but the "Design Max" spec might be 330V.... the Center spec covered reasonable wall-voltage variation and was used in routine designs; the Max spec was for designers who carefully considered (or measured, or regulated) supply voltage.

And design methods were mostly approximate, so it was un-wise to get too close to any limit.

OTOH, some designers trust computers and SPICE un-reasonably. I have seen posted circuits which would quit or burn if supply varied even a few percent.

> setting DC offset and bias

If the offset trim method is well chosen, 10% variations of supply have little effect. OTOH, sometimes you see a transistor offset "balanced" by a fraction of the supply voltage. These will go out of balance for a very small supply change.

> AC supply can change by quite a bit (+/- 10-15% or so?)

The real limit on what comes out of the walls, in most areas, is Incandescent Lamps. The life of a white-hot filament varies as the 13th power of voltage. A 10% rise cuts life to 1/3rd. While people do not keep track of lamp life, which is always random, they do notice if a lot of lamps die very quickly. I have complained to my utility company about steady 127V. While there has been a century-long trend from 100V to 110V to 120V, the voltage is pretty steady year to year in most utility systems, and usually much closer than 10% of nominal.
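A quick check of that rule of thumb:

```python
# Filament life varies roughly as V**-13 (the rule of thumb cited above).
print(f"life at +10% volts: {1.10 ** -13:.0%} of rated")   # ~29%, about 1/3rd
```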

In DIY, the instructions to set bias and offset without regard to line voltage assume that you are adjusting with nearly the same power you will use it at. Maybe 119V in your workroom at midnight and 118V in the music room in the evening; no big difference. When a manufacturer is setting up amps for wide distribution, they would think about worst-case wall voltages and adjust appropriately.
 
Maybe I am an old-timer, but I always look at survivability during worst case operational conditions.

In the UK, I assume that at some time during the life of my equipment it will see a 254Vac mains supply.
It must be designed to survive this without damage. It would be nice if the equipment could operate normally over the full range of 216Vac to 254Vac of mains supply.
 
OK, let's see if I understand this correctly:

Problem 1.

Assumptions:

  • the F5 is a well designed amp :cool:
  • my PSU has 63V rated caps
  • I have very, very large heatsinks (no heat issues)
  • the PSU uses a transformer with 2 x 18V secondaries
  • voltage on both secondaries is identical to within 0.1%
  • I have a spare transformer with 2 x 24V secondaries
  • voltage on both its secondaries is also identical to within 0.1%
  • the JFETs can handle 34V

If the above assumptions are true, then I should be able to drop in the transformer with the 2 x 24V secondaries without any problems.

DC offset should be OK because the voltages on the secondaries are identical to within 0.1%.
Bias should be higher than before but still OK (assuming the MOSFETs can dissipate the heat).
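A rough sanity check of the rails those secondaries would give, using the usual capacitor-input approximation (rectifier drop assumed, loading ignored):

```python
import math

# Vrail ~= Vsec * sqrt(2) - rectifier drop (~2 V assumed for a bridge).
for v_sec in (18.0, 24.0):
    rail = v_sec * math.sqrt(2) - 2.0
    print(f"{v_sec:4.1f} V secondary -> ~{rail:4.1f} V rail")
# 18V -> ~23.5V, 24V -> ~31.9V: inside the 63V cap and 34V device limits
# assumed above, though bias and heat rise as discussed earlier.
```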


Problem 2:

Assumptions:

  • AC supply at my house is 197V
  • AC supply at my friend's house is 250V
  • heatsinking is adequate to allow for the extra heat generated (ca. 60%!)

If DC offset and bias are set at my house, the amp will work correctly at my friend's house.
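A quick check of that heat figure, assuming the rails and a resistor-set bias both track the mains:

```python
# If rails and (resistor-set) bias both scale with the mains, idle heat
# scales with the mains ratio squared. House voltages from the assumptions.
ratio = (250.0 / 197.0) ** 2
print(f"idle heat at the friend's house: {ratio:.2f}x ({ratio - 1:.0%} more)")
# -> about 1.61x, i.e. the "ca. 60%" in the assumption above
```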


Comments?
 