Some basic questions about power supply design fundamentals

If there is a document I've missed that covers this, please point me to it instead. But I have some basic questions to help me clarify my understanding of power supply design choices:

1. The transformer provides the power used by the system, which means the transformer needs to be rated at least as high as the maximum draw expected from the system. If you know a system will only ever draw a maximum of 300W, is there any benefit from using a transformer greater than 300VA?

2. I understand that the filter capacitors smooth out the DC voltage and provide voltage when the draw from the system is greater than the inflow of electricity through the transformer. How does the filter cap capacity relate to the circuit that will draw on it? I.e., how do you know how many farads you need for a given circuit? And is there any benefit to adding more?

3. What are the benefits of dual-mono power supplies? I understand in principle that having a separate power circuit for each amplifier circuit has benefits, but I'm not clear on exactly what or why. Additionally, Nelson Pass describes a "dual mono" power supply with one transformer feeding two bridge rectifiers and two capacitor banks. To what extent does this achieve the benefits of dual-mono design, and what is the tradeoff of using one transformer vs. two?
 
1. It's not quite that simple, because the VA rating applies when the load is a linear resistor. A rectifier with a big capacitor draws nasty current spikes that have a higher RMS value than the DC current drawn from the output - a rule of thumb is about 60 % more with full-wave rectification. The DC voltage across the capacitor can also be higher than the nominal RMS voltage from the transformer, but not by 60 % at full load, so the extra voltage doesn't fully make up for the extra current. At very light loads it's sqrt(2) times the unloaded transformer output voltage minus the drop across the rectifier; at higher loads it will drop due to ripple voltage and losses in the transformer.

All in all, a transformer rated for 300 VA will usually be insufficient for a continuous 300 W load. You can probably get away with a smaller transformer than would be needed for a continuous 300 W load when you know for sure that the load only draws 300 W for short periods of time (much shorter than the thermal time constant of the transformer) and much less in between.
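To put rough numbers on that, here's a back-of-the-envelope sketch in Python (the 1.6 factor is just the rule of thumb above; the voltage and current figures are made-up examples, not values from this thread):

```python
# Rough transformer VA estimate for a capacitor-input supply.
# Rule of thumb from above: the winding RMS current is roughly
# 1.6x the DC output current with full-wave rectification.

V_SECONDARY_RMS = 25.0  # secondary voltage under load, volts (example value)
I_DC = 10.0             # continuous DC current drawn by the load, amps (example value)
RMS_FACTOR = 1.6        # rule-of-thumb ratio of winding RMS current to DC current

i_rms = RMS_FACTOR * I_DC              # current the winding actually has to carry
va_required = V_SECONDARY_RMS * i_rms  # minimum VA rating for continuous operation

print(f"Winding RMS current: {i_rms:.1f} A")
print(f"Minimum transformer rating: {va_required:.0f} VA, plus margin")
```

With these example numbers a load in the region of 300 W already wants a transformer of roughly 400 VA, which is why a 300 VA unit is usually not enough for a continuous 300 W draw.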

2. The capacitor gets charged by current spikes that occur 100 or 120 times per second with full-wave rectification (at 50 or 60 Hz mains); in between, the load current partly discharges it. That causes a peak-to-peak ripple of almost the time between the current peaks times the load current divided by the capacitance. The trick is to establish what ripple the circuit can handle and how much current it draws; you can then solve for the required capacitance: C ~= IDC/(2 f Vpp) for full-wave rectification, and the same expression without the factor of two in the denominator for half-wave. IDC is the DC current drawn from the output, f the mains frequency, Vpp the peak-to-peak ripple voltage and C the capacitance.
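To make that formula concrete, here's a minimal sketch of the same calculation (the load current and ripple target are made-up example values):

```python
# Required reservoir capacitance from the formula above:
#   C ~= IDC / (2 * f * Vpp)   full-wave rectification
#   C ~= IDC / (f * Vpp)       half-wave rectification

I_DC = 2.0     # DC load current, amps (example value)
F_MAINS = 50   # mains frequency, Hz (use 60 on 60 Hz mains)
V_PP = 1.0     # acceptable peak-to-peak ripple, volts (example target)

c_fullwave = I_DC / (2 * F_MAINS * V_PP)
c_halfwave = I_DC / (F_MAINS * V_PP)

print(f"Full-wave: {c_fullwave * 1e6:.0f} uF")  # 20000 uF with these numbers
print(f"Half-wave: {c_halfwave * 1e6:.0f} uF")  # 40000 uF
```

Adding more capacitance than that mainly buys you lower ripple and a bit more ride-through, at the cost of heavier charging current spikes (more on that further down the thread).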

3. You eliminate a potential crosstalk path between left and right, but I doubt it is worthwhile, as an amplifier has to have good suppression of ripple on the supply anyway.
 
Correction: "much shorter than the thermal time constant of the transformer" should be "much shorter than the thermal time constant of the transformer and short enough not to blow the mains fuse". The mains fuse should always blow before the transformer overheats, so that will be the real constraint.
 
PSU question

Hi, I've added a double-rectifier PSU diagram. Here the rectifier inputs are different: the transformer has 4 outputs, not 3. If we use a 3-output, 28-0-28 V transformer, is this double-rectifier design still usable, and do we still get the benefit of the two rectifiers? psu4.jpg
 
No, I don't see a way to use such a scheme with a centre-tapped transformer without creating a short circuit 50 or 60 times per second.

I also don't see what the advantage of this circuit is compared to one with a single bridge rectifier, except that the single bridge rectifier needs twice the voltage rating.
 
The DC voltage across the capacitor can also be higher than the nominal RMS voltage from the transformer, but not by 60 % at full load. At very light loads, it's sqrt(2) times the unloaded transformer output voltage minus the drop across the rectifier, at higher loads it will drop due to ripple voltage and losses in the transformer.

The rating on a capacitor is the maximum voltage it is rated for, not necessarily the voltage that actually comes out of it, right?

So, if my circuit calls for a 25 VDC supply voltage, and I provide 25 V capacitors and an 18 V transformer, the voltage supplied to my circuit will be sqrt(2) * 18 = 25.46 V, minus the drop across the rectifier, minus ripple voltage, minus transformer losses. Does that mean the voltage my circuit sees will always be varying slightly?
 
Yes, the rated working voltage is the maximum voltage the capacitor can handle while meeting the manufacturer's lifetime and reliability targets. You would therefore normally use 35 V or 40 V working voltage capacitors for a 25 V supply.

Assuming it's a class-B or class-AB amplifier that you want to supply, the quiescent current will be far below the maximum current. The worst-case voltage will then be close to the no-load output voltage of the transformer at maximum mains voltage times the root of two.

Assuming a transformer that gives 10 % more than the nominal voltage when it is not loaded and a +/- 10 % mains voltage tolerance, it can be almost 21 % above your 25.46 V, so close to 31 V, maybe 30 V accounting for rectifier voltage drop. A capacitor rated for 35 V or 40 V can easily handle that, a 25 V capacitor cannot.

The actual voltage will indeed vary depending on mains voltage variations, load variations and due to ripple.
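For what it's worth, the worst case above can be reproduced with a short sketch (the 10 % regulation and +10 % mains are the assumptions stated above; the roughly 0.7 V per diode is my own assumption):

```python
import math

# Worst-case lightly-loaded rail voltage for the 18 VAC example above.
V_NOMINAL_RMS = 18.0  # transformer rating under load, volts RMS
REGULATION = 0.10     # assumed rise from full load to no load
MAINS_TOL = 0.10      # assumed +10 % high mains
DIODE_DROP = 2 * 0.7  # two diodes conduct at a time in a bridge, roughly

v_peak = V_NOMINAL_RMS * (1 + REGULATION) * (1 + MAINS_TOL) * math.sqrt(2)
v_rail = v_peak - DIODE_DROP

print(f"Worst-case peak: {v_peak:.1f} V")       # about 31 V
print(f"After rectifier drop: {v_rail:.1f} V")  # about 29-30 V -> 35 V or 40 V caps
```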
 
Depends on the amplifier circuit. If it was designed to work with unregulated supplies, like most amplifier circuits are, then any competent designer will have made sure that the maximum voltage the amplifier can handle is well above the specified nominal voltage.
 
So, if my circuit calls for a 25 VDC supply voltage, and I provide 25 V capacitors and an 18 V transformer, the voltage supplied to my circuit will be sqrt(2) * 18 = 25.46 V, minus the drop across the rectifier, minus ripple voltage, minus transformer losses. Does that mean the voltage my circuit sees will always be varying slightly?

Yes, it will always vary a bit. That 18 V of your transformer is normally specified under (resistive) load. The manufacturer will also provide the regulation of the transformer; let's say it is 10%. But the rating is also given for a nominal input VAC, which can itself vary, easily by 10% up and down.

Under minimal load, your 18 VAC transformer now outputs 19.8 VAC. If it's a bad day and the incoming mains is higher than specced, you need to multiply again by 1.1, which makes 21.78 V. Now multiply by 1.414, subtract some for the diodes... 35 V caps would be nice.

On the other hand, if you draw a heavy current through a cap input supply, you'll have heavy peaks and thus less than 18V to start with. And if it's a bad day and the mains are a bit low, even less.

This is another good link to Hammond's website, with some basic formulas: Design Guide For Rectifier Use - Hammond Mfg.

edit: Marcel posted way faster than I 😛
 
Yes, if one "reads" a voltage of 25 V and your capacitor is rated at 31 V, you should be OK, but the caveat is that the cap is running at nearly its full rating, and its expected life will be somewhat short.
This is where we can become the "dog chasing its tail", so let me explain. I had read of the idea of choosing a capacitor rated well above any voltage it could ever see, and a rule of thumb was worked out that doubling the rated voltage makes the expected lifetime roughly three times longer. Extended into the absurd, one could multiply the rating by, say, 100 times, and a monster cap with an expected ONE YEAR lifetime at just shy of its rating could theoretically last 10,000 years - that's crazy. I would say a cap rated at three times whatever it will ever see is just fine.
Further, in an effort to reduce ripple, one might be tempted to keep piling on more and more capacitance, from say 7,800 uF to 40,000 uF and beyond... but as that value climbs, the near-dead-short of charging up the cap lasts longer and longer, taxing the transformer more and more. The logical solution is then a larger and larger transformer, eventually bumping up against the current limits of the wall outlet.
Overbuilding a power supply system is good, but only up to a certain point.
We don't want ten kilos of a transformer when a quarter of that will do just fine.
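To put a very rough number on the charging-spike point (a crude sketch that ignores transformer and wiring impedance, so real pulses are lower and wider, but the trend is the same; all figures are made-up examples):

```python
import math

# Rough look at how bigger reservoir caps shorten the conduction window
# and raise the charging-pulse current in a full-wave, capacitor-input supply.

V_PEAK = 30.0  # peak rectified voltage, volts (example value)
I_DC = 2.0     # DC load current, amps (example value)
F_MAINS = 50   # mains frequency, Hz
T_CHARGE = 1 / (2 * F_MAINS)  # time between charging pulses with full-wave rectification

for c in (7800e-6, 20000e-6, 40000e-6):
    v_pp = I_DC * T_CHARGE / c                # ripple, from the formula earlier in the thread
    theta = math.acos(1 - v_pp / V_PEAK)      # conduction angle, radians
    t_cond = theta / (2 * math.pi * F_MAINS)  # conduction time per half-cycle
    i_pulse = I_DC * T_CHARGE / t_cond        # average current during the pulse (true peak is higher)
    print(f"{c * 1e6:>6.0f} uF: ripple {v_pp:.2f} Vpp, "
          f"conduction {t_cond * 1e3:.2f} ms, pulse roughly {i_pulse:.0f} A")
```

The more capacitance you add, the narrower and taller the charging pulses get, which is exactly the extra stress on the transformer (and rectifier) described above.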
----------------------------------------------------------------------------Rick.....
 