transformer VA and supply voltage?

Status
This old topic is closed. If you want to reopen this topic, contact a moderator using the "Report Post" button.
Hi,
it's been asked and argued over many times, and I cannot find a definitive answer.

A transformer specified as 230Vac to 40+40Vac @ 12.5Arms =1000VA.

If this same transformer is run from 240Vac, its output voltage at full rated AC current rises to 41.74Vac, giving an apparent 1043VA rating.
If run on 120Vac, I think the transformer becomes 522VA.
Or will the much lower supply voltage generate less wasted heat, so that the windings could then deliver slightly more current without overheating, resulting in slightly more than 522VA? How much higher?

Just what does happen to the VA rating when a transformer is run from different supply voltages?
 
Andrew,

A definitive answer to your question depends on some xformer-specific information that the end user is not going to have unless he specified the xformer in detail (core size, core material, wire gauges, wire layering, etc.) and the manufacturer executed per instruction.

However, some tradeoffs for xformer construction do lead to some rules for adjusting xformer ratings. In general a power xformer must accommodate core losses and wire losses and still keep the hot spots in the transformer from overheating the insulation or the core. Towards this end the xformer rating apportions a ratio of core heat loss to wire heat loss. Very often this ratio is one-to-one, that is, 50% of the heat budget to each contributor.

Ferrite and iron cores all have nonlinear losses with increases in peak flux density: the higher the flux density, the greater the loss, and the rate of loss itself grows per unit increase in flux density.

For sine wave excitation the typical formula for peak flux density in a xformer is as follows:

B = (Vrms * 10^8) / (4.44 * N * A * f)

Where B = flux density (gauss)
N = primary turns
A = effective core cross-section in cm^2
f = frequency in Hertz
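The formula is easy to evaluate in a few lines of Python. The numbers below (turns, core area) are hypothetical, chosen only to illustrate that halving the excitation voltage on the same winding halves the peak flux:

```python
import math

def peak_flux_density(vrms, turns, area_cm2, freq_hz):
    """Peak flux density in gauss for sine-wave excitation,
    per the formula above: B = Vrms*1e8 / (4.44*N*A*f)."""
    return vrms * 1e8 / (4.44 * turns * area_cm2 * freq_hz)

# Hypothetical 230V primary: 500 turns, 20 cm^2 core, 50 Hz.
b_230 = peak_flux_density(230, 500, 20, 50)
# Same winding at 115V: peak flux is halved.
b_115 = peak_flux_density(115, 500, 20, 50)
print(round(b_230), round(b_115))  # roughly 10360 and 5180 gauss
```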

So for the case of a 230V xformer moved to 115V service, constant excitation frequency, the peak flux will not change. Thus the core loss will be unchanged.

If the primary is a single winding then the wiring losses will be related to I^2 * R. Thus, increasing the current from the 230V rating will increase the wiring loss on the primary and the total heat budget. This will reduce reliability.

However, if the original xformer had a split primary of two windings with equal turns on each winding; then for 230V they are wired in series and for 115V they are wired in parallel. The parallel connection will sustain twice the input current with the same heat losses as the series connection. Thus, the VA of the xformer would be the same for either 115V or 230V excitation.

The above ignores the contributions from magnetization current based on the inductance of the primary. This current flows in quadrature to the heat losses and a resistive load current. However, this current can be 30-50% of the rated load current. Note, for reactive loads this becomes a much more complicated analytical problem.

In the case of a single primary winding, the magnetization current will be halved going from 230V to 115V service. By suitable vector arithmetic, and by measuring the inductance or the no-load primary current, an increase above the derated 115V VA can be allocated by equalizing the heat losses for the 230V load plus magnetization current to the 115V conditions.

For the split primary conditions the same process could be used, but note the 115V primary inductance will be 0.25 the 230V case. This comes from noting that inductance is proportional to the square of the turns. The net of the turns and excitation changes will be a doubling of the magnetization current, but split across two windings in parallel. Thus, no change from the 230V case for winding heat losses.
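The scaling argument above can be checked numerically. The 10 H series-primary inductance is an assumed, illustrative value; the point is only the ratio:

```python
import math

# Sketch of the split-primary scaling above, with an assumed 230V
# series-connection inductance of 10 H at 50 Hz (illustrative only).
f = 50.0
l_series = 10.0                                  # full turns N, series
i_mag_230 = 230 / (2 * math.pi * f * l_series)   # magnetization current

# Parallel connection has N/2 effective turns; L scales with turns^2,
# so the 115V inductance is one quarter of the series value.
l_parallel = l_series / 4
i_mag_115 = 115 / (2 * math.pi * f * l_parallel)

print(round(i_mag_115 / i_mag_230, 2))  # magnetization current doubles
```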

Final warning. For changes in both voltage and frequency, the peak flux formula may be optimistic. In particular transformers rated for 60Hz service may suffer more than the formula suggests when used for 50Hz service. The reverse is benign. Also, core materials used for 50 or 60Hz will work happily at say 400Hz, but parasitic effects may become overly important (like leakage inductance).

I hope this helps you think about this problem,
VSR
 
I think it changes in inverse proportion to the output voltage. You have to treat a transformer as a constant current device, since the limit is really keeping the core out of flux saturation; and the induced (output) voltage is proportional to L*dI/dt.

So VA stays constant if you swap between 240V or 120V input by arranging the primaries in series or parallel, but the current output rating should remain unchanged, limited by flux in the core. Copper and iron losses will be pretty small and are really only dictated by the allowable temperature rise in the transformer. I doubt that the difference between 230V and 240V has a significant effect unless you are running the transformer right at its limit: it's a 6% change in output voltage, so a 12% maximum change in dissipation at full load (resistive: 1.06V * 1.06I).

edit: beaten to the punch by VSR - interesting, thanks.
 
Hi,
thanks for that.
Seems the process is quite complicated, no wonder previous enquirers got conflicting views.

Anyone else want to offer their opinion/calculation?

By the way, I am not asking about what happens when one parallels/series a two-primary transformer to suit its alternative rated input voltages.
 
An error in my explanation!

Andrew,

In my response to your question I made the following statement:

>"So for the case of a 230V xformer moved to 115V service, constant excitation frequency, the peak flux will not change. Thus the core loss will be unchanged."

This is only true for the split primary case as the reduction in excitation is balanced by the reduction in the turns on the primary.

For the single primary case this is not true as the turns on the primary are unchanged. Thus the peak flux will be reduced by 0.5 for 115V excitation. What can be done with this depends on whether the heat budget is split 50-50 or otherwise. The nonlinearity of the core loss with peak flux density should guarantee that the heat loss has been reduced by at least 50% for the core portion of the heat dissipation. But, without knowing what the transformer designer engineered, there is some risk in assuming a 50-50 split and reallocating the saving back to increasing I^2 * R up to the supposed total heat budget (that is preserve the original total power dissipation by allowing resistive losses to increase exactly as much as the core loss has fallen).
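One of those hand-waving exercises can at least be sketched numerically. Everything here is assumed: the 50-50 heat budget split discussed above, and a Steinmetz-style core loss proportional to B^n with n = 2.5, a typical-ish exponent for laminated steel that cannot be known without the actual core data:

```python
# Illustrative only: assumes the 50-50 heat budget split and a core
# loss proportional to B^2.5 (an assumed Steinmetz exponent). Ignores
# the hot-spot redistribution question entirely.
n = 2.5
rated_va_230 = 1000.0

core_loss_fraction = 0.5 ** n           # core loss at half flux, relative
freed = 0.5 * (1 - core_loss_fraction)  # share of total budget freed up
copper_budget_ratio = (0.5 + freed) / 0.5
current_multiplier = copper_budget_ratio ** 0.5  # copper loss ~ I^2

# Naive 115V rating is half the VA; reallocating the freed core-loss
# budget to copper loss permits somewhat more load current.
va_115 = 0.5 * rated_va_230 * current_multiplier
print(round(va_115))  # somewhere above the naive 500VA figure
```

With these (unverifiable) assumptions the single-primary 115V rating lands well above the naive 50% derating, but the exact figure moves with the assumed exponent and budget split.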

You clearly have some rationale for increasing the VA at 115V from a 50% discount. But how much, and how to justify it without having a very detailed xformer design?

There is also some heat dissipation engineering. The core has reduced heat losses, but we want to raise the wire losses. How does this change the hot spot temperature for the core and the wire?

I believe there is more than enough for several hand waving exercises here.

Sorry for the error in my post.
VSR
 
Hi,
thanks again.

I was surprised at your suggestion that the transformer designer would choose to make the heat losses in the copper and the iron near 50:50 split.
I had imagined that the iron loss was a lot less.
Firstly, since the iron has no cooling surface (in a toroid) it will cook the copper from the inside.
Secondly, most transformers now run the current at up to 3.1A/sqmm rather than 1.6A/sqmm. This increases the I^2R losses and worsens the regulation (this seems borne out by modern toroids having less copper and higher regulation than cheap but older toroids).
Thirdly, since toroids are so much more efficient and this efficiency comes from the improved core structure, I thought the core losses would be lower.

If someone were to say that core loss~=primary copper loss~= secondary copper loss then I would think that may be closer to my own thoughts. I suspect (want to believe) that iron loss is even lower.

Is there a way to measure the idle current and use this to estimate the iron loss when under load?
 
Hi

It is true that the best efficiency compromise is reached when copper and iron losses are balanced, but in the case of a 50/60Hz transformer it isn't a practical option: this would require a huge amount of copper, and copper is more expensive than iron...
This was sometimes done, however: in vintage equipment of the '20s and '30s, you could sometimes find bizarre transformers with huge windings and a skinny magnetic circuit.
LV
 
VAC x 1.414 - 2 = VDC, e.g. 20Vac = 26.28Vdc

This assumes approximately 1 volt drop for each diode given there will be 2 in series in a bridge rectifier at any given time. Also assumes the transformer has zero impedance. In practice the peaks of the AC voltage will be flattened a little leading to lower DC voltage under load.
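The rule of thumb above is a one-liner to compute; the 20Vac figure reproduces the example given:

```python
import math

def unloaded_dc_estimate(vac_rms, diode_drop=1.0):
    """Rule-of-thumb no-load DC estimate for a bridge rectifier with a
    reservoir cap: Vdc = Vac*sqrt(2) - 2*Vdiode, per the post above."""
    return vac_rms * math.sqrt(2) - 2 * diode_drop

print(round(unloaded_dc_estimate(20.0), 2))  # 26.28, the example figure
```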

Another thing - if you run a 230v transformer at 115vac you will =never= get the inrush current problem at switch-on that you otherwise get when the transformer saturates if you happen to turn on at close to the zero crossing of the AC waveform.
 
puginfo,

If your question is: what voltage to expect at the output of a capacitor-input filter fed from a fullwave bridge rectifier? Then the answer needs some qualification and is often computed by simulating the circuit components. To crank out an answer by hand, one needs to decide whether the output capacitor ripple is going to be high (say 20% or more) or low (say 5% or less). For audio circuit use the answer is most likely low ripple. For the intermediate ripple condition I don't have a good way to estimate except via simulation.

For Low Ripple:

Let Circlotron's VDC equal Vpk (I would use a Vdiode of less than 1V/diode, however. Try 0.6V, as in most audio circuits the rectifier diodes are vastly over-rated compared to commercial practices). That is Vpk = (Vrms*(2^0.5))-(2*Vd).
Note, Vrms should, ideally, be the measured no-load transformer output, not the manufacturer's Vrms rating.

Let Re be the equivalent resistance of the secondary taking into account the secondary winding resistance, primary winding resistance and misc. primary circuit resistances (fuse, common-mode choke, etc.). The primary resistance needs to be reflected to the secondary via the turns ratio squared (for step-down the result will be to lower the primary resistance) and sum it with the secondary resistance. Or you can use an equivalent xformer circuit and derive a more exact voltage divider expression (probably not worth the effort for this estimate).

Let Iload be your desired output current.

Then guess an output voltage: Vg (say 10-20% less than Vpk)

Calculate Ag = arcsin(Vg/Vpk), in degrees
Calculate Ad = 2*(90-Ag)
Calculate Dc = Ad/180
Calculate Ig = Iload*((1/Dc)+1)
Calculate Vout = Vpk-Ig*Re

Compare Vout to Vg. If within 0.1V stop; otherwise use Vg = Vout and repeat. You should have an estimate in 2 to 4 iterations.

If you use a fullwave center-tapped configuration, then calculate Vpk as 0.5 times the full-winding value (full winding equals both halves summed, ignoring the center tap, and retain 2 Vd) as above.

If you use halfwave rectification then make Dc = Ad/360, and make sure you have enough capacitance to qualify as low ripple.
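The iteration above is easy to run in a few lines of Python. The input numbers in the example call (winding voltage, equivalent resistance, load current) are hypothetical, chosen only to show the procedure converging:

```python
import math

def bridge_output_voltage(vrms, re, iload, vdiode=0.6, tol=0.1, max_iter=20):
    """Iteratively estimate the loaded DC output of a fullwave bridge
    rectifier with a capacitor-input filter (low-ripple assumption),
    following the steps in the post above."""
    vpk = vrms * math.sqrt(2) - 2 * vdiode      # peak less two diode drops
    vg = 0.85 * vpk                             # initial guess, ~15% below peak
    for _ in range(max_iter):
        ag = math.degrees(math.asin(vg / vpk))  # angle where conduction starts
        ad = 2 * (90 - ag)                      # conduction angle, degrees
        dc = ad / 180                           # duty cycle (fullwave)
        ig = iload * ((1 / dc) + 1)             # estimated winding current
        vout = vpk - ig * re                    # drop across equivalent resistance
        if abs(vout - vg) < tol:
            break
        vg = vout                               # feed the estimate back in
    return vout

# Hypothetical example: 25Vrms winding, 0.5 ohm equivalent resistance,
# 1A load. Converges in a handful of iterations.
print(round(bridge_output_voltage(25.0, 0.5, 1.0), 2))
```

As the post notes, for a halfwave rectifier you would change the duty-cycle line to `dc = ad / 360`.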

I hope this answers your question.
VSR
 