Amplifier with 220V input voltage vs 230V?

Did you mean if you buy an amp which runs on 220V and plug that into 230V?

The issue, in my view, is with the transformers. Good designs will be able to accept all power supply variations, which can vary quite substantially. In the UK the electricity supply has been "normalised" to 230V, but in fact, because of the tolerances, the actual voltage stays at around 240V. IF a transformer has been designed to run on 220V +/- 6%, then 230V MAY just exceed its rating if the supply is a little high. In the UK the "230V" is -6%, +10%, meaning anything from 216V to 253V.

It is unfortunate that commercialisation generally means cheap, and this means that many transformers no longer have taps to adjust for a few volts up or down on the nominal supply. On the other hand, a transformer designed without taps has to be able to operate from a wider (and possibly higher) input voltage than nominal if it is to be used anywhere, which means that the output voltage may vary too. As your previous correspondent noted, this may affect valves (heater supplies running hotter), but my point is whether the transformers will work, full stop.

Unfortunately this means it is not a case of "yes or no" but "it depends on the design of the transformer"... as well as "the design of the equipment".

You might find someone who could test the unit on a variable supply and make sure that the transformer operates correctly on the highest possible supply voltage without saturating (which would need a scope and a resistor to monitor load currents, for example, and probably an isolating transformer as well...)
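
A minimal sketch of the tolerance arithmetic above, using only the figures quoted in this thread (220V +/-6% rating, UK 230V -6%/+10% mains) rather than the rating of any particular piece of equipment:

```python
# Minimal sketch of the tolerance arithmetic quoted above. The 220V +/-6% rating
# and the UK 230V -6%/+10% band are the figures from this thread, not from any
# particular amplifier's rating plate.

def voltage_range(nominal, tol_low, tol_high):
    """Return (min, max) supply voltage for a nominal value and its tolerances."""
    return nominal * (1 - tol_low), nominal * (1 + tol_high)

xfmr_min, xfmr_max = voltage_range(220, 0.06, 0.06)    # transformer rated 220V +/-6%
mains_min, mains_max = voltage_range(230, 0.06, 0.10)  # UK "230V" mains, -6%/+10%

print(f"Transformer rated range: {xfmr_min:.1f}V to {xfmr_max:.1f}V")   # ~206.8V to ~233.2V
print(f"UK mains allowed range:  {mains_min:.1f}V to {mains_max:.1f}V") # ~216.2V to ~253.0V

if mains_max > xfmr_max:
    print(f"Worst-case mains exceeds the transformer rating by {mains_max - xfmr_max:.1f}V")
```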
 
Thanks for the great input! If I wanted to play it safe, what should I purchase to prevent any problems?


 
Coconuts,
There's the problem!
You are assuming that a universal 220/240V AC transformer is designed to operate on the full range of supply voltages:
"In the UK the '230V' is -6%, +10%, meaning anything from 216V to 253V."
but you quote an example that is not universal.
All 220V equipment will work flawlessly with 230V (and up to 242V).

No transformer, no matter how well it is designed, draws a magnetising current that is simply proportional to the supply voltage; the variation of magnetising current with supply voltage is non-linear. Transformers get hot when the supply voltage is too high, i.e. they draw more current than expected. That extra current does not transfer to the output, it gets dissipated as extra heat in the core.
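
To make the saturation point concrete: for a fixed winding and mains frequency, the peak flux density in the core scales directly with the applied voltage, so running a transformer wound for 220V from a higher supply pushes the core towards (or into) saturation, which is where the magnetising current climbs steeply. The flux figures in the sketch below are illustrative assumptions only, not measured values for any real transformer:

```python
# Illustrative sketch only: peak flux density scales with applied voltage for a
# given winding and frequency (from V ≈ 4.44 * f * N * A * B_peak). The design and
# saturation flux figures below are assumed round numbers, not measured values.

design_voltage = 220.0   # V, supply voltage the transformer was wound for
design_flux = 1.5        # T, assumed design peak flux density (illustrative)
saturation_flux = 1.6    # T, assumed onset of saturation for the core material

for supply in (216, 220, 230, 242, 253):
    flux = design_flux * supply / design_voltage
    status = "OK" if flux < saturation_flux else "at/beyond the assumed saturation limit"
    print(f"{supply}V supply -> ~{flux:.2f} T peak flux ({status})")
```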
 
One approach I have seen is to use a low-voltage transformer, say 12V, which is able to work on the available mains (e.g. 230V) but can supply enough current for the load (which ought to be possible), and connect the secondary in series but out of phase with the mains, which will effectively reduce the mains voltage by 12V.

You will need to make sure that the polarity is right and that you don't add another 12V to the mains instead of subtracting!

I do not recommend this approach myself, but it has been suggested by others.
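
For what it's worth, the arithmetic of that series-opposed ("bucking") connection is simple enough to sketch; the 12V and 230V figures are just the ones from the post above, and the secondary must of course be rated for the full load current:

```python
# Sketch of the series-opposed ("bucking") transformer idea described above.
# Wired out of phase the secondary subtracts from the mains; wired the wrong way
# round it adds instead. Figures are the ones from the post, purely illustrative.

mains = 230.0      # V, available mains
secondary = 12.0   # V, low-voltage transformer secondary in series with the mains

bucking = mains - secondary    # correct polarity: secondary opposes the mains
boosting = mains + secondary   # reversed polarity: secondary adds to the mains

print(f"Series-opposed (bucking):     {bucking:.0f}V delivered to the amplifier")
print(f"Reversed polarity (boosting): {boosting:.0f}V delivered to the amplifier")
```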
 
It depends on what your ACTUAL mains voltage is. The quoted voltage is the nominal, but the spec often includes an asymmetrical tolerance,
e.g. (rounded figures):
UK: Nominal 230V -6% +10% (216V ~ 253V)
Parts of the EU: Nominal 230V -6% +6% (216V ~ 244V)
This allowed "harmonisation to 230V" without any of the countries having to change anything. Many areas/states still supply their original voltage, whether 220V or 240V.

I suggest you test your actual mains voltage on three occasions and find the mean. Include one peak-time measurement. Armed with this figure you can decide whether it is worth making any changes.
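
A small sketch of that suggestion, with made-up readings in place of real measurements (substitute your own, including one taken at peak time), compared against the tolerance bands quoted earlier in the thread:

```python
# Sketch of the measurement suggestion above. The three readings are placeholders;
# replace them with your own meter readings, one of them taken at peak time.

readings = [238.0, 241.5, 236.0]          # V: e.g. morning, evening peak, late night
mean_v = sum(readings) / len(readings)

# Tolerance bands quoted earlier in the thread (rounded).
bands = {
    "UK 230V -6%/+10%":     (230 * 0.94, 230 * 1.10),
    "EU 230V -6%/+6%":      (230 * 0.94, 230 * 1.06),
    "220V equipment +/-6%": (220 * 0.94, 220 * 1.06),
}

print(f"Mean measured mains: {mean_v:.1f}V")
for name, (lo, hi) in bands.items():
    verdict = "within" if lo <= mean_v <= hi else "outside"
    print(f"  {name}: {lo:.0f}V-{hi:.0f}V -> mean is {verdict} this band")
```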
 