Domestic mains voltage and frequency

Hi to you all. Why do some countries use a low mains voltage of about 110 volts, whereas others use the more common 220 volt supply (excluding the UK, which powers equipment at 240 volts)? Has it to do with the landscape over which high-power transformers and cables are set up, or was it an arbitrary decision when electric distribution was conceived? Also, what is the advantage of using a 60Hz mains frequency in lieu of a 50Hz one? Awaiting your comments, and thanking you for them. Regards, Revenant.
 
Higher voltages are all about lowering distribution losses (I^2R loss).
Lower voltages tend to be about human safety.
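A quick back-of-the-envelope sketch of that I^2R point (Python; the 10kW load and 0.5R line resistance are invented numbers, chosen only to show the scaling):

```python
# For a fixed power delivered over a line of fixed resistance, the line
# current falls as 1/V, so the I^2*R line loss falls as 1/V^2.
# POWER_W and LINE_OHMS are illustrative assumptions, not real data.
POWER_W = 10_000      # 10 kW delivered to the load (assumed)
LINE_OHMS = 0.5       # round-trip line resistance (assumed)

for volts in (110, 220, 240, 20_000):
    amps = POWER_W / volts          # current drawn at this voltage
    loss_w = amps ** 2 * LINE_OHMS  # power wasted heating the line
    print(f"{volts:>6} V: {amps:8.2f} A, line loss {loss_w:10.2f} W")
```

Doubling the voltage quarters the loss, which is the whole case for distributing at high voltage.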

My understanding, based on my research in the 1980s, is as follows...

240Vac just happens to be in the human electrocution sweet spot: skin conduction occurs without significant charring, allowing conduction for as long as contact is made, which increases the risk. Typical dry skin resistance is about 600R. The typical cardiac fibrillation threshold is about 30mA for 30ms with the current path through the upper body.

At 110Vac, dry skin resistance first needs to be overcome before significant current flows, and that initial current CAN be sub-lethal. Contact with wet skin is of course generally lethal.

415Vac (which you will find in 3-phase 240V schemes) will cause significant charring, which forms a useful insulator. However, tissue damage CAN be so significant as to become critical.

High voltage schemes (>1kVac) will punch through any initial dry skin resistance more or less instantly, and touching a conductor will nearly always be lethal, mostly through tissue damage.

Very High Voltage schemes (circa 30kVac) will arc to an earthed human body well before a conductor can be touched. Conduction paths will be across the surface of the skin as well as through internal tissues, both causing very significant charring.
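To put rough numbers on the shock-current argument, here is a minimal Python sketch that treats the body as a plain 600R resistor, using the figures quoted above. Real skin is strongly non-linear, so this is order-of-magnitude only:

```python
# Treat the body as a fixed 600 ohm resistor (the dry-skin figure quoted
# above) and compare the resulting current to the ~30 mA fibrillation
# threshold.  This ignores skin breakdown, charring and wet contact.
BODY_OHMS = 600          # quoted typical dry-skin resistance
FIBRILLATION_A = 0.030   # quoted ~30 mA fibrillation threshold

for volts in (110, 240, 415):
    amps = volts / BODY_OHMS
    ratio = amps / FIBRILLATION_A
    print(f"{volts:>4} V -> {amps * 1000:6.0f} mA ({ratio:5.1f}x the ~30 mA threshold)")
```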

Can't believe I was able to regurgitate all this nearly useless info after all this time.
 

PRR:
The US started electrification before there were good insulators. Tom Edison found fewer dead workers around 100V equipment than higher voltage stuff. He aimed for 100V in the users' premises. Which really meant 110V or more at the dynamos to cover line losses even on quite short lines (a mile).

Most of the rest of the world electrified later, with better insulators. Higher voltages mean thinner copper for the same power, thus lower cost.

There were also issues with incandescent lamps. A low-Watt 230V filament is so long and skinny that it breaks easily. Again, the later electrification in most of the world allowed better solutions for higher voltage.

Note that most "120V" systems get power from the street at 240V CT (centre-tapped). So there is not a large difference in the feeder cables. While this assumes that 120V loads will be spread equally on both sides of the 240V CT, the 120V loads are not usually the big consumers. Cooking and heating are done at 240V.
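A minimal sketch of how the centre-tapped feeder works out (Python; the leg currents are invented figures):

```python
# In a 120/240 V centre-tapped (split-phase) service the two 120 V legs
# are in antiphase, so the neutral carries only the *difference* between
# the leg currents.  The load currents below are invented.
leg_a_amps = 30.0   # 120 V loads on leg A (assumed)
leg_b_amps = 22.0   # 120 V loads on leg B (assumed)

neutral_amps = abs(leg_a_amps - leg_b_amps)  # only the imbalance returns
print(f"neutral current: {neutral_amps:.1f} A")  # 8.0 A, not 52 A

# A pure 240 V load (cooker, heater) sits across both legs and draws no
# neutral current at all, which is why the feeder cables stay modest.
```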

Frequency: nearly everything from 16Hz to 133Hz and even 400Hz has been used. LARGE motors want lower frequencies. The old iron for transformers favored lower frequencies. The wide use of electric lamps favored 50/60 at least to limit flicker.

On the whole there are only about two systems: US G.E. and European Siemens. Many, many other players have contributed to these systems, and there are still some significant variations even after "rationalization", but those two systems pretty much set the standards.

If you want to carry power a l-o-n-g way you do not do it at 240V. Even my 500 foot 240V line is too saggy for most folks. To carry power from the dam a dozen miles away they run 20,000V. The dam hardly runs now; the long run to the boilers is probably 66kV or 132kV.
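For a rough check on that 500-foot feeder (Python; the wire gauge and load are assumptions, with #6 AWG copper taken at roughly 0.4R per 1000 feet):

```python
# Voltage drop on a 500 ft, 240 V feeder.  Both figures are assumptions:
# #6 AWG copper is roughly 0.4 ohms per 1000 ft, and a 500 ft run is a
# 1000 ft round trip; 50 A is a guessed household peak draw.
ROUND_TRIP_OHMS = 0.4   # ~#6 AWG copper, 1000 ft total (assumed)
LOAD_AMPS = 50.0        # assumed peak load current

drop_v = LOAD_AMPS * ROUND_TRIP_OHMS
print(f"drop: {drop_v:.0f} V ({100 * drop_v / 240:.1f}% of 240 V)")  # ~20 V, ~8%
```

A drop of around 8% at peak load is exactly the kind of "saggy" behaviour described above.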
 
Revenant said:
(excluding the UK, which powers equipment at 240 volts)
The voltage used throughout Europe (including the UK) has been harmonised since January 2003 at a nominal 230V 50Hz (formerly 240V in UK, 220V in the rest of Europe).

Due to the new 'harmonised voltage limits' there was effectively no need for electricity supply companies to actually change the supply voltage.

To cope with the new limits in both the UK and the rest of Europe all modern equipment must be able to accept 230V +/-10% i.e. 207-253V.
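Spelling those limits out (a trivial Python check, which also shows why the old 220V and 240V nominals both fall inside the harmonised window, so suppliers did not actually have to change anything):

```python
# 230 V +/-10% gives the 207-253 V window quoted above.
NOMINAL_V, TOL = 230.0, 0.10
lo, hi = NOMINAL_V * (1 - TOL), NOMINAL_V * (1 + TOL)
print(f"harmonised window: {lo:.0f}-{hi:.0f} V")  # 207-253 V
for legacy in (220.0, 240.0):                     # old nominal voltages
    print(f"{legacy:.0f} V inside window: {lo <= legacy <= hi}")  # both True
```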
 
400Hz is the standard for aircraft supplies, where the alternator needs to be small, light and efficient; 3-phase 400Hz can be generated with an 8-pole 6000rpm machine. Faster rpm allows a smaller/lighter machine(*), and the life expectancy between services can be quite short.



Large alternator sets can have more poles and will run at lower rpm, being larger.


(*) torque in any electric machine scales with rotor volume, power scales with torque times rpm, so high power low mass requires high speed and thus high frequency.
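A sanity check on those machine figures, using the standard synchronous-machine relation f = poles x rpm / 120 together with the footnote's point that shaft power is torque times angular speed (Python; the 100Nm torque is an arbitrary example value):

```python
import math

def sync_freq_hz(poles: int, rpm: float) -> float:
    """Electrical frequency of a synchronous machine: f = poles * rpm / 120."""
    return poles * rpm / 120.0

print(sync_freq_hz(8, 6000))   # 400.0 Hz -> 8-pole machine at 6000 rpm
print(sync_freq_hz(12, 4000))  # 400.0 Hz -> or 12 poles at a slower 4000 rpm

# Power vs torque: for the same torque, doubling the speed doubles the
# power, which is the footnote's argument for fast, light machines.
torque_nm = 100.0                        # assumed shaft torque
for rpm in (3000, 6000):
    omega = rpm * 2 * math.pi / 60.0     # shaft speed in rad/s
    print(f"{rpm} rpm: {torque_nm * omega / 1000:.1f} kW")
```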
 
(*) torque in any electric machine scales with rotor volume, power scales with torque times rpm, so high power low mass requires high speed and thus high frequency.

Does it? Well yes, when simplified.

You won't see (or rather I don't see) too many "low mass, high speed" rotors... peripheral speed limits generally mean a long rotor of small diameter, reducing the moment of inertia (MOI).

(IIRC a typical 300MW-class GTA 2-pole rotor weighs about 100t.)

Torque in these cases, I imagine, is something to be managed and kept low enough to prevent stalling the prime mover. EDIT: in the case of an induction motor, rotor inertia and torque need to be sufficient to cover the load torque requirement, without excessive slip - less than pull out torque.

In aircraft alternators, 400Hz is used generally because the wiring requirements are reduced, reducing the weight of cabling required. At least that's what I was taught.

Other than that, the only AC 'motors' I can think of that would need torque at high speeds would be compressor motors (and magnetic bearings).

FWIW my harmonised UK AC averages out at 254V most of the time.

Though, admittedly I only test the things, not design them.
 
So let's say you were creating your own, brand-new, micro-power solar/wind/whatever system for your off-grid dream home. :) Assume equal access to (and cost of) 115V/60Hz and 230V/50Hz equipment/appliances. What would you use as your primary household power, and why?
 

PRR:
> your off-grid dream home.

Current<g> thinking is 400V of DC.

You can't store AC in a battery. AC/DC/AC conversion, even at today's prices, is costly and lossy. If talking of a "home", AC's advantage of being stepped-up for long range transmission (where Edison's DC system failed) is moot.

400Hz is for short-run systems like airplanes. 400Hz dynamos and transformers are very much smaller than 50/60Hz gear. The disadvantage is that you can't push 400Hz over miles of wire. In my 500 foot 60Hz feeder, line inductance is a minor correction, at 400Hz it would dominate resistance. But in a 100-foot airplane (the size they were when 400Hz was adopted) the inductance is still minor; still tolerable in larger aircraft.
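Rough numbers behind that inductance point (Python; the ~1uH/m loop inductance and ~0.4R round-trip resistance are order-of-magnitude assumptions for an overhead pair, not measured values):

```python
import math

LENGTH_M = 152.0   # a 500 ft run, one way
L_PER_M = 1e-6     # assumed loop inductance of the pair, H per metre
R_OHMS = 0.4       # assumed round-trip resistance

inductance_h = LENGTH_M * L_PER_M
for freq_hz in (60.0, 400.0):
    x_l = 2 * math.pi * freq_hz * inductance_h   # inductive reactance
    print(f"{freq_hz:5.0f} Hz: X_L = {x_l:.3f} ohm vs R = {R_OHMS} ohm")
# At 60 Hz the reactance is a small correction to the resistance; at
# 400 Hz it is already comparable, and over miles it would dominate.
```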
 
Me?
That's a pipe dream, haha.

Probably 240V or in that region. Off grid, it's probably easier to rectify and store the energy in battery storage. I'd guess that I^2R losses are also less using high-voltage batteries, as the current drawn through the battery resistance is less. I'm not an inverter expert, but I again guess that losses in the rectify -> DC link -> PWM chain would be less (but I could very well be wrong).
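Sketching that guess with invented numbers (Python; the cable resistance is an assumption, and losses inside the pack itself depend on how the same cells are arranged in series/parallel, so the clear-cut win is in the cabling and switchgear):

```python
# Same 5 kW load fed from a 48 V vs a 400 V battery through the same
# short cable run.  CABLE_R is an assumed figure, not a measurement.
CABLE_R = 0.01    # assumed battery-to-inverter cable resistance, ohms
LOAD_W = 5000.0   # assumed load

for pack_v in (48.0, 400.0):
    amps = LOAD_W / pack_v
    loss_w = amps ** 2 * CABLE_R
    print(f"{pack_v:5.0f} V pack: {amps:6.1f} A, cable loss {loss_w:6.1f} W")
# 48 V: ~104 A and ~109 W lost; 400 V: ~12.5 A and ~1.6 W.
```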

But then I'd separate lighting. There's no reason lighting can't be run at ELV, say 12 or 24Vdc, from part of the whole battery of storage cells.

Some incredible HVDC work is going on at the moment, back to thermionic devices, in a new-wave kind of way.
 
...you can't push 400Hz over miles of wire.
There are now high voltage DC transmission lines, used when lots of power needs to be transported a long distance.

1250 kilometres of wire is a quarter-wavelength radio antenna at 60 Hz, and so it radiates away a good chunk of the power that it's supposed to be transporting to the faraway load. The power is lost as radio waves, traveling away at the speed of light from the wire. :eek:
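Checking that figure (a one-liner in Python):

```python
C = 299_792_458.0  # speed of light, m/s
print(f"{C / 60.0 / 4 / 1000:.0f} km")  # ~1249 km quarter-wave at 60 Hz
```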

The cure is to use DC, which doesn't radiate away at all: Benefits of High-Voltage Direct Current Transmission Systems

Wikipedia has more on the subject: High-voltage direct current - Wikipedia


-Gnobuddy
 
AC versus DC - I wonder how scientifically accurate the new movie is?
 

[Attachment: The Current War.jpg]
Gnobuddy said:
1250 kilometres of wire is a quarter-wavelength radio antenna at 60 Hz, and so it radiates away a good chunk of the power that it's supposed to be transporting to the faraway load. The power is lost as radio waves, traveling away at the speed of light from the wire.
Very little radiation takes place. The wire might be a quarter-wave long, but it is very near the ground, so you have a microstrip transmission line. Losses occur in the ground, because it is a lossy dielectric and a poor conductor. Buried AC cables have greater losses because the ground is nearer, which is why DC has been used for decades for undersea links.
 
If starting from scratch, I would go DC. One major advantage of a high-voltage DC grid that is not mentioned too often, in my view, is better grid stability. In today's AC networks it is getting more and more difficult to keep things stable, because all generators powering an AC grid need to be precisely synced in frequency, which is not trivial. The big generators with hundreds of tons of rotor mass can only be accelerated or decelerated with very slow ramps. After a major power-down, starting the grid again is a major undertaking.
And in our daily use of electrical equipment, the majority of devices have switched-mode power supplies for efficiency reasons - AC first gets converted to DC, then to a higher-frequency AC, and then to DC again. Starting with DC would eliminate the first step.
 
The standard now for grid-tie systems is 600V or more DC into the inverters. I suspect GaN and other high-voltage power devices have changed everything.

Hi Scott,

I was about to link the article that mentioned "valves", but it may have been a translation error, and besides I can't find it now.

I'm aware that new (or redeveloped) thyristors are being developed, but I still find it mind-boggling how the switching device, whether solid state or vacuum, can cope with hundreds of kV!
 
AC versus DC - I wonder how scientifically accurate the new movie is?

Hahaha well that depends on which side of the Atlantic you reside!!!

(The local road is Edison Drive; this factory was built back when Westinghouse and Edison argued the toss, and tortured elephants.)

I'm sure the cinema record will convince the world that everything was discovered in America (like the telephone, a Scottish invention, or the lightbulb, another British invention).
 
Gnobuddy said:
There are now high voltage DC transmission lines, used when lots of power needs to be transported a long distance. [...] The cure is to use DC, which doesn't radiate away at all.

Yes this is what I was getting at! 400kV DC!!!!

Now I'd love to know what those switchers look like!
 