Andrew Gabriel
 

In article ,
Capitol writes:
> Both the US and British voltages are a result of history. The British
> started with 100V IIRC (100V was picked by Edison as it is a nice
> round number!), then moved up to 220V DC as it was cheaper to distribute
> power at the higher voltage. Parts of Camden Town in London were only


Most of the early generation was at 100V. It suited arc lamps
and the filament lamps of the time. Trouble was it wouldn't
reach further than a few city blocks. I don't think there's
any real consensus on why many of the UK power stations switched
to 200V and the US didn't, but one reason which does seem to
have more credibility than others is as follows. Very early
on, many London boroughs banned overhead cables (and those
bans are still in effect today). This forced the cabling to be
underground in much of London, but it was mostly on poles in
the US at the time. Now what do you do when you have a local
supply infrastructure that no longer meets the local demand?
In the US it was easy: you just string thicker cables on the
poles. In London, with the cables all being underground, this
wasn't possible without doing a complete new installation.
So the way round the problem was to double the voltage, and
we got lots of 200V supplies. Over the years, the voltage has
continued to creep up in small increments. By the late 1950s,
most of the UK mains final supply voltage was in the 200-250VAC
range, depending on what town you lived in, and it was all changed
to 240VAC in the early 1960s. Some areas with lots of cinemas
also retained DC supplies for some commercial customers -- much
of central London had a 220VDC supply until the late 1970s.
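
To put rough numbers on that doubling trick, here's a quick Python
sketch (my own illustrative figures, not real cable data), assuming a
fixed cable resistance and a fixed thermal current limit:

# Same buried cable (same resistance, same current limit): doubling
# the voltage doubles the power delivered while the I^2*R loss in
# the cable stays put.
CABLE_RESISTANCE = 0.1  # ohms round-trip -- assumed figure
MAX_CURRENT = 100.0     # amps, cable thermal limit -- assumed figure

for volts in (100.0, 200.0):
    delivered = volts * MAX_CURRENT              # watts at the limit
    loss = MAX_CURRENT ** 2 * CABLE_RESISTANCE   # watts lost in cable
    print(f"{volts:.0f}V: {delivered/1e3:.0f}kW delivered, "
          f"{loss/1e3:.1f}kW lost ({loss/delivered:.0%})")

Same copper, twice the power through it.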

> converted from this in 1958. When the power supply world moved on, in
> the UK, AC was introduced and AIUI the available transformer stampings
> were optimal at 50Hz. The US came in with AC a bit later IIRC, and by
> then the laminations would sustain 60Hz, reducing the transformer size
> (and cost), and centre tapping the 230V allowed the use of the old
> distribution circuits and products without upgrading. If we were
> starting again today the world would probably settle for 230V @ 400Hz,
> giving smaller (and cheaper) transformers without significantly
> increased losses.
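
The size saving claimed there does follow from the standard transformer
EMF equation, V = 4.44 * f * N * A * B: for a fixed voltage, turns count
and peak flux density, the core cross-section scales as 1/f. A quick
Python sketch with round assumed numbers (not real design values):

# Core area needed to support 230V at various frequencies, from
# V_rms = 4.44 * f * N * A * B_max. Turns and flux density are
# assumed round figures, not a real design.
V_RMS = 230.0   # volts
TURNS = 500     # primary turns -- assumed
B_MAX = 1.2     # tesla, roughly silicon steel's limit -- assumed

for freq in (50.0, 60.0, 400.0):
    area = V_RMS / (4.44 * freq * TURNS * B_MAX)  # square metres
    print(f"{freq:>3.0f}Hz: core area {area * 1e4:.1f} cm^2")

So a 400Hz core needs roughly an eighth of the iron of a 50Hz one.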


400Hz severely reduces the maximum area of a synchronisation zone,
which makes carrying power any distance very much more expensive.
Many of the 50Hz zones are close to their size limit now, so I
don't think anyone would think of distributing at any higher
frequency in Europe. It would be OK on a small isolated island.
400Hz isn't suitable for industrial motors either. Actually, supplies
to large commercial customers at 16 2/3 Hz and 25Hz used to be
quite common, as they much preferred a lower frequency for large motors.
Transformer size is really only an issue on planes and boats, which
often do use 400Hz.
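
For a feel of the synchronisation-zone problem: the electrical
wavelength, which bounds how far you can carry power before the phase
angles across the zone become unmanageable, scales as 1/f. A
back-of-envelope Python sketch (the propagation speed is an assumed
round figure, a bit under light speed):

# Electrical wavelength at two grid frequencies -- the yardstick
# that limits how large a synchronisation zone can grow.
PROPAGATION = 2.5e8  # m/s on overhead lines -- assumed figure

for freq in (50.0, 400.0):
    wavelength_km = PROPAGATION / freq / 1e3
    print(f"{freq:>3.0f}Hz: wavelength ~{wavelength_km:,.0f}km")

At 400Hz the yardstick shrinks by a factor of eight -- fine for an
aircraft, hopeless for a continent.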

--
Andrew Gabriel