Capitol
 



Jim Michaels wrote:

Somewhere in this thread!

UK: Ten 100 W lamps at 240 V draw 4.17 A on a circuit rated at 6 A with
1 mm² wire.

US: Five 100 W lamps at 120 V draw 4.17 A on a circuit rated at 15 A
with 14 gauge (2.08 mm²) wire.

In this example the US system has a massively greater safety margin.


This is comparing apples with plums!

Try 10 x 100 W at 240 V against 10 x 100 W at 120 V. The currents are
4.2 A and 8.3 A. The I²R losses are then in the ratio of roughly
17.4:33.4, even allowing for the greater cable area at 120 V. The US has
to use thicker cables to compensate for the higher currents, but in this
example the temperature rise in the US cables for the same load will
still be nearly double, and worse in practice, since the temperature
coefficient of copper's resistivity is positive. That is particularly
true in the South, with its much higher daily temperatures. I believe
that UL-listed cables use higher-temperature-rated insulation than UK
cables; perhaps someone knows this?
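For anyone who wants to check the arithmetic, here is a quick sketch in
Python. The figures assumed are the ones from this thread: a 1000 W lamp
load, 1.0 mm² UK cable, and 14 AWG = 2.08 mm² US cable.

```python
# Quick check of the I^2*R comparison: 1000 W of lamps on
# UK 1.0 mm^2 cable at 240 V vs US 14 AWG (2.08 mm^2) at 120 V.
P = 1000.0          # total lamp load, watts

I_uk = P / 240.0    # ~4.17 A
I_us = P / 120.0    # ~8.33 A

A_uk = 1.0          # conductor cross-section, mm^2
A_us = 2.08

# Loss per unit length is I^2 * rho / A; resistivity rho cancels
# in the ratio, so compare I^2 / A directly.
loss_uk = I_uk**2 / A_uk
loss_us = I_us**2 / A_us

print(round(loss_uk, 1), round(loss_us, 1))  # -> 17.4 33.4
print(round(loss_us / loss_uk, 2))           # -> 1.92
```

So on these assumptions the US cable dissipates close to twice the heat
per metre for the same load, before the positive temperature coefficient
of copper makes it any worse.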

Incidentally, my pocket reference for US wiring specifies 12 AWG
(3.3 mm²) as the minimum required size for US homes today (NEC code),
and the maximum voltage drop as 2%. For the example above, a likely wire
gauge would be 9 AWG (6.6 mm²)! With only radial wiring, the cost
penalty is phenomenal!
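The 2% voltage-drop sizing can be checked the same way. The run length
below is my own assumption (the post gives none, and the answer depends
entirely on it); 0.0175 Ω·mm²/m is the usual handbook resistivity for
copper at 20 °C.

```python
# Rough check of the 2% voltage-drop sizing at 120 V.
RHO = 0.0175        # copper resistivity, ohm * mm^2 / m (20 C)
V = 120.0           # circuit voltage
I = 8.33            # load current, amps (10 x 100 W lamps)
run_m = 50.0        # one-way cable run in metres (ASSUMED)

drop_limit = 0.02 * V                       # 2.4 V allowed
# Round-trip drop: dV = 2 * run * I * rho / A  ->  solve for A
A_min = 2 * run_m * I * RHO / drop_limit
print(round(A_min, 1))                      # -> 6.1  (mm^2 needed)
```

On a run of around 50 m the 2% limit already demands about 6 mm² of
copper at 8.3 A, which is in line with the 9 AWG (6.6 mm²) figure
quoted above.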

Just a few comments.

Regards
Capitol