Posted to uk.d-i-y,uk.media.tv.misc,uk.tech.digital-tv,uk.tech.broadcast
Bill R[_4_]
Subject: Switch off at the socket?

Steve T wrote: "savings are much less than the green pundits claim."

British Gas recently gave me a 'Real Time Electricity Monitor'. I plugged it
in and it typically registered about 17 watts. This was lower than I expected,
and the reading didn't move when I put on a few 100-watt lights. I suspected
the monitor was faulty and returned it for replacement. When the replacement
arrived I plugged it in and it typically registered about 33 watts - even with
a few lights on.

I spoke to someone from the 'Electricity Efficiency Team'. He tried to tell
me that the reading was an average over an hour. I pointed out that the
measurement unit was watts, not watt-hours. He then went away to the
manufacturers, who came back with the concept of 'power factor' (you can
read about it at en.wikipedia.org/wiki/AC_power): because of PF, consumers
are charged only for the power actually used, which varies with the type of
appliance and is less than the rated wattage. My schoolboy knowledge of a
watt being voltage multiplied by amperage is obviously wrong.
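For anyone else puzzling over this, here is the arithmetic the manufacturers were gesturing at, as a rough sketch. The voltage, current and power-factor figures below are made up for illustration, not measurements from the monitor:

```python
# The schoolboy formula V x I gives APPARENT power, in volt-amperes (VA).
# REAL power, in watts - what you are actually billed for - is
# V x I x cos(phi), where cos(phi) is the power factor.

voltage = 230.0        # UK mains, volts RMS
current = 0.2          # amps RMS drawn by some small appliance (illustrative)
power_factor = 0.6     # typical of an uncorrected switch-mode supply (assumed)

apparent_power = voltage * current            # volt-amperes
real_power = apparent_power * power_factor    # watts actually consumed

print(f"Apparent power: {apparent_power:.0f} VA")  # 46 VA
print(f"Real power:     {real_power:.0f} W")       # 28 W
```

So a monitor that infers watts from measured current alone, without tracking the phase, could easily read high or low compared with one that measures real power.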

I cannot get my head around the concept of power factor and, as there seems
to be no explanation for the large discrepancy between the readings of the
two monitors, the whole thing seems to be a bit of a fudge. Anyway, assuming
all this to be true, how does my consumer meter know how much electricity is
actually being used?
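As I understand it, the meter doesn't need to "know" the power factor at all: it effectively averages instantaneous voltage times instantaneous current, and that average comes out as real power automatically. A numerical sketch over one mains cycle, again with made-up figures:

```python
import math

# Average v(t) * i(t) over one full 50 Hz cycle. With the current lagging
# the voltage by phase phi, the average equals Vrms * Irms * cos(phi),
# i.e. real power - reactive current cancels out over the cycle.
V_RMS, I_RMS = 230.0, 0.2          # illustrative values
PHASE = math.acos(0.6)             # power factor 0.6 (assumed)
N = 100_000                        # samples per cycle

total = 0.0
for k in range(N):
    t = k / N                      # fraction of one cycle
    v = math.sqrt(2) * V_RMS * math.sin(2 * math.pi * t)
    i = math.sqrt(2) * I_RMS * math.sin(2 * math.pi * t - PHASE)
    total += v * i

real_power = total / N
print(f"{real_power:.1f} W")       # 27.6 W = Vrms x Irms x PF
```

That is why the billing meter and a clip-on monitor can disagree: the meter multiplies before averaging, while a cheap monitor may only measure current and assume the worst.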

Bringing this back to the previous post: if the concept of power factor
really does reduce the actual amount of power used, why are we being urged
to replace tungsten bulbs with the new low-energy bulbs? The difference in
rated wattage may greatly overstate the actual difference in consumption.
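For what it's worth, I don't think power factor rescues the tungsten bulb: the wattage printed on a lamp is already real power, the thing the meter bills for, so the saving should be genuine. A rough sketch with illustrative figures:

```python
# Rated lamp wattage is real power, so power factor doesn't change
# the comparison. All figures below are illustrative assumptions.
tungsten_w = 100        # real watts, PF close to 1.0
cfl_w = 20              # real watts for similar light output (assumed)
hours_per_day = 4
days_per_year = 365

kwh_saved = (tungsten_w - cfl_w) * hours_per_day * days_per_year / 1000.0
print(f"Energy saved per year: {kwh_saved:.1f} kWh")  # 116.8 kWh
```

A low-energy lamp may well have a poor power factor, but that affects the current it draws, not the real watts it consumes.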

Bill Ridgeway