Posted to uk.telecom.mobile,uk.d-i-y,alt.sci.physics
lyttlec
Southern Electric say each mobile phone charger uses 100 kW hours a DAY!

Bob Eager wrote:
On Thu, 15 Feb 2007 00:03:41 UTC, lyttlec wrote:

0.8w*24hr/day*300,000,000 TVs = 5,760,000,000kw-hr/day in TVs only.
Now add in cell phone chargers, computers, and stereos. All that energy
wasted so we save 1 sec when we want to get our daily dose of
mind-numbing drivel.
You're another one who likes to distort the truth, then? You might make
it a bit more convincing by at least trying to do the calculation
correctly.

0.8W * 24hr/day = 19.2 watt-hours per day for one TV. Multiply that by
300,000,000 (where did you get the 5 TVs for every man, woman and child
in the country from? But we'll leave that one) and you get 5,760,000
kWh per day.

So, you're three orders of magnitude out (a factor of 1000, in other
words). That's using a false assumption of 5 TVs each for everyone in
the UK; perhaps you meant some bigger area, but who knows? You
conveniently left that out.
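
For anyone checking the units, here is the same sum in a few lines of
Python (the 300,000,000 TV count is the assumption under dispute):

    standby_w = 0.8                        # standby draw per TV, watts
    tvs = 300_000_000                      # the disputed TV count
    wh_per_day = standby_w * 24 * tvs      # watt-hours per day
    print(wh_per_day / 1000)               # 5,760,000 kWh/day, not 5,760,000,000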

Add in the fact that for a lot of that 24 hours the TV would be on
anyway. Then also the fact that many people turn the TV off when asleep
or not at home.

Greenwash. And highly unconvincing greenwash.

OK, I did type kw-hr when I meant w-hr. My bad.


All very well, but it makes a hell of a difference. "Sorry, guv, made a
mistake. But it's really bad anyway".

OTOH, I should have used 600,000,000 for the number of TVs. The current
estimate for the US population is just over 300,000,000 and we have
closer to two TVs per person.


So, you're in the USA; I suspected as much. But try to make it clear;
this is a UK newsgroup after all.

I assume you still get billed for on-peak vs. off-peak power? I once
built a wind power system for a friend of mine in the UK. It cut his peak
power usage charges in half.
In the US, I built one for a dairy farm that cut his peak power by 10%
and his total bill by over 80%. His total monthly rate was determined by
a 20-minute peak. Some simple changes to his local wiring (so he
couldn't start the welder in his shop while the silage augers were
running) and adding a wind power system to pump and pre-heat water were
all it took.
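
That kind of tariff is easy to model. A hypothetical sketch in Python of a
demand charge keyed to the single worst 20-minute window of metered demand
(all the numbers are made up):

    # Monthly demand charge set by the highest 20-minute average demand.
    def peak_20min_kw(readings_kw, interval_min=5):
        window = 20 // interval_min        # metering intervals per window
        return max(sum(readings_kw[i:i + window]) / window
                   for i in range(len(readings_kw) - window + 1))

    readings = [12, 14, 55, 60, 58, 15, 13]  # welder + augers spike the middle
    print(peak_20min_kw(readings))           # 47.0 kW; the spike sets the bill
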
Two TVs for every man, woman, child and baby? Well, we always knew the
Yanks were wasteful.

The Brits aren't any better. There are just fewer of them. To your
credit, power rates there do encourage a bit more conservation.
And as for pollution: can we say "Bush - Kyoto"?

Both suck. Ignore anything any government says. The real solution is
simple: turn it off, and ask yourself if you ever need to turn it on again.
I know several families of three that have one
in each bedroom, one in the kitchen, one in the living room, one in the
den. Even if no one ever watches the one in the guest room, it is still
using 0.8 W.


If it's switched on at all. My experience is that yes, TVs get left on
standby. But not 24/7. At least not in the UK. Typical day: the TV is
switched on when we get up. It goes off when I leave for work (I'm last
out). Goes on when the kids come in. On and off during the evening,
possibly on standby sometimes. Total time on standby is probably less than
an hour or two each day. You can't count the time when it's actually
being watched, unless you are bent on distorting the figures.

As long as it is plugged into the wall, it is on standby. The on-off
switch no longer turns the set off. On one Philips TV series I worked on,
the on-off switch simply blanked the video and sound. Everything else
was powered up. Power use in "on" and "off" was almost exactly the same.
So the correct number is 11,520,000 watt-hours/day.


No; 11,520,000 kWh/day (11,520 MWh/day). Do try to keep up.

Or approximately 1000 MWh per day, in reality (dividing by 12 since it's
really about 2 hours/day on average; some will be more, some less).
Actually, that's probably a high estimate; I can't see very many people
leaving (say) the guest room TV on when there is no guest. And are there
*really* that many - two for everyone?
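
Spelled out, that scaling step is (with 2 hours/day on standby being an
estimate from the thread, not a measured figure):

    full_day_mwh = 11_520        # 0.8 W * 24 h * 600,000,000 TVs, in MWh/day
    standby_hours = 2            # assumed average daily time on standby
    print(full_day_mwh * standby_hours / 24)   # ~960, i.e. roughly 1000 MWh/day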

Yes, two for everyone. Counting the ones at home, in the bars (the
coffee shop where I am now has two running), airports, the displays
running in the stores, etc., etc.
That's for the USA. For the UK (this is a UK newsgroup) it's about a
fifth of that. So, 200 MWh per day. Worst case - I doubt we have as many
as the USA. Be generous and say half. So a (probably high) figure is
100 MWh per day.

100 MWh/day is still a sizable power plant.

Our last electricity bill shows we used 30 kWh/day; we might use a bit
more than most, but let's say on average it's a third of that. So,
10,000 homes' worth of electricity. Out of probably about 20 million,
that's not a lot. Even less if you count industry, which is very power
hungry.
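
That whole chain of estimates, in one place (every number here is a guess
from the thread, not a measurement):

    uk_mwh = 1000 / 5 / 2        # US figure scaled to UK size, then halved: 100 MWh/day
    home_kwh = 30 / 3            # a third of the poster's own 30 kWh/day bill
    print(uk_mwh * 1000 / home_kwh)   # MWh -> kWh; ~10,000 homes' worth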

So it's OK to be wasteful because someone else is worse? You can throw
your Starbucks cup on the street because it's only one cup?

So, just scare-mongering figures really.

Not scare-mongering, just shaming you. I spend one day a week picking up
trash out of the nice creek that crosses my farm. I pick up over a hundred
pounds per month of plastic soda bottles, foam cups, and plastic bags from
the grocer. I carry cloth bags to the grocer, which confuses the clerks
no end. Even though I avoid plastic as much as possible (only buying
beer in aluminum cans or re-usable kegs), I calculated that I still use
the equivalent of one barrel of oil per year in plastic. That translates
to almost a million barrels per day here. Based on my last visit to the
UK (just before 9/11), you aren't all that much better there.

That
translates to about 3,417,600 pounds (1,709 tons) of coal/year, 24,000
pounds of SO2, and 1,600 tons of CO2 (coal is mostly carbon).


Actually, it's a damn sight more CO2 than that, since there isn't a
one-to-one relation.

Correct, but you get the picture. Carbon + O2 weighs more than the coal
(carbon + ash + sulfur + ...).
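
The ratio is easy to pin down: burning carbon is C + O2 -> CO2, so every
12 g of carbon leaves the stack as 44 g of CO2. A quick check (the 80%
carbon fraction is an illustrative assumption, not a figure from the
thread):

    coal_tons = 1709          # the coal figure quoted above
    carbon_fraction = 0.8     # assumed carbon content of the coal
    co2_tons = coal_tons * carbon_fraction * 44 / 12   # CO2/C molar-mass ratio
    print(co2_tons)           # ~5,013 tons: well above the coal's own weight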

Have you ever noticed that during a blackout, the power sort of fades out
rather than going off all at once? That's because of all the wall warts and
TV sets discharging into the grid.


I'd like to hear a technical justification for that. I've not seen it, and
I guess it depends on what caused the blackout anyway.

Sure. In the past the biggest contributor was electric clocks, which
change from motors to generators when the grid is lost. Now it is energy
stored in power supplies. Wall warts have transformers that store energy
that gets dumped back into the grid when grid power is lost. Anything
with a power supply does the same. The per-unit energy is less now that we
use switching supplies rather than the older inductor (transformer) based
designs, but we have lots more units.
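
For a sense of scale, a back-of-the-envelope sketch of the energy a single
wall wart can hold at the moment the grid drops (every component value here
is invented for illustration):

    # Rough stored energy in a small linear wall wart when power is lost.
    L = 10.0        # magnetizing inductance, henries (assumed)
    i_pk = 0.02     # peak magnetizing current, amps (assumed)
    C = 1000e-6     # output filter capacitor, farads (assumed)
    v = 12.0        # capacitor voltage, volts (assumed)
    e_total = 0.5 * L * i_pk**2 + 0.5 * C * v**2   # joules
    print(e_total)  # ~0.07 J per unit; tiny, but multiplied by millions of units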

Look up inductors and capacitors in an elementary circuit analysis text.
One exercise will probably be to build an R-L-C circuit and measure the
voltage across the elements when power is connected and disconnected.
Another thing to try is to start up an AC motor and monitor the voltage
across the power terminals when the power is disconnected.
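
If you'd rather simulate than solder, here is a minimal numerical sketch of
a series R-L-C circuit ringing down after the source is removed (component
values chosen arbitrarily):

    # Series R-L-C discharge: capacitor starts charged, source disconnected.
    R, L, C = 10.0, 0.1, 100e-6   # ohms, henries, farads (arbitrary)
    v_c, i = 10.0, 0.0            # initial capacitor voltage and loop current
    dt = 1e-5                     # time step, seconds
    for step in range(2001):
        di = (v_c - R * i) / L * dt   # KVL around the loop
        dv = -i / C * dt              # capacitor discharges through the loop
        i += di
        v_c += dv
        if step % 400 == 0:
            print(f"t={step * dt * 1000:4.1f} ms  v_c={v_c:+6.2f} V")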