Posted to alt.home.repair
trader_4
Wattage of Today's Computers vs. Older Ones

On Thursday, February 18, 2016 at 7:02:26 AM UTC-5, wrote:
On Thu, 18 Feb 2016 04:30:22 -0600, philo wrote:

On 02/18/2016 04:02 AM, wrote:
Today's computers, with their dual- or quad-core processors, lots of
RAM, dedicated video cards, high-wattage power supplies, and so on,
appear to use a lot more input wattage and amps from an outlet compared
to the common Pentium-series computers sold around the year 2000.

I am using such a computer, which was new in 2000. It has 512 MB of RAM
(the maximum it can use), came with a 100 W power supply (which died and
I replaced with a 300 W unit), and it has two fans: one in the power
supply and a tiny one on the CPU.

These modern dual/quad-core machines MUST require considerably more
wattage to run. Most have three or four fans and kick out a lot of heat
compared to the slight warmth I feel from my machine.

However, I'm looking for some actual facts about how much power is used
in this comparison (a 2015/16 computer vs. one from around 2000). I'd
like to find some actual data.

While computers are low energy users compared to many other appliances,
they are often left running around the clock, and that adds up.
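
To put rough numbers on "adds up" (the average draw and electricity rate
below are assumed figures for illustration, not measurements):

    # Rough annual cost of leaving a PC on around the clock.
    # Both inputs are assumptions for illustration, not measurements.
    AVG_DRAW_WATTS = 100   # assumed average draw of PC plus monitor
    RATE_PER_KWH = 0.12    # assumed electricity rate in $/kWh

    kwh_per_year = AVG_DRAW_WATTS * 24 * 365 / 1000
    print(f"{kwh_per_year:.0f} kWh/year, "
          f"about ${kwh_per_year * RATE_PER_KWH:.0f}/year")
    # -> 876 kWh/year, about $105/year at the assumed rate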

One saving grace is that the old CRT monitors used more power than the
modern flat-screen ones, which use either LED or fluorescent backlights
to light the screen.

Yet even when the monitor is factored into the overall picture, I
suspect that modern computers still use considerably more power, just
based on the heat they kick out. (Of course, in winter that heat adds
some warmth to the house, but in summer it will require more cooling of
the home.)





It's exactly the opposite of what one would think, but today's machines
are often considerably more power efficient.

A few years ago I built a quad-core machine for my wife and the mobo
power draw was something like 25 watts.


That does come as a surprise. I thought my 300 W power supply would
handle most anything. I probably killed the original PS by plugging in
too many hard drives and other add-ons.

But I guess I was wrong on that note, because I recently read that most
computers now have 500- or 600-watt or larger power supplies. That alone
would tell me that the new computers suck lots of power. So if your mobo
is only using 25 W, and hard drives and similar items are using about
the same as they always did, why are power supplies so much larger now?

I have not actually opened or used any computer newer than maybe 2010 or
so, but I had to help someone fix a Dell from around 2010 (dual core),
and that thing could have been used as an electric heater. The CPU alone
was hot enough to fry an egg on, because one of its four fans had died.

Maybe they have become more efficient since 2010, because that beast
must have sucked about 500 W just to play the simple solitaire that came
with Win XP (it was running XP Home), because that's about the only
thing that person did with their computer, yet it was pouring heat out
of the case like a 1500-watt space heater. (And, amazingly, it was six
or more times slower than my year-2000 Pentium made for Windows 2K.)

One of the main reasons I don't want to upgrade (besides not liking
bloated operating systems) is that my electric bill is already very
high, and I go on and off the computer daily between other work I do, so
it doesn't make much sense to turn it on and off 10 or more times each
day. So I just leave it on (except the monitor, which I shut off when
I'm not using the computer).




There are multiple factors at work here. While the CPUs have grown far
more complex and the transistor count has grown exponentially, the
feature size has continued to shrink, which is what makes that possible.
So more complexity would mean more power, but smaller transistors take
less power. That same driving force has lowered the voltage they operate
at from 5V to around 1V, which obviously is a big factor. Overall, I
think you'd find that today's similar-purpose CPUs probably take the
same or less power to run than their predecessors from 25 years ago. At
the same time, CPUs for special purposes, e.g. notebooks, have used that
technological progress to trade off some compute power for much lower
power consumption.
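
To see why the voltage drop matters so much: CMOS switching power scales
roughly with C * V^2 * f (a rule-of-thumb model, not exact figures for
any particular chip). A quick sketch with placeholder numbers:

    # Rough illustration of dynamic (switching) power: P ~ C * V^2 * f.
    # Capacitance and frequency here are placeholders, not real chip data.
    def dynamic_power(capacitance, voltage, frequency):
        """Classic CMOS switching-power estimate."""
        return capacitance * voltage**2 * frequency

    old = dynamic_power(capacitance=1.0, voltage=5.0, frequency=1.0)
    new = dynamic_power(capacitance=1.0, voltage=1.0, frequency=1.0)
    print(f"Per-transistor switching power ratio: {old / new:.0f}x")
    # -> 25x from the V^2 term alone, which is part of why far more
    #    transistors can switch without the total power exploding.

The V^2 term is only part of the story (leakage, clock gating and so on
also matter), but it shows how a big increase in transistor count and a
big decrease in voltage can roughly cancel out.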

I just checked the power supply on my 6-year-old HP i7-based PC and it's
460W. That's about the size that power supplies have been all along. But
I would think it's also possible that while they need that as a
theoretical peak, the i7 isn't using all 4 cores or running anywhere
near max power as I sit here and browse the internet or do similar. I'd
bet that today's PC typically runs at a lower actual power draw than an
old 386 system. Both probably had about the same max-size power supply,
though. PC and chip manufacturers have been working to put in
energy-saving features and make them Energy Star compliant for decades
now.
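
As a rough comparison of nameplate rating versus typical draw (every
wattage and duty cycle below is an assumption for illustration, not a
measurement of any particular machine):

    # Rated PSU capacity vs. an assumed day of light desktop use.
    # All figures are assumptions for illustration, not measurements.
    PSU_RATED_WATTS = 460           # nameplate rating of the supply

    states = {                      # assumed watts, hours per day
        "idle/browsing": (50, 8),
        "moderate load": (150, 1),
        "sleep":         (3, 15),
    }

    avg_watts = sum(w * h for w, h in states.values()) / 24
    print(f"Assumed average draw: {avg_watts:.0f} W "
          f"({avg_watts / PSU_RATED_WATTS:.0%} of the rating)")
    # -> roughly 25 W average, a small fraction of the 460 W nameplate.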

To save power, do you have the power settings set correctly in Control
Panel? I have mine set to put the display to sleep and the CPU to sleep
after 15 minutes of no activity. It wakes up in about two seconds when I
touch any key. That should save you a reasonable amount of power, with
no real downside.
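
For a rough sense of what that sleep setting is worth on an always-on
machine (the draw figures and usage hours below are assumptions, not
measurements):

    # Estimated yearly savings from letting an always-on PC sleep
    # when idle. All inputs are assumptions for illustration.
    IDLE_WATTS = 60       # assumed draw sitting idle at the desktop
    SLEEP_WATTS = 3       # assumed draw while suspended
    ACTIVE_HOURS = 4      # assumed hours/day actually in use
    RATE_PER_KWH = 0.12   # assumed electricity rate in $/kWh

    idle_hours = 24 - ACTIVE_HOURS
    kwh_saved = (IDLE_WATTS - SLEEP_WATTS) * idle_hours * 365 / 1000
    print(f"~{kwh_saved:.0f} kWh/year saved, "
          f"about ${kwh_saved * RATE_PER_KWH:.0f}/year")
    # -> ~416 kWh/year, roughly $50/year at the assumed rate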