Home Repair (alt.home.repair) For all homeowners and DIYers with many experienced tradesmen. Solve your toughest home fix-it problems.

#1   Posted to alt.home.repair

Wattage of Today's Computers vs. Older Ones

Today's computers, with their dual or quad core processors, lots of RAM, extended video cards, high wattage power supplies and so on, appear to draw a lot more wattage and amps from an outlet than the common Pentium-class computers sold around the year 2000.

I am using such a computer, which was new in 2000. It has 512 MB of RAM (the maximum it can use), came with a 100W power supply (which died and I replaced with a 300W), and it has two fans, one in the power supply and a tiny one on the CPU.

These modern dual/quad core machines MUST require considerably more wattage to run. Most have 3 or 4 fans and kick out a lot of heat, compared to the slight warmth I feel from my fan.

However, I'm looking for some actual facts about how much power is used in this comparison (a 2015/16 computer vs. one from around 2000). I'd like to find some actual data.

While computers are low energy users compared to many other appliances,
they are often left running around the clock, and that adds up.
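To put a rough number on how it adds up, here is a quick sketch (the wattages and the $0.13/kWh electricity rate are illustrative assumptions, not measurements):

```python
# Rough yearly energy and cost of a PC left running 24/7.
# Wattages and the $0.13/kWh rate are illustrative assumptions.
def annual_kwh(watts, hours=24 * 365):
    return watts * hours / 1000.0

def annual_cost(watts, dollars_per_kwh=0.13):
    return annual_kwh(watts) * dollars_per_kwh

for watts in (60, 90, 150):
    print(f"{watts:>4} W around the clock: "
          f"{annual_kwh(watts):6.0f} kWh/yr, ${annual_cost(watts):6.2f}/yr")
```

At a steady 60 W that works out to roughly 526 kWh, about $68 a year at the assumed rate.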

One saving grace is that the old CRT monitors used more power than the modern flat screens, which use either LEDs or fluorescent tubes to light the screen.

Yet, even when the monitor is figured into the overall picture, I suspect that modern computers still use considerably more power, just based on the heat they kick out. (Of course in winter that heat adds some warmth to the house, but in summer it will require more cooling of the home.)


#2

On 02/18/2016 04:02 AM, wrote:
snip

It's exactly the opposite of what one would think, but today's machines are often considerably more power efficient.

A few years ago I built a quad core machine for my wife and the mobo power draw was something like 25 watts.
#3

philo wrote:
...
It's exactly the opposite of what one would think but todays machines
are often considerably more power efficient.

A few years ago I built a quad core machine for my wife and the mobo
power draw was something like 25 watts


the higher end processors get up to around double or triple
that, but i think the most power used these days is in the
graphics boards that some people want for doing gaming and
number crunching or bitcoin mining.

so if you really want to have a very efficient computer go with on board graphics and don't expect it to scream, but i think it would still get the job done for most people doing internet, e-mail, watching a dvd/blu-ray, listening to some music, etc.

as i'm looking to eventually replace this beast i'll be
doing much the same evaluation. i'd like it to be as
energy efficient as possible and also quiet would be nice.


songbird
#4

On Thu, 18 Feb 2016 04:30:22 -0600, philo wrote:

snip
It's exactly the opposite of what one would think but todays machines
are often considerably more power efficient.

A few years ago I built a quad core machine for my wife and the mobo
power draw was something like 25 watts


That does come as a surprise. I thought my 300W power supply would handle most anything. I probably killed the original PS by plugging in too many hard drives and other add-ons.

But I guess I was wrong on that note, because I recently read that most
computers now have 500 or 600 watt or larger power supplies. That alone
would tell me that the new computers suck lots of power. So if your mobo
is only using 25W and hard drives and similar items are using about the
same as they always did, why are power supplies so much larger now?

I have not actually opened or used any computer newer than maybe 2010 or
so, but I had to help someone fix a Dell from around 2010 (dual core),
and that thing could have been used as an electric heater. The CPU alone
was hot enough to fry an egg on it, because one of the 4 fans died.

Maybe they have become more efficient since 2010, because that beast must have sucked about 500W just to play the simple solitaire that came with Win XP (running XP Home), because that's about the only thing that person did with their computer, yet it was pouring heat out of the case like a 1500-watt space heater. (And amazingly, it was six (or more) times slower than my year-2000 Pentium made for Windows 2K.)

One of the main reasons I don't want to upgrade (besides not liking bloated operating systems) is that my electric bill is already very high, and I go on and off the computer daily between other work I do, so it doesn't make much sense to turn it on and off 10 or more times each day. So I just leave it on (except the monitor, which I shut off when I'm not using the computer).


#5

On Thu, 18 Feb 2016 04:02:58 -0600, wrote:

snip

My i7-4770 puts out only about 70 watts. The power supply for the entire machine is about 450. You can use this to estimate watts:
http://outervision.com/power-supply-calculator
There are others.
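Those calculators essentially sum per-component draws and add headroom; a toy version, with made-up component wattages, might look like:

```python
# Toy PSU-sizing estimate: sum nominal component draws, add headroom.
# All component wattages below are illustrative guesses, not real specs.
components = {
    "cpu": 84,          # e.g. a mid-range desktop CPU's TDP
    "gpu": 60,
    "motherboard": 30,
    "ram": 10,
    "drives": 20,
    "fans": 10,
}

total = sum(components.values())
recommended = total * 1.5   # ~50% headroom; PSUs run efficiently near mid-load

print(f"estimated load: {total} W, suggested PSU: ~{recommended:.0f} W")
```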




#6

On 02/18/2016 06:02 AM, wrote:
On Thu, 18 Feb 2016 04:30:22 -0600, philo wrote:



snip


"songbird" mentioned that a good video card can take a lot of power and
that's true.

Most gamers would want the best they could afford.


I save money there ...as the only game I play is a 25 year old version
of Tetris



That does come as a surprise. I thought my 300W power supply would handle most anything. I probably killed the original PS by plugging in too many hard drives and other add-ons.



There is no harm in going bigger than what you need; if I have one handy I'll use a 500W supply, but for general use 300W should be fine.


Power supplies typically do not burn out from being overloaded, it's
usually a surge or that a component just plain fails.

I do a lot of computer repair work and a dead PSU is pretty common... and sometimes I'll even see a PSU and mobo go at once (eMachines). Last year something kind of scary happened to one of my own machines, which was fortunately just a spare and had been backed up in several places.


It had a good name-brand 500W PSU and was on an industrial grade UPS, but it just plain failed upon being turned on... it took out two hard drives! There is supposed to be built-in protection for that!

snip



One of the main reasons I don't want to upgrade (besides not liking bloated operating systems) is that my electric bill is already very high, and I go on and off the computer daily between other work I do, so it doesn't make much sense to turn it on and off 10 or more times each day. So I just leave it on (except the monitor, which I shut off when I'm not using the computer).





My electric bill here is slightly on the high side in the winter because my wife has a 2000-watt baseboard heater in her studio. Because I keep the house cooler than she likes, I don't mess with her own private space. Even though there are often two or three computers on at a time, they do not add a fortune to the bill... but I did give away quite a few HUGE servers because I knew they would be major power hogs.


Anyway I have been using those inexpensive mobo/cpu combos and they work
well and take very little power. My wife's machine has 32 gigs of RAM
and she can easily run several "heavy" apps at once with no bogging down.
#7




Anyway I have been using those inexpensive mobo/cpu combos and they work
well and take very little power. My wife's machine has 32 gigs of RAM
and she can easily run several "heavy" apps at once with no bogging down.


Also, it is surprising, but a PC will draw more power when it is thinking hard.

Hook up a current meter or Kill A Watt etc. to the PC, then watch it. Grab the top of a window, slide it around the screen, and watch the current.

Mark
#8

Today's computers, with their dual or quad core processors, lots of RAM, extended video cards, high wattage power supplies and so on, appear to draw a lot more wattage and amps from an outlet than the common Pentium-class computers sold around the year 2000.

I am using such a computer, which was new in 2000. It has 512 MB of RAM (the maximum it can use), came with a 100W power supply (which died and I replaced with a 300W), and it has two fans, one in the power supply and a tiny one on the CPU.

These modern dual/quad core machines MUST require considerably more wattage to run. Most have 3 or 4 fans and kick out a lot of heat, compared to the slight warmth I feel from my fan.


The wattage rating of the power supply does not indicate the actual power
usage of the computer. Just like a 200A electrical panel doesn't mean
you're using all 200 amps, or a 200 watt stereo doesn't mean you're
cranking it to full power. That's just how much power it is capable of
producing.

Newer computers are actually a lot more energy efficient than the old
ones, despite the increase in speed.

My old computer pumped out a lot of heat, enough that it kept my feet
warm under my desk and the heater in my office rarely needed to run. My
new computer puts off so little heat I can't even feel it.

I'm looking for some actual facts about how much power is used
in this comparison. (A 2015/16 computer VS one from around 2000).
I'd like to find some actual data.


I have my computer on a UPS so I can monitor my exact power usage. I have
an i7-4790K, 16GB RAM, 2 SSD's, 2 hard drives, an external hard drive,
cable modem, router, and an older LCD monitor connected to it. Under
normal use the whole system averages around 90 watts. It will jump up to
150 watts or so when processing video, but it drops to about 60 watts
when I turn off the monitor and the hard drives power down.
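Figures like those can be folded into a duty-cycle average. A sketch using the watt levels above (the hours-per-day split is an assumed example, not his actual schedule):

```python
# Daily energy from a simple usage profile: (watts, hours) pairs.
# Watt levels are from the post above; the hours split is assumed.
profile = [
    (90, 6),     # normal use
    (150, 1),    # video processing
    (60, 17),    # monitor off, hard drives powered down
]

daily_wh = sum(watts * hours for watts, hours in profile)
print(f"{daily_wh / 1000:.2f} kWh/day, {daily_wh * 365 / 1000:.0f} kWh/yr")
```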

For what it's worth, I have a 650 watt Antec power supply, so you can see my actual 90 watt usage is nowhere close to the power supply rating.

My previous computer (based on an i5-2500K) averaged around 120 watts
under normal use. I know that was an improvement over my old Pentium 4
computer, but I don't remember how much it used.

While computers are low energy users compared to many other appliances,
they are often left running around the clock, and that adds up.


Yep, I leave mine on 24/7 since it controls lighting in our house, and
records TV shows. At 60 watts overnight that's about the same as leaving
a light bulb on.

I suspect that modern computers still use considerably more power


These days graphics cards are one of the biggest power users, especially the high-end cards used by gamers. I don't play games, so I use a fanless Asus GTX750-DCSL-2GD5 video card. It uses very little power and is absolutely silent.

I also use Gelid FN-PX12-15 120mm fans for my CPU and case fans, and
they're virtually silent under normal use.

Anthony Watson
www.watsondiy.com
www.mountainsoftware.com
#9

On Thursday, February 18, 2016 at 7:02:26 AM UTC-5, wrote:
On Thu, 18 Feb 2016 04:30:22 -0600, philo wrote:

snip
That does come as a surprise. I thought my 300W power supply would handle most anything. I probably killed the original PS by plugging in too many hard drives and other add-ons.

But I guess I was wrong on that note, because I recently read that most
computers now have 500 or 600 watt or larger power supplies. That alone
would tell me that the new computers suck lots of power. So if your mobo
is only using 25W and hard drives and similar items are using about the
same as they always did, why are power supplies so much larger now?

I have not actually opened or used any computer newer than maybe 2010 or
so, but I had to help someone fix a Dell from around 2010 (dual core),
and that thing could have been used as an electric heater. The CPU alone
was hot enough to fry an egg on it, because one of the 4 fans died.

Maybe they have become more efficient since 2010, because that beast must have sucked about 500W just to play the simple solitaire that came with Win XP (running XP Home), because that's about the only thing that person did with their computer, yet it was pouring heat out of the case like a 1500-watt space heater. (And amazingly, it was six (or more) times slower than my year-2000 Pentium made for Windows 2K.)

One of the main reasons I don't want to upgrade (besides not liking bloated operating systems) is that my electric bill is already very high, and I go on and off the computer daily between other work I do, so it doesn't make much sense to turn it on and off 10 or more times each day. So I just leave it on (except the monitor, which I shut off when I'm not using the computer).


There are multiple factors at work here. While CPUs have grown far more complex (transistor count has grown exponentially), the feature size has continued to shrink, which is what makes that possible. More complexity would mean more power, but smaller transistors take less power. That same driving force has lowered the voltage they operate at from 5V to about 1V, which obviously is a big factor. Overall, I think you'd find that today's similar-purpose CPUs probably take the same or less power to run than their predecessors from 25 years ago. At the same time, CPUs for special purposes, e.g. notebooks, have used technological progress to trade off some compute power for much lower power consumption.
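The voltage point can be made concrete with the standard CMOS dynamic-power relation, P ≈ C·V²·f: at fixed capacitance and frequency, dropping the supply from 5 V to 1 V cuts switching power 25-fold. (The C and f values below are arbitrary placeholders; only the ratio matters.)

```python
# Dynamic (switching) power of CMOS logic scales as C * V^2 * f.
def dynamic_power(c_farads, volts, freq_hz):
    return c_farads * volts ** 2 * freq_hz

old = dynamic_power(1e-9, 5.0, 100e6)   # 5 V era
new = dynamic_power(1e-9, 1.0, 100e6)   # ~1 V era, same C and f
print(f"power ratio from voltage alone: {old / new:.0f}x")
```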

I just checked the power supply on my 6-year-old HP i7-based PC and it's 460W. That's about the size that power supplies have been all along. But while they may need that as a theoretical peak, the i7 isn't using all 4 cores or running anywhere near max power as I sit here and browse the internet or do similar. I'd bet that today's PC typically runs at lower actual power than an old 386 system, though both probably had about the same max-size power supply. PC and chip manufacturers have been working for decades to put in energy-saving features and make machines Energy Star compatible.

To save power, do you have the power settings set correctly in Control Panel? I have mine set to put the display and the CPU to sleep after 15 minutes of no activity. It wakes up in about 2 seconds at the touch of any key. That should save you a reasonable amount of power, with no real downside.
#10

On 2/18/2016 10:53 AM, trader_4 wrote:

I just checked the power supply on my 6 year old HP I7 based
PC and it's 460W. That's about the size that power supplies
have been all along. But I would think it's also possible


The tower I use came with about a 180 watt supply. When it went, I put in a 200. And last week it went again. The computer store near me, Staples, had 350 watts as the smallest size.

Power supplies seem to last 5 to 7 years,
for me. I'm okay with that.

--
Christopher A. Young
learn more about Jesus
www.lds.org


#11

On Thu, 18 Feb 2016 04:02:58 -0600, wrote:

snip


Most PC power supplies are rated a lot higher than the typical load they handle. I don't remember the exact numbers, but when I was checking the components in my entertainment center, a Dish 722 DVR/receiver used more power than a Compaq 600 P4 3.0 GHz machine playing a video. (The TV is the "monitor", so that was not an issue.) They were both around 100 watts though. You can figure this all out with one of those Kill A Watt devices, and they are getting pretty cheap these days.
http://www.meritline.com/p3-internat...-p-124969.aspx
#13

On 2/18/2016 8:51 AM, HerHusband wrote:
The wattage rating of the power supply does not indicate the actual power
usage of the computer. Just like a 200A electrical panel doesn't mean
you're using all 200 amps, or a 200 watt stereo doesn't mean you're
cranking it to full power. That's just how much power it is capable of
producing.

Newer computers are actually a lot more energy efficient than the old
ones, despite the increase in speed.


+42

There has been a steady march toward smaller "device geometries", i.e., the "fineness of detail" INSIDE the various "chips"; smaller tends to mean faster and also requires lower operating VOLTAGES. As a result, some of the instantaneous CURRENTS drawn by the CPU, etc. are outrageous. That's also one of the reasons you see "bad capacitors" causing the premature "forced retirement" of many otherwise "good" computers.

My old computer pumped out a lot of heat, enough that it kept my feet
warm under my desk and the heater in my office rarely needed to run. My
new computer puts off so little heat I can't even feel it.


Put 8 or 10 of them in a room and you'd be surprised how warm it can get!

I'm looking for some actual facts about how much power is used
in this comparison. (A 2015/16 computer VS one from around 2000).
I'd like to find some actual data.


I have my computer on a UPS so I can monitor my exact power usage. I have
an i7-4790K, 16GB RAM, 2 SSD's, 2 hard drives, an external hard drive,
cable modem, router, and an older LCD monitor connected to it. Under
normal use the whole system averages around 90 watts. It will jump up to
150 watts or so when processing video, but it drops to about 60 watts
when I turn off the monitor and the hard drives power down.


In my case, I tend to have lots of spindles, often 15K (RPM) drives, so the drives themselves draw ~20W. Video cards tend to be the new power hogs, though, especially for folks who are into gaming.

Each of my machines supports at least two monitors -- with my
primary workstations supporting *4* (though I only use 3).
So, "data presentation" tends to consume more than "data processing".

For what it's worth, I have a 650 watt Antec power supply, so you can see
my actual 90 watt usage is no where close to the power supply rating.

My previous computer (based on an i5-2500K) averaged around 120 watts
under normal use. I know that was an improvement over my old Pentium 4
computer, but I don't remember how much it used.

While computers are low energy users compared to many other appliances,
they are often left running around the clock, and that adds up.


Yep, I leave mine on 24/7 since it controls lighting in our house, and
records TV shows. At 60 watts overnight that's about the same as leaving
a light bulb on.


I have one (headless) box that runs 24/7/365. It provides key/core services to the rest of the machines in the house (DNS, TFTP, NTP, RDBMS, font server, etc.). But I've been constantly pushing it to lower power implementations. I currently run it on a Dell FX160. As it's not even TRYING to update a video display (and there's none attached!), it now runs with a 5200RPM laptop drive (instead of a hotter 3.5" drive), no expansion cards, etc., and is comfortable at about 9W (1.6GHz). As a result, it doesn't even have a case fan (the power supply is the size of a fat cigar).

I've debated replacing it with a Duo2 but really can't imagine what
the other core would *do*! (serving up RDBMS queries is the only
truly taxing service that it provides -- and those are infrequent!)

I suspect that modern computers still use considerably more power


These days graphics cards are the one of the biggest power users,
especially high end cards used by gamers. I don't play games so I use a
fanless Asus GTX750-DCSL-2GD5 video card. It uses very little power and
is absolutely silent.


Yeah, I have some graphics cards that are HUGE power hogs.
In my case, I want multiple heads, not a fast BLT'er. Often,
that pushes me to a more power-hungry card than I would like.

I also use Gelid FN-PX12-15 120mm fans for my CPU and case fans, and
they're virtually silent under normal use.


#15

On 2/18/2016 11:23 AM, philo wrote:

I don't have a watt meter, so if I want to check power, I run the device off my UPS and simply measure battery current and multiply by its voltage, then figure the device is using 90% of that.
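philo's rule of thumb, written out (the 12 V / 7 A readings are made-up examples; the 90% figure is his assumed inverter efficiency):

```python
# Estimate load power from UPS battery readings: battery volts times
# battery amps, derated by an assumed ~90% inverter efficiency.
def load_watts(batt_volts, batt_amps, efficiency=0.90):
    return batt_volts * batt_amps * efficiency

# Example (hypothetical) readings: a 12 V battery delivering 7 A.
print(f"~{load_watts(12.0, 7.0):.0f} W at the load")
```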


Didn't I remember you're an electrician, or a big battery guy? No clamp-on ammeter?

--
Christopher A. Young


#16

HerHusband writes:


The wattage rating of the power supply does not indicate the actual power
usage of the computer. Just like a 200A electrical panel doesn't mean
you're using all 200 amps, or a 200 watt stereo doesn't mean you're
cranking it to full power. That's just how much power it is capable of
producing.

Newer computers are actually a lot more energy efficient than the old
ones, despite the increase in speed.


Indeed. I have a 4-year-old Zotec box with a 4-core AMD CPU running four virtual machines (two web servers, a DNS server, and a mail exchanger).

Draws 11 watts at idle, 15 watts under load. Measured at the wall.

A considerable amount of engineering goes into designing the processors
to be more energy efficient. Most large datacenters are constrained
in three dimensions - space, power and cooling. To address space, they
want denser configurations (up to 100 servers per rack), but then they
run into cooling and power issues.

One of the more difficult areas to address in processor design is the
leakage current (i.e. power that is dissipated as heat even when the
processor is idle). As processor speeds increase and feature geometry
decreases, leakage current rises.

Processors designed for battery sources (in phones, etc.) use slower
clock speeds and aggressive voltage scaling, frequency scaling, and
power gating to reduce the leakage current during idle periods.
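The voltage and frequency scaling mentioned above pays off because dynamic (switching) power in CMOS grows roughly as C x V^2 x f. The sketch below uses made-up component values purely to show the scaling, not real processor figures, and it ignores the leakage term discussed in the post:

```python
def dynamic_power(c_eff, volts, freq_hz, activity=1.0):
    """Classic CMOS dynamic-power model: P = a * C * V^2 * f.

    c_eff    - effective switched capacitance in farads (illustrative)
    volts    - supply voltage
    freq_hz  - clock frequency
    activity - fraction of gates toggling each cycle (0..1)
    """
    return activity * c_eff * volts ** 2 * freq_hz

# Halving both voltage and frequency cuts dynamic power by 8x,
# which is why DVFS saves so much energy at idle.
full = dynamic_power(1e-9, 1.2, 3.0e9)
scaled = dynamic_power(1e-9, 0.6, 1.5e9)
print(full / scaled)  # 8.0
```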
  #17   Report Post  
Posted to alt.home.repair
external usenet poster
 
Posts: 2,377
Default Wattage of Todays Computers V/S Older ones

trader_4 writes:
On Thursday, February 18, 2016 at 7:02:26 AM UTC-5, wrote:



I just checked the power supply on my 6 year old HP I7 based
PC and it's 460W. That's about the size that power supplies
have been all along.


The key is the TDP for the CPU itself. TDPs on Intel CPUs range
from circa 20 watts (for low-end desktop/embedded processors) to
circa 200 watts (for the high-core-count E-series server processors).

TDP is "Thermal Design Power".


https://en.wikipedia.org/wiki/Thermal_design_power

TDPs for various Intel processors are here:

https://en.wikipedia.org/wiki/Haswel...roarchitecture)
  #18   Report Post  
Posted to alt.home.repair
external usenet poster
 
Posts: 2,879
Default Wattage of Todays Computers V/S Older ones

On 2/18/2016 10:08 AM, Scott Lurndal wrote:
HerHusband writes:


The wattage rating of the power supply does not indicate the actual power
usage of the computer. Just like a 200A electrical panel doesn't mean
you're using all 200 amps, or a 200 watt stereo doesn't mean you're
cranking it to full power. That's just how much power it is capable of
producing.

Newer computers are actually a lot more energy efficient than the old
ones, despite the increase in speed.


Indeed. I have a four-year-old Zotac box with a 4-core AMD CPU running four
virtual machines (two web servers, a DNS server, and a mail exchanger).

Draws 11 watts at idle. 15 watts under load. Measured at the wall.

A considerable amount of engineering goes into designing the processors
to be more energy efficient. Most large datacenters are constrained
in three dimensions - space, power and cooling. To address space, they
want denser configurations (up to 100 servers per rack), but then they
run into cooling and power issues.


It's not just the processor *chips* (ages ago, "CPU" meant a chip; now
it's used equally to refer to "the box that has the computer in it").
There have been lots of savings throughout the design of most machines.

E.g., my first PC dissipated ~1W in the keyboard! Each "boxer fan"
was another watt (don't forget the one in the power supply). Disks
tend to have bigger caches, narrower interfaces, and higher transfer
rates, so they can spin slower (lower frictional losses). Everything
tries to "idle" instead of "doing nothing VERY FAST", etc.

  #19   Report Post  
Posted to alt.home.repair
external usenet poster
 
Posts: 662
Default Wattage of Todays Computers V/S Older ones

On Thursday, February 18, 2016 at 10:04:35 AM UTC-6, Stormin Mormon wrote:

Power supplies seem to last 5 to 7 years
for me. I'm okay with that.


Over the years of owning, fixing, and selling PCs, I've had three with bad power supplies: an HP with a 100W unit that just quit, a Dell damaged by lightning (still putting out power, but quirky), and a third I don't remember at the moment.
My own PCs have never had one fail in 18 years; only ones I repaired.
  #20   Report Post  
Posted to alt.home.repair
external usenet poster
 
Posts: 6,586
Default Wattage of Todays Computers V/S Older ones

Stormin Mormon wrote:
On 2/18/2016 11:23 AM, philo wrote:

I don't have a watt meter, so if I want to check power, I run the device
off my UPS and simply measure battery current and multiply by its
voltage, then figure the device is using 90% of that


Don't I remember you're an electrician, or
a big battery guy? No clamp-on ammeter?

New ones have better, high-efficiency power supplies. Electronic
components are getting smaller and smaller, increasing circuit density.
So of course new ones use considerably less energy. It also depends
on product quality.


  #23   Report Post  
Posted to alt.home.repair
external usenet poster
 
Posts: 2,399
Default Wattage of Todays Computers V/S Older ones

On 02/18/2016 11:52 AM, Tony Hwang wrote:
Stormin Mormon wrote:
On 2/18/2016 11:23 AM, philo wrote:

I don't have a watt meter, so if I want to check power, I run the device
off my UPS and simply measure battery current and multiply by its
voltage, then figure the device is using 90% of that


Don't I remember you're an electrician, or
a big battery guy? No clamp-on ammeter?



Stormin Mormon: I have not seen you in years. After you went ballistic
over what history now refers to as "The 100 watt Light-bulb Incident",
I had to take you off my news feed. That was a long time ago, so what
the heck: you asked a question, so I might as well answer it.


Yes, I was an industrial battery and charger service engineer for 38
years and do have both AC and DC clamp-on meters. They are somewhat
accurate, but my DC ammeter is a precision instrument and I prefer it.


BTW: For those here not familiar with the "100 watt" incident, I'm sure
there would have been a big front-page article in the New York Times,
written by the reporter who keeps track of all that happens on Usenet.


New ones have better, high-efficiency power supplies. Electronic
components are getting smaller and smaller, increasing circuit density.
So of course new ones use considerably less energy. It also depends
on product quality.


  #24   Report Post  
Posted to alt.home.repair
external usenet poster
 
Posts: 14,141
Default Wattage of Todays Computers V/S Older ones

On Thu, 18 Feb 2016 12:23:57 -0600, philo wrote:

Yes, I was an industrial battery and charger service engineer for 38
years and do have both AC and DC clamp-on meters. They are somewhat
accurate, but my DC ammeter is a precision instrument and I prefer it.


I have an assortment of clamp-ons too, but for things like this the
Kill A Watt seems to do the best job, since it actually averages the
load out over time; you can also get instant readings.
  #25   Report Post  
Posted to alt.home.repair
external usenet poster
 
Posts: 1,270
Default Wattage of Todays Computers V/S Older ones

I don't have a watt meter, so if I want to check power, I run the
device off my UPS and simply measure battery current and multiply by
its voltage, then figure the device is using 90% of that


I can simply change the display mode on my Cyberpower CP1500PFCLCD UPS to
see how many watts my system is using. It may not be absolutely accurate,
but close enough for my needs.

I have a Kill-A-Watt meter too, but haven't connected my computer to it
since I last upgraded the hardware.

Anthony Watson
www.watsondiy.com
www.mountainsoftware.com


  #26   Report Post  
Posted to alt.home.repair
external usenet poster
 
Posts: 2,415
Default Wattage of Todays Computers V/S Older ones

wrote:


Anyway I have been using those inexpensive mobo/cpu combos and they work
well and take very little power. My wife's machine has 32 gigs of RAM
and she can easily run several "heavy" apps at once with no bogging down.


Also, it's surprising, but a PC will draw more power when it is thinking hard.

Hook up a current meter or a Kill A Watt, etc., to the PC, then watch it. Grab
the top of a window and slide it around the screen, and watch the current.

Mark


It's not surprising. Most of the power is capacitive switching. No switching,
no power. Run super-fast switching, and it sucks power. The logic isn't
transistors drawing high idle current, like the really old stuff.

Greg
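Greg's point, that power tracks how much logic is actually switching rather than what the chip is capable of, shows up in the same C x V^2 x f model if you vary only the activity factor. The capacitance, voltage, and activity numbers below are invented for illustration; leakage is ignored:

```python
def switching_power(c_eff, volts, freq_hz, activity):
    """CMOS switching power: P = a * C * V^2 * f (leakage ignored)."""
    return activity * c_eff * volts ** 2 * freq_hz

# Same chip, same clock; only the fraction of logic toggling differs.
idle = switching_power(2e-9, 1.1, 3.0e9, activity=0.05)  # sitting idle
busy = switching_power(2e-9, 1.1, 3.0e9, activity=0.60)  # dragging a window
print(f"{idle:.2f} W idle vs {busy:.2f} W busy")
```

This is why dragging a window around the screen makes the current meter jump: the clock never stopped, but far more gates are toggling each cycle.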
  #27   Report Post  
Posted to alt.home.repair
external usenet poster
 
Posts: 10,730
Default Wattage of Todays Computers V/S Older ones

On 2/18/2016 1:23 PM, philo wrote:
On 02/18/2016 11:52 AM, Tony Hwang wrote:
Stormin Mormon wrote:
Don't I remember you're an electrician, or
a big battery guy? No clamp-on ammeter?



Stormin Mormon: I have not seen you in years. After you went ballistic
over what history now refers to as "The 100 watt Light-bulb Incident".
I had to take you off my news feed. That was a long time ago,


What a shame. I barely remember the light bulb
thread. Something about a guy who wanted to
return light bulbs to the store, and I commented
on that. Just can't remember.

Well, if Philo is so unforgiving that he filtered
me (many years ago) about light bulbs, he deserves
what he gets. Or doesn't get. Guess I won't be
buying his book, when it is printed.

--
..
Christopher A. Young
learn more about Jesus
.. www.lds.org
..
..
  #28   Report Post  
Posted to alt.home.repair
external usenet poster
 
Posts: 2
Default Wattage of Todays upset Philo

On 02/19/2016 06:14 AM, Stormin Mormon wrote:
Surely,
I have never attacked or deliberately tried to hurt you.
Yes, I've joked with Muggles and Uncle Monster, but that
was intended as humor and fun. And it appears to have
been taken as fun.


You might be dealing with one of those thin-skinned millennials.
Just look at them cross-eyed and they start crying and then run to human resources.
  #29   Report Post  
Posted to alt.home.repair
external usenet poster
 
Posts: 1,526
Default Wattage of Todays upset Philo

On Friday, February 19, 2016 at 8:14:28 AM UTC-5, Stormin Mormon wrote:
I remember Dr. Laura telling an anecdote about two priests
who are walking along. There is a woman next to a river.


DUDE! Real men don't listen to Dr. Laura.

And those who do, don't admit it.


  #30   Report Post  
Posted to alt.home.repair
external usenet poster
 
Posts: 10,730
Default Wattage of Todays upset Philo

On 2/19/2016 11:21 AM, Ben A. Rownd wrote:
On 02/19/2016 06:14 AM, Stormin Mormon wrote:
Surely,
I have never attacked or deliberately tried to hurt you.
Yes, I've joked with Muggles and Uncle Monster, but that
was intended as humor and fun. And it appears to have
been taken as fun.


You might be dealing with one of those thin-skinned millennials.
Just look at them cross-eyed and they start crying and then run to human
resources.


Well, Philo did say he worked for 38 years.
Might be a millennial.

--
..
Christopher A. Young
learn more about Jesus
.. www.lds.org
..
..


  #31   Report Post  
Posted to alt.home.repair
external usenet poster
 
Posts: 10,730
Default Wattage of Todays upset Philo

On 2/19/2016 1:51 PM, TimR wrote:
On Friday, February 19, 2016 at 8:14:28 AM UTC-5, Stormin Mormon wrote:
I remember Dr. Laura telling an anecdote about two priests
who are walking along. There is a woman next to a river.


DUDE! Real men don't listen to Dr. Laura.

And those who do, don't admit it.


Well, I had a heck of a lot of fun predicting
during the call what she would say. I was right,
most of the time.

--
..
Christopher A. Young
learn more about Jesus
.. www.lds.org
..
..