w_tom

In a system that consumes tens or hundreds of watts, the 1/2 watt drawn in
standby mode is, as far as those components are concerned, the same as zero
power. Standby does not extend equipment life expectancy. Components undergo
essentially the same zero-to-full power transition whether they wake from
standby mode or from a cold restart.
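
To put numbers on it, here is a quick back-of-envelope sketch in C. The 1/2
watt is the standby draw under discussion; the 250 W active figure is just an
assumed round number for illustration:

    #include <stdio.h>

    int main(void)
    {
        /* Illustrative figures only: 0.5 W is the standby draw under
           discussion; 250 W is an assumed round number for a running system. */
        double standby_w = 0.5;
        double active_w  = 250.0;

        printf("Standby draw is %.2f%% of active draw\n",
               100.0 * standby_w / active_w);
        printf("Power step waking from standby: %.1f W\n", active_w - standby_w);
        printf("Power step from a cold start:   %.1f W\n", active_w);
        return 0;
    }

Either way the parts see essentially the same step up to full power; the half
watt changes nothing.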

It is nonsense to think that the trivial current of standby mode will save a
component. Moreover, please tell us what current flows through which parts so
that those parts last longer. What are the numbers? And good luck trying.

As for time: the computer maintains time of day in a battery-backed clock
circuit completely unrelated to standby power. That design dates back to the
first IBM AT, which had no standby option at all. Again, first learn the
circuits before posting about how they work.
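
On any PC descended from that AT design, the time is kept by a battery-backed
real-time clock chip that the operating system can read directly. A minimal
sketch, assuming a Linux machine that exposes that chip as /dev/rtc0 (usually
needs root):

    #include <stdio.h>
    #include <fcntl.h>
    #include <unistd.h>
    #include <sys/ioctl.h>
    #include <linux/rtc.h>

    int main(void)
    {
        /* The battery-backed real-time clock keeps time whether the machine
           is in standby, soft-off, or unplugged entirely. */
        int fd = open("/dev/rtc0", O_RDONLY);
        if (fd < 0) {
            perror("open /dev/rtc0");
            return 1;
        }

        struct rtc_time tm;
        if (ioctl(fd, RTC_RD_TIME, &tm) < 0) {
            perror("RTC_RD_TIME");
            close(fd);
            return 1;
        }

        printf("RTC date/time: %04d-%02d-%02d %02d:%02d:%02d\n",
               tm.tm_year + 1900, tm.tm_mon + 1, tm.tm_mday,
               tm.tm_hour, tm.tm_min, tm.tm_sec);

        close(fd);
        return 0;
    }

That chip runs from its own coin cell; whether the power supply is in standby
or switched off at the wall makes no difference to it.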

Powering up from standby mode is no 'easier' on hardware than powering up from
a cold restart. The power-up is different, but just as 'stressful'. Stressful
in quotes, because the stress from power-up is mostly mythical - often promoted
by those who did not first learn how components work.

"Jerry G." wrote:
Standby mode exists to keep the computer sections in an appliance active, so
that it can economically retain its setup. It also allows the unit to keep
track of the time of day and economically maintain the user parameters.

In standby mode, some devices use a separate standby supply, while others
switch the main supply to a low-power output mode. Some appliances pull less
than 1/2 a watt in standby; some larger appliances draw as much as 2 to 3
watts.
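
For perspective, a rough sketch converting those standby figures into energy
over a year, assuming the unit sits in standby around the clock:

    #include <stdio.h>

    int main(void)
    {
        /* Standby figures quoted above; 8760 hours in a non-leap year. */
        double hours_per_year = 24.0 * 365.0;
        double standby_watts[] = { 0.5, 2.0, 3.0 };

        for (int i = 0; i < 3; i++) {
            double kwh_per_year = standby_watts[i] * hours_per_year / 1000.0;
            printf("%.1f W standby is about %.1f kWh per year\n",
                   standby_watts[i], kwh_per_year);
        }
        return 0;
    }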

Appliances that use the same main power supply for both standby and operating
modes tend to have that power supply last longer. This is because the supply
stays internally active, but with very little load on it.

It is also well known that devices left running all the time tend to have
their components last longer. The sudden inrush of current and the temperature
variations are what reduce the lifespan of many electrical devices. For
CRT-type devices, however, the lifespan of the CRT is reduced if it is left
running with beam current.

CRTs have a limited lifespan, which is mainly determined by the beam current.
On a TV set that you want to leave on, you can turn the brightness and
contrast down to minimum, which reduces the cathode current. The reduced
cathode current increases the tube's lifespan. Since there is always some
cathode current whenever the high voltage is present, this only extends the
lifespan rather than preserving it. The heaters rarely burn out in CRTs.

If you use a device every day, or even a few times a week, leaving it in
standby lets the unit start up more easily and without the user re-entering
all of his parameters.

Using a rechargeable battery to maintain the user parameters would still
require power to be applied at times to keep the battery charged. After a
while these batteries have to be replaced, and they end up as added physical
pollution in the environment.

In standby mode, most of these appliances pull the same current as a wall
clock, or less. In fact, a VCR or TV set in standby generally does not pull
enough current for the power company's consumption meter to even register it.

If you want to, get an AC power bar to feed your devices and switch them off
completely. You will just have to set them up again every time you want to use
them.