Posted to alt.building.construction,alt.home.repair,misc.consumers.frugal-living
Doorbell always uses electricity!

In , Jim Redelfs wrote the following (in part, toward the end) in response
to a prior posting of mine:

As to your point that LIGHT becomes heat, I wonder how MUCH light would
be required to heat a given space? Velly interesting...


  Given that the efficiency of lamps, or for that matter of almost every
other electrical load in a house, approaches 100% in converting the
electrical energy consumed to heat, I would not be too concerned with how
much of that heat materializes after spending a few or several nanoseconds
as light somewhere along the path from electricity consumption to heat. I
would just take the watts consumed by the electrical load and multiply by
3.4 to get BTU/hour, if that is what you want.
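
  If you want that arithmetic in code, here is a minimal sketch (Python,
just as an illustration; 3.412 is the usual watts-to-BTU/hour factor,
rounded to 3.4 above):

    def watts_to_btu_per_hour(watts):
        # 1 watt of continuous consumption = about 3.412 BTU/hour of heat
        return watts * 3.412

    # Example: a 100 watt load heats the house at roughly 341 BTU/hour
    print(watts_to_btu_per_hour(100))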

Should you want something more academic, as in watts or BTU/hour in a
given quantity of light:

  The most common "official definition" (my words) of "visible light" is
electromagnetic radiation having wavelengths in the 400-700 nm range.

  1 watt of such radiation, from most light sources used to illuminate
homes, amounts to about 240-300 lumens. A lumen is the amount of
photometric output that illuminates 1 square foot to the extent of 1
footcandle, or 1 square meter to the extent of 1 lux. A "USA-usual" 100
watt 120V "big-3 brand" lightbulb with a rated life expectancy of 750
hours and a coiled-coil filament produces 1670-1750 lumens, and about 6.6,
maybe 6.7, watts of radiation of wavelengths 400-700 nm.
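
  As a rough sketch of that arithmetic (Python; the 255 lumens per radiant
watt is just an assumed mid-range value of the 240-300 figure above):

    def visible_radiant_watts(lumens, lumens_per_radiant_watt=255.0):
        # Radiant power in the 400-700 nm band implied by a lumen rating,
        # given an assumed luminous efficacy of that radiation
        return lumens / lumens_per_radiant_watt

    # A 1710 lumen bulb: about 6.7 watts of 400-700 nm radiation
    print(visible_radiant_watts(1710))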

  Plenty of other "white light sources" produce roughly 240-320 lumens per
watt of radiation in that 400-700 nm range, meaning 1 watt (or 3.4
BTU/hour) of 400-700 nm "light" amounts to 240-320 lumens.

  Keep in mind that along with those 6.6-6.7 watts of radiation in the
"official visible light range", the above 100W lightbulb produces plenty
of infrared. Something like, very roughly, 50 watts of infrared radiated
by the filament passes through the glass bulb. The glass bulb itself
typically radiates a few watts more of infrared, maybe as much as 10 or a
dozen.
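
  Tallying those ballpark components (Python sketch; all figures are the
rough ones above, so the total is only approximate and lands in the same
general neighborhood as the ~75 watt figure used below):

    # Rough radiated-power budget for a 100W 120V 750-hour bulb (ballpark)
    visible_w   = 6.7    # 400-700 nm radiation, from above
    ir_filament = 50.0   # filament infrared passing through the glass
    ir_glass    = 10.0   # infrared re-radiated by the hot glass bulb itself
    print(visible_w + ir_filament + ir_glass)   # roughly 65-75 watts total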
  Since a 100 watt 120V lightbulb can produce something like 75 watts
combined of infrared and visible light, plus a trace of ultraviolet that
is mostly in the non-tanning portion of UVA (the "blacklight range"), when
light output is 1710 lumens, each lumen of light escaping the fixture
*may* be associated with close to .044 watt of radiation that becomes heat
in the home after exiting the lightbulb in the form of radiation (in
addition to heat from the fixture). .044 watt is about .15 BTU/hour.
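
  Put as arithmetic (Python sketch; the 75 watts and 1710 lumens are the
ballpark figures above):

    def radiated_watts_per_lumen(radiated_watts=75.0, lumens=1710.0):
        # Watts of escaping radiation (IR + visible + trace UV) per lumen
        return radiated_watts / lumens

    w_per_lm = radiated_watts_per_lumen()    # close to .044 watt per lumen
    print(w_per_lm, w_per_lm * 3.412)        # about .15 BTU/hour per lumen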

  But also, since energy going into a lamp within a home is converted to
heat within that home at an efficiency well approaching 100%, I suggest
figuring room or building heating effects mainly from the power
consumption of the light source rather than from photometrics - at the
usual rate of about 3.4 BTU/hour per watt.

  The biggest problem I hear now is how much of the home heating by lamps
is achieved at ceilings, and how much of that is off-target by producing
heat above where it is wanted/needed!

- Don Klipstein