Posted to alt.engineering.electrical,uk.d-i-y,sci.engr.lighting
Victor Roberts
 
UK question: ES light bulb better than bayonet?

On Tue, 13 Dec 2005 19:20:58 -0500, Sawney Beane wrote:

> Victor Roberts wrote:

>> I agree with your statement in the first paragraph - if we
>> are constrained to operate at 120 or 240 volts. However, I
>> disagree with your later statement that lamp performance is
>> within a few percent of optimum - if we are allowed to
>> change the operating voltage. I do, however, agree that it
>> would not be worth it from an economic point of view to
>> develop a disposable voltage converter for incandescent
>> lamps.

>> Someone did try to put a diode in the base of each lamp to
>> reduce the RMS voltage from 120 volts to 84.9 volts. This
>> does increase efficacy at low cost, and if half the diodes
>> are inserted one way, while the other half are inserted with
>> opposite polarity, there should be no net effect on the
>> power grid or metering. However, this option opens the door
>> for crafty people to select lamps with all the same polarity
>> and hence get part of their power for free, so this idea was
>> dropped, I believe under pressure from the power companies.

> Traditionally, AC has been thought superior to DC operation of incandescent
> lamps because so-called filament "notching" can occur on DC. Also, if there
> is any moisture present in the base/seal area, metal can electrolytically
> move from one lead to the other and cause seal or lead failure.


So, life with AC is better than with DC, but the switch from
one to the other does not affect efficacy. In fact, the RMS
voltage of an AC waveform is defined as the DC voltage that
gives the same heating power - and we all know that
incandescent lamps are just heaters that happen to generate
a bit of light.
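
As a quick numerical check (my own sketch, not part of the original
exchange), half-wave rectifying a 120-volt RMS sine gives
120/sqrt(2), or about 84.9 volts RMS - the same figure quoted above
for the diode-in-the-base scheme - and by the equivalent-heating
definition it delivers exactly half the power to a resistive load:

    import numpy as np

    # One cycle of 60 Hz mains at 120 V RMS (about 170 V peak).
    t = np.linspace(0.0, 1.0 / 60.0, 100000, endpoint=False)
    v = 120.0 * np.sqrt(2.0) * np.sin(2.0 * np.pi * 60.0 * t)

    # An ideal series diode passes only the positive half-cycles.
    v_half = np.where(v > 0.0, v, 0.0)

    def rms(x):
        return np.sqrt(np.mean(x ** 2))

    print(rms(v))       # ~120.0 V
    print(rms(v_half))  # ~84.85 V, i.e. half the heating power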

> Years ago I used a cadmium sulfide photocell to check flicker.
> IIRC, I put the photocell in series with a resistor, powered the
> circuit with a flashlight cell, and monitored voltage across the
> photocell. IIRC, the wavelength/response profile for the photocell
> was similar to that of the human eye.
>
> I found no flicker from sunshine


You probably didn't wait long enough :-) The sun is known to
be a variable star. The data clearly show a reduction in its
output around the year 1550 and another around 1700. Plus,
there is the 11-year sunspot cycle, which has a minor
influence on the sun's output.

> and more flicker from an
> incandescent lamp than from a fluorescent fixture with a magnetic ballast.
>
> Could that be true? It indicates that the filament temperature of
> a bulb run on 60 Hz AC heats and cools significantly 120 times a
> second. With DC, wouldn't filament temperature stay constant?
> Wouldn't less peak voltage be required for a given amount of light?


While typical incandescent lamps used in the US are
considered to be flicker-free, there is obviously some
variation in the visible light output. Lower-mass filaments
will obviously have more flicker than higher-mass filaments,
so the amount of flicker depends upon the lamp power, the
operating voltage and the operating frequency, plus other
factors.
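
To see why filament mass matters, here is a back-of-the-envelope
sketch of my own (not from the post): treat the filament as a thermal
capacitance C that radiates roughly k*T^4. Linearizing about the
operating temperature T0 gives a thermal time constant of about

    tau ~ C / (4 * k * T0^3)

A 240-volt filament of the same wattage carries half the current and
needs four times the resistance, so it is longer and thinner than its
120-volt counterpart. The smaller heat capacity C shortens tau, and
the filament temperature follows the power ripple more closely.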

A 100-watt 120-volt lamp will have less flicker than a
100-watt 240-volt lamp operating at the same frequency. A
40-watt 240-volt lamp will have more flicker, and if run on
50 Hz instead of 60 Hz it will have even more. I believe
that early hydroelectric power systems in the US, such as
Niagara Falls, generated power at 25 Hz and there was
significant visible flicker from (probably low power)
incandescent lamps used at the time.

One reference I found

http://www.epri-peac.com/tutorials/brf36tut.html

gives the thermal time constant of a 120-volt incandescent
lamp with a "typical" but unspecified power as 28
milliseconds and the thermal time constant of a 230-volt
lamp of the same power as 19 milliseconds. However, this
data is not useful by itself. Total radiation varies as T^4,
and visible radiation, which is just the short-wavelength
end of the SPD (spectral power distribution), will vary even
faster with filament temperature, so thermal time constants
are of little value unless the temperature is used to
calculate visible radiation. Note that the data at the link
above shows that
even at 20 Hz the flicker perception of the lamps they
tested was far lower than the flicker perception of
fluorescent lamps operated at the same frequency.
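
Putting rough numbers on this (my own linearized sketch, using the 28
millisecond figure above; the visible-output exponent is an assumed
value from the Wien approximation at about 2800 K, not data from the
EPRI page):

    import numpy as np

    # Linearize C*dT/dt = P(t) - k*T^4 about the operating point T0.
    # With tau = C/(4*k*T0^3), a relative power ripple m at angular
    # frequency wm produces a relative temperature ripple of
    #     dT/T0 = m / (4 * sqrt(1 + (wm*tau)**2)).
    # On sine-wave AC the instantaneous power swings 100% at twice
    # the line frequency, so m = 1.

    tau = 0.028                # 28 ms thermal time constant
    wm = 2.0 * np.pi * 120.0   # 120 Hz power ripple on 60 Hz mains
    dT_rel = 1.0 / (4.0 * np.sqrt(1.0 + (wm * tau) ** 2))
    print(dT_rel)              # ~1.2% temperature ripple

    # Visible output rises much faster than T^4: under the Wien
    # approximation at 555 nm and ~2800 K the local exponent is
    # hc/(lambda*kB*T), roughly 9-10 (assumed, not measured).
    n = 9.3
    print(n * dT_rel)          # ~11% ripple in visible output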

What was the operating voltage and power of the lamp you
tested?

> If there is significant flicker and temperature change with AC,
> wouldn't it be worse with a half-wave rectifier?


Much worse. The idea was proposed for a 100-watt, 85-volt
lamp operating at 60 Hz, which would reduce flicker a bit,
but there may still be flicker problems.
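
The same linearized sketch (again mine, and assuming the same 28
millisecond time constant purely for illustration) shows why: with a
half-wave supply the dominant power ripple moves down to the 60 Hz
fundamental, where the Fourier series of the chopped power waveform
gives a relative modulation of (4/(3*pi))/(1/4), roughly 170%:

    import numpy as np

    tau = 0.028                # assuming the same 28 ms lamp
    wm = 2.0 * np.pi * 60.0    # ripple now at the line frequency
    m = (4.0 / (3.0 * np.pi)) / 0.25  # ~1.70 relative modulation
    dT_rel = m / (4.0 * np.sqrt(1.0 + (wm * tau) ** 2))
    print(dT_rel)   # ~4% temperature ripple, vs ~1.2% on full AC

With the roughly T^10 scaling of visible output, that is more than
three times the light ripple of the full-wave case.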

--
Vic Roberts
http://www.RobertsResearchInc.com
To reply via e-mail:
replace xxx with vdr in the Reply to: address
or use e-mail address listed at the Web site.

This information is provided for educational purposes only.
It may not be used in any publication or posted on any Web
site without written permission.