James Sweet
 


"Martin McCormick" wrote in message
...

Many fluorescent light fixtures use two tubes in series per
ballast, so if one tube stops working, both go dark. I have a bunch of
4-foot 40-watt tubes ranging in condition from probably good to most
likely bad. It occurred to me that it should be possible to test the
cathode emission of each tube by applying the 3-volt heater voltage to
the cathodes at each end and then determining what voltage it takes to
ionize the gas. If one feeds the cathodes from either a bifilar
transformer or two separate filament-type transformers, then there is
no common connection between the ends of the tube. One could use a
current-limited AC source or even a current-limited DC source to apply
voltage between the ends of the tube. A good tube should break down at
around 175 to 200 volts, while a bad tube will need a higher voltage
due to reduced cathode emission.

The high-voltage source should be adjustable so that the voltage
can be raised gradually until breakdown occurs.
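In case it helps picture the procedure, here is a minimal sketch of
that sweep-and-judge logic in Python. The supply object and its
set_voltage() and read_current() methods are hypothetical stand-ins
for whatever adjustable, current-limited source gets built; the
200-volt threshold and roughly 1 mA strike current are the figures
mentioned above.

# Sweep a current-limited HV supply upward until the tube strikes,
# then judge the tube by the voltage at which breakdown occurred.
# "supply" is any hypothetical object offering set_voltage(volts)
# and read_current() -> amps; substitute whatever you actually rig up.

GOOD_BREAKDOWN_MAX = 200.0   # volts; a good tube strikes by ~175-200 V
STRIKE_CURRENT = 0.001       # amps; ~1 mA suggests the gas has ionized

def find_breakdown(supply, v_start=100.0, v_stop=400.0, v_step=5.0):
    """Raise the voltage in small steps and return the breakdown
    voltage, or None if the tube never strikes within the sweep."""
    v = v_start
    while v <= v_stop:
        supply.set_voltage(v)
        if supply.read_current() >= STRIKE_CURRENT:
            return v
        v += v_step
    return None

def judge_tube(v_breakdown):
    if v_breakdown is None:
        return "dead (never struck)"
    if v_breakdown <= GOOD_BREAKDOWN_MAX:
        return "probably good"
    return "weak emission (struck at %.0f V)" % v_breakdown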

If the current limiting is substantial, such as allowing only a
milliamp or two at 200 volts, then the tube would probably glow weakly,
but the idea here is only to determine the breakdown voltage.
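That degree of limiting is plain Ohm's-law arithmetic; a short worked
example using the 200-volt and 1-milliamp figures above, with the
resistor wattage checked via P = I^2 * R:

# Series resistor to limit a 200 V source to about 1 mA through an
# ionized tube (worst case: the struck tube acts as a near short).
V_SOURCE = 200.0      # volts, the test supply ceiling
I_LIMIT = 0.001       # amps, ~1 mA per the post

R_SERIES = V_SOURCE / I_LIMIT          # Ohm's law: 200,000 ohms
P_RESISTOR = I_LIMIT ** 2 * R_SERIES   # worst-case dissipation: 0.2 W

print("R = %.0f kohm, P = %.2f W" % (R_SERIES / 1000, P_RESISTOR))
# -> R = 200 kohm, P = 0.20 W, so an ordinary half-watt resistor does it.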

Does anybody see any reason why this shouldn't be a valid test method?

Thanks.
--



The phosphors normally wear out first; when the lumen output drops below
about 70% of new, the tube is shot. Just pop each tube into a fixture with a
fairly new tube and toss any that don't come reasonably close to full
brightness after a couple of minutes.