Posted to uk.d-i-y
From: John Rumm
Subject: Need some illumination

wrote:

So 0.625% V_drop gets us:
8% longer life, or 1080 hours
1.2% reduction in efficacy

So 8% for that, plus some more for soft starting,


For mains halogens yes; for other lamps no significant gain. So that
would gain us what - 25% of 37p is 9p. But 0p for other types.


TLC do GU10 4000 hour mains halogens at £3.85 + VAT, so the costs are a bit
more for decent bulbs. (With some of the el cheapo supermarket ones you will
indeed be lucky to get 1k hours, but they are a false economy anyway.)

A 25% lifespan increase on a 4000 hour bulb is also not insignificant.
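
As an aside, those 8% / 1.2% figures line up with the usual filament lamp
rerating exponents - life varying as roughly V^-13 and efficacy as roughly
V^1.9. A quick back-of-envelope sketch in Python, assuming those exponents
(they are typical published values, not figures from this thread), and also
showing what a 25% life extension does to the per-hour cost of a £3.85 GU10:

# Rough filament lamp rerating sketch.
# Assumed exponents (typical published values, NOT from this thread):
#   life     ~ (V/V0)**-13
#   efficacy ~ (V/V0)**1.9

v_ratio = 1 - 0.00625                     # a 0.625% voltage drop

life_factor = v_ratio ** -13              # ~1.085 -> about 8% longer life
efficacy_factor = v_ratio ** 1.9          # ~0.988 -> about 1.2% less efficacy

print(f"life:     {(life_factor - 1) * 100:+.1f}%")
print(f"efficacy: {(efficacy_factor - 1) * 100:+.1f}%")

# Lamp cost per 1000 hours for a 4000 hour GU10 at 3.85 (ex VAT):
price = 3.85
for hours in (4000, 4000 * 1.25):         # rated life vs a 25% extension
    print(f"{hours:.0f} h -> {price / hours * 1000:.2f} per 1000 h")

On those assumptions the per-1000-hour lamp cost drops from about 96p to
about 77p, which is the sort of saving being weighed against the 1.2% hit
in efficacy.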

I think now the remaining issue is that of explaining what we both
agree on, or wording for the article. It should not be too hard in
principle to come up with something, even if we go back and forth
2 or 3 times.

A very simple way to get that ball rolling could be to say that
changing to a dimmer will in some situations save energy and in
others use more, and give a link to this thread for more info.


Yup, works for me.

I think the repetition of relative energy efficiency came about
because of the layout more than anything else. The layout doesn't
seem to work that well.


I think that will be helped by shifting the detail of the energy saving
aspects to a dedicated energy saving article, so as to leave this one
focussed on practical stuff to do with lighting level control etc.

It's no use saying one can trim a bit off output and no-one will
notice; the reality is that many will stay with the same bulbs and a

Chances are you won't. Your eyes have a roughly logarithmic response to
brightness and a built-in brightness control mechanism. You may notice a
step change of a couple of volts if it happens while a lamp is on.
However, as a gradual change or in discrete instalments you are unlikely
to see a change.


Those points I fully agree with, but none of them change anything.
IOW I wasn't writing in ignorance of the above.


As you already said, we use filament lamps (and CFLs in fact) well past
their hours of brightest output. I have a hard time believing that a
sizeable proportion of people replace them at any time other than when
they fail.


I'm not clear how you get there from anything I've said or you've
said, and so far it's only each other's statements we've been
discussing here.


Well, it follows from the argument that reduced energy efficiency (in
lumens/watt) costs more to run. It can only cost more to run if you
somehow compensate for the reduction in light output. There are only
really two ways you could compensate for a dimming bulb - replace it or
supplement it with others. I think most people just accept it. If they
notice it at all, it is at the point where they replace it and think "oh,
that is a bit brighter".

(An anecdotal story that is a good example of that: my bathroom had
sunken spotlights that each took a 60W R80 bulb. As far as I could tell
the bulbs were ordinary Sylvania R80 ES lamps. However they came with the
house (hence no idea what their installed hours were), and all three,
much to my amazement, carried on for a further 10 years! Only when one
finally failed and I replaced it did I notice that the new one was
noticeably brighter than the remaining two - a far more marked
difference than in any of the other three spot fittings we have, which
typically see much more conservative bulb lives. However, prior to a
bulb failing I never felt that the lights were dim or in any way
inadequate.)



The notion
that you can trim the lighting level and no-one will do anything about
it is a false one, which I could explain more if necessary.


You are flogging a dead horse here, so don't bother.


Far from it, this is standard stuff.

Let's tackle it with a reductio ad absurdum.

You say you won't notice a 0.625% Vdrop. (I agree with that much.)
Now, if that's true, if people don't notice going from 100% to 99.375%,
they also won't notice the same step change applied again, i.e. from
0.625% down to 1.25% down. It's exactly the same magnitude
of change.


As an incremental change this is probably true. The drop in lumens keeps
compounding as you go further down the volts though - so at some
point you would notice.
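
To put rough numbers on "at some point you would notice": a minimal sketch,
assuming lumens scale as roughly V^3.4 and that a static drop in light level
needs to reach something like 10% before most people remark on it - both
assumptions, not figures from this thread:

# Cumulative effect of repeated 0.625% voltage drops on light output.
# Assumptions (NOT from this thread): lumens ~ (V/V0)**3.4, and a static
# drop in light level needs to reach very roughly 10% before most people
# comment on it.

LUMEN_EXPONENT = 3.4
STEP = 0.00625          # one 0.625% voltage step
THRESHOLD = 0.10        # assumed "someone will notice" drop in output

v_ratio = 1.0
step_count = 0
while (1 - v_ratio ** LUMEN_EXPONENT) < THRESHOLD:
    v_ratio *= (1 - STEP)
    step_count += 1

print(f"{step_count} steps of 0.625% "
      f"(total voltage drop {(1 - v_ratio) * 100:.1f}%) "
      f"before output is down {THRESHOLD * 100:.0f}%")

On those assumptions it takes around five of those 0.625% steps, i.e.
roughly a 3% drop in voltage, before the output is down by a level anyone
is likely to comment on - consistent with single steps passing unnoticed
while the cumulative drop eventually does not.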

We can confirm this with an experiment, and I think we can run
this one as a thought experiment. We lead people into a room lit at
100% or 99.375%, let them see it several times without knowing
which is which, and ask them to tell us which times were 100%
and which were down 0.625%. I expect we'll agree that people
won't pick it up with any noticeable degree of reliability.


yup

So, let's run ad absurdum with your hypothesis now. You make a
small step change, no-one notices. Next day you make another
small step change, and again no-one notices the step change. You
keep doing this day by day, and no-one picks up any of the step
changes. But according to me, somewhere along the line they say
"hey, the lighting isn't very good, we need to up the bulbs in here".
According to your hypothesis, the lights will go completely out and
they still haven't noticed. I know that's not what you said, but it's
what follows from it by applying your argument repeatedly.


Well, not quite. I tend to find that you don't reach the point of "hey,
the lighting isn't very good" within a typical or even extended lifetime
of the bulb. Note my extreme example above. More conventionally, had I,
say, bought a 4000 hour halogen and actually got 5000 hours out of it, I
don't expect the lower performance in those last 1000 hours would be a
problem.

It's not hard to see that as we keep doing those step changes,
most people will uprate their bulbs at some point along the line.
Where it happens is spread out. The more you drop it by, the more
people uprate. The less, the less.


In theory yes, but I don't see this happening, which suggests to me that
they are not falling far enough down the efficiency curve to make it an
issue.

(Also bear in mind that most filament lamps are flash stressed at the
time of manufacture to help ensure they fail before they get too far
into their dotage.)

People have a target lighting level - what looks OK to them, which
varies from person to person - and they will pick whatever lighting
reaches that level. This is why a lot won't change anything if you
drop light output 10% - but some will. The more you drop it, the
more people uprate.


Yup, agreed. So one pertinent question is then: for those who do
something with a lamp that causes it to last much longer than its
"normal" life, how many are going to reach a point where its output is
no longer acceptable due to ageing of the bulb?

Let's take it the other way to illustrate the deal - and this is one
experiment I have done (though only with 2 subjects, not lots). Let's


changed from 100W to 60W (me) and 75W (other person). So
instead of a 100W lamp I now had a 60W rated lamp running at IIRC
66W, and producing similar light output.

Lamp life is shortened, but the extra cost this incurs is less than
the money saved on running energy.


So long as you ignore the energy costs of procuring, transporting,
fitting, and disposing of the extra bulbs and also ignore the costs in
your time, this would probably be more attractive.

If we want to improve energy use we would need to _increase_ GLS
lamp voltage, which is easily done, and then replace 100W lamps
with 75W or 60W ones. It's not hard to do, but the downside of reduced
lamp life makes it not worth bothering with for most people. There are

Indeed, I am sure most people can do far more productive things with
their time.

still significant savings there, as the lamp purchase cost delta is
much smaller than the run cost delta.


Yup, but less so when you add back in all the overhead costs (money and
energy) in keeping up with the increased re-lamping rate.
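
For what it's worth, the over-run trade above (a 60W lamp pushed to around
66W) can be sketched with the same assumed rerating exponents - power varying
as roughly V^1.55, lumens as V^3.4 and life as V^-13, typical published
values rather than figures from this thread:

# Rough over-running sketch for a GLS lamp.
# Assumed rerating exponents (typical published values, NOT from this thread):
#   power  ~ (V/V0)**1.55
#   lumens ~ (V/V0)**3.4
#   life   ~ (V/V0)**-13

rated_power = 60.0
run_power = 66.0

# Over-voltage needed to push a 60W lamp to about 66W:
v_ratio = (run_power / rated_power) ** (1 / 1.55)

lumen_factor = v_ratio ** 3.4
life_factor = v_ratio ** -13

print(f"over-voltage: {(v_ratio - 1) * 100:+.1f}%")       # about +6%
print(f"light output: {(lumen_factor - 1) * 100:+.0f}%")  # about +23%
print(f"lamp life:    x{life_factor:.2f}")                # about x0.45

On those assumptions the lamp gives roughly a quarter more light for 10%
more power, at the cost of somewhat less than half its rated life - which
is the trade being weighed here, before you add back the re-lamping
overheads.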


Yup, I would go for that. In fact a much bigger scope article on energy
saving in the home might be useful - it helps put these things in context.
It's all well and good claiming you can save £1000 over 25 years with your
choice of lighting, but that may well pale into insignificance if you are
losing that much per year due to draughts and poor insulation.


Absolutely. I know there's a link on the wiki to such a thread, but
no-one's written it up so far.


Perhaps an "Energy saving" article would be good - but I can see that
creating more heat than light on occasion!


--
Cheers,

John.

/=================================================================\
| Internode Ltd - http://www.internode.co.uk                      |
|-----------------------------------------------------------------|
| John Rumm - john(at)internode(dot)co(dot)uk                     |
\=================================================================/