Posted to uk.d-i-y
From: fred
Subject: Fading a light source in and out automatically.

In article , Dave Plowman (News) writes
>I've got a low voltage PWM unit which is controlled by a DC voltage. Can
>be used for motor or light control etc.
>
>In this case I want to use it to drive high power LEDs. Which it does just
>fine.
>
>What I want to do is have it ramp up to max when a switch is made, and
>ramp down to minimum when it's opened. I've experimented with RC circuits
>which do sort of work - but don't give a linear fade in and fade out.
>Any clever ideas? Preferably using just one cheap IC that I'll have in my
>box...;-)
>
>It would also be nice to arrange things so it takes no current when the
>switch is off. Although I can get round this.

An integrator is what you're looking for; here's an op-amp based circuit:

(Fixed width font reqd)
    Vcc                      Vcc
     |                        |
    .-.                      .-.
 R1 | |                   R3 | |
    | |                      | |                  C1
    '-'                      '-'            .-----||------.
     |                        |             |     ||      |
     |       R2               |             |  |\          |
     o-----[______]-----------)-------------o--|-\        |
     |                        |                |  >-------o------ Vcontrol
     |                        o----------------|+/
     |                        |                |/
     o                       .-.
      \  PB1              R4 | |
     o                       | |
     |                       '-'
     |                        |
    ===                      ===
    GND                      GND

(The ")" is where two wires cross without joining.)


The ramp rate is set by R2*C1: with the + input held at half supply by
R3/R4, a full rail-to-rail ramp takes roughly 2*R2*C1 seconds.
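
Plugging in some example numbers (values picked purely for illustration,
not a recommendation):

  # quick sanity check on the ramp time - illustrative values only
  R2 = 1e6             # 1M
  C1 = 2.2e-6          # 2u2
  print(R2 * C1)       # 2.2 -> integrator time constant, in seconds
  print(2 * R2 * C1)   # 4.4 -> approx rail-to-rail ramp with a half-supply reference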

If you choose a single-supply FET-input op amp then R3 and R4 can be 1M,
and R1 roughly R2/10 to avoid skewing the delay (R1 ends up in series
with R2 when the switch is open).
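
If you want to see the shape of the ramp before building it, here's a
rough numerical sketch in Python (not a SPICE model - the supply voltage,
component values and switch timing are all assumptions of mine). It just
shows the output slewing linearly between the rails: up while PB1 is
closed, down again once it's released.

  # crude simulation of Vcontrol vs time; every value below is an assumption
  VCC  = 12.0      # supply rail (V)
  R1   = 100e3     # pull-up on the switch node (about R2/10)
  R2   = 1e6       # integrator input resistor
  C1   = 2.2e-6    # integrator feedback capacitor
  VREF = VCC / 2   # + input held at half supply by R3 = R4 = 1M

  dt   = 0.001     # 1 ms time step
  vout = 0.0       # start with the output at the bottom rail

  for step in range(12000):               # 12 seconds
      t = step * dt
      if t < 6.0:                         # PB1 held closed for the first 6 s
          vin, rin = 0.0, R2              # node grounded; only R2 in the path
      else:
          vin, rin = VCC, R1 + R2         # node pulled up; R1 adds in series
      # inverting integrator: output slews against (vin - VREF)
      vout -= (vin - VREF) / (rin * C1) * dt
      vout = min(max(vout, 0.0), VCC)     # a real op amp stops at its rails
      if step % 1000 == 0:
          print("t = %4.1f s   Vcontrol = %5.2f V" % (t, vout))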

Off-state current drain should be in the uA range.
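
For a feel for that figure, assuming a 12 V supply and the 1M + 1M
divider above:

  VCC = 12.0               # assumed supply voltage
  R3 = R4 = 1e6            # the 1M divider on the + input
  print(VCC / (R3 + R4))   # 6e-06 -> about 6uA through the divider
  # add the op amp's own quiescent current on top, so pick a micropower
  # FET/CMOS-input part if the uA figure matters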
--
fred
BBC3, ITV2/3/4, channels going to the DOGs