Posted to alt.electronics
From: PeterD
Subject: VGA signal fading

On 31 May 2007 06:13:07 -0700, CoyoteBoy wrote:

On 31 May, 13:27, PeterD wrote:

On Thu, 31 May 2007 01:02:40 +0100, "Coyoteboy" wrote:

"Charles" wrote in message ...


I don't think you can gracefully fade a digital signal.


Standard VGA is analog, not digital.


A quick version would require a triple ganged pot, perhaps 150 ohms
each (one for each color signal). I guess three sliders with some sort
of homemade handle to gang 'em together might do just fine.


That makes sense.


I'm concerned about shorting the signals to ground. I'll experiment with a
cheap video card and see what is required to pull the signal low without a
full short. It'd be fairly easy to set this up with a set of digital pots
too, is my thinking, and enjoy the lack of mechanical failures (as a
version 2).


That reply leads me to believe you have not a clue about what you are
doing. Best just buy the setup in this case. If you can't wire up a
set of potentiometers, this is beyond your abilities.

No, I can manage potentiometers; I'm just worried about damaging other
equipment should a failure occur (shorting through the wiper while at
full scale, for example) - hence my original suggestion of a separate
signal amp, which would be sacrificial in that instance. It was,
however, ~1am while I was writing it and it didn't come out quite as I
had planned it in my head, so I can see why you thought that.


I can't imagine how it would fail... High end of pot to the video
card, low end to ground, wiper to monitor/projector. Any failure there
would be rather unusual, to say the least.
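
For a rough feel for what the monitor ends up seeing, here's a quick
model of that divider - just a sketch, assuming the usual 0.7 V swing
from a 75 ohm source, a 75 ohm monitor input, and the 150 ohm pot
suggested above:

#include <cstdio>

int main() {
    const double Voc = 1.4;   // open-circuit swing: 0.7 V once loaded by 75 ohms
    const double Rs  = 75.0;  // video card output impedance
    const double Rp  = 150.0; // pot end-to-end resistance
    const double Rl  = 75.0;  // monitor input impedance

    for (int i = 0; i <= 10; ++i) {
        double a    = i / 10.0;               // wiper position: 0 = faded out, 1 = full on
        double rTop = (1.0 - a) * Rp;         // pot section between video card and wiper
        double rBot = a * Rp;                 // pot section between wiper and ground
        double rPar = rBot * Rl / (rBot + Rl);          // lower section in parallel with monitor
        double vOut = Voc * rPar / (Rs + rTop + rPar);  // what the monitor actually sees
        std::printf("wiper %.1f -> %.2f V\n", a, vOut);
    }
    return 0;
}

Even at full scale the pot loads the line a little (about 0.56 V
instead of 0.7 V with these numbers) and the termination is no longer
a clean 75 ohms, which is fine for a quick fade box but worth knowing.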

That said, I've not seen any VGA outputs that won't survive some abuse
such as shorts. They are not high-current devices... I don't think an
additional amplifier is worth the effort; after all, it could be the
failure point!
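
As for the digital pot "version 2" idea, something along these lines
would step all three channels together - again just a sketch, assuming
an Arduino and three SPI digital pots (I've picked the MCP41010 here
purely for illustration), with the video signal on terminal A, ground
on B and the wiper feeding the monitor:

#include <SPI.h>

// One chip select per pot: red, green, blue
const int csPins[3] = {8, 9, 10};

void writePot(int cs, byte level) {
  digitalWrite(cs, LOW);
  SPI.transfer(0x11);   // MCP41010 command byte: write data to pot 0
  SPI.transfer(level);  // wiper code: 0 = grounded end, 255 = signal end
  digitalWrite(cs, HIGH);
}

void setup() {
  SPI.begin();
  for (int i = 0; i < 3; i++) {
    pinMode(csPins[i], OUTPUT);
    digitalWrite(csPins[i], HIGH);  // deselect all pots
  }
}

void loop() {
  // Fade all three channels down together, then back up
  for (int level = 255; level >= 0; level--) {
    for (int i = 0; i < 3; i++) writePot(csPins[i], (byte)level);
    delay(10);
  }
  for (int level = 0; level <= 255; level++) {
    for (int i = 0; i < 3; i++) writePot(csPins[i], (byte)level);
    delay(10);
  }
}

One thing to check before going that route: a typical 10k digital pot
is only good to a MHz or so of bandwidth, nowhere near VGA pixel
rates, so you'd want a part actually specified for video - or just
stick with the plain mechanical pots.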