Posted to sci.electronics.repair
Andrew Rossmann
 
Subject: TV picture from DVD fades in and out??

[This followup was posted to sci.electronics.repair and a copy was sent
to the cited author.]

In article ,
says...

The TV in our teenagers' playroom has only an RF input, and the kids
have a commodity-grade VCR and an entry-level DVD player hooked up to it.

The RF output from the VCR feeds the input of the TV, and the DVD
player's video and audio outputs are connected to the corresponding
inputs on the VCR.

All's well when playing video tapes on the VCR, but when playing a DVD
the TV image "fades" or changes contrast slowly, with a cycle time of
about 20 seconds. Not badly enough to make you miss anything on the
screen, but it's noticeably annoying.

I wrote it off as being some weird kind of incompatibility anomaly, and
since I don't have to watch DVDs in that room I didn't bother trying to
eliminate the problem.

Yesterday I was at a friend's home who had a similar setup, but with
different brands of equipment, and his TV image faded in and out the
same as ours when he tried to show me a scene on a DVD.

That got me wondering if that fading effect is a "well known problem"
and if it is, what's the easiest way to get things working "normally".


As others have mentioned, this is an artifact of the Macrovision copy
protection. Not all VCRs will do this in 'pass-thru' mode. My JVC
works fine until you hit RECORD.
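For the curious, here's a rough idea of why the fade is slow and periodic. Macrovision adds oversized pseudo-sync pulses to the signal, and their amplitude cycles slowly; the VCR's automatic gain control (AGC) keys its gain off the apparent peak level, so the picture brightness tracks that slow cycle. The numbers and waveform below are made up for illustration, not real Macrovision timing:

```python
# Toy model (invented numbers, NOT real Macrovision specs): the protection
# signal adds pseudo-sync pulses whose amplitude slowly cycles. A simple
# peak-keyed AGC then drags the picture brightness up and down with it.
import math

def agc_gain(measured_peak, target=1.0):
    # Proportional AGC: scale the video so the measured peak hits the target.
    return target / measured_peak

def simulate(seconds, fps=30, cycle_s=20.0):
    """Return perceived picture brightness per frame (arbitrary units)."""
    brightness = []
    for frame in range(int(seconds * fps)):
        t = frame / fps
        true_video_peak = 1.0  # the real picture level is constant
        # Pseudo-sync pulse amplitude, cycling over ~20 s (assumed shape)
        pulse = 1.0 + 0.5 * (1 + math.sin(2 * math.pi * t / cycle_s)) / 2
        measured_peak = max(true_video_peak, pulse)  # AGC sees the taller pulse
        brightness.append(agc_gain(measured_peak) * true_video_peak)
    return brightness

levels = simulate(40)
print(min(levels), max(levels))  # brightness swings between ~0.67 and 1.0
```

The picture itself never changes; only the gain does, which matches the symptom of the whole screen breathing brighter and dimmer on a fixed cycle.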

There are some RF modulators out there that also have an A/B switch.
You can feed the DVD through the modulator, with the VCR/antenna on the
switched input. Alternatively, just get a modulator and an RF switch.
Both are real cheap. You just switch between the two inputs.

--
If there is a no_junk in my address, please REMOVE it before replying!
All junk mail senders will be prosecuted to the fullest extent of the
law!!
http://home.att.net/~andyross