This is a silly overgeneralization. There can be a very noticeable
difference between S-video and composite. It depends on the
source/destination combination.
As for the original question, I am not familiar with the output of your
video card, but I suspect that you may be seeing the effect of the comb
filter combined with scan artifacts. The card may not be properly
interlacing for an NTSC output. The Mits could also have a problem: it
should automatically select S-video when you connect it and switch to
that input. I don't recall any menu options that affect this on these
models.
Leonard Caillouet
"red" wrote in message
...
There isn't a huge difference between S-video and composite, especially
since what is coming in from the cable line isn't even S-video quality. I
don't really see your problem besides switching the TV inputs, but try a
universal remote.
"JURB6006" wrote in message
...
Hi folks;
Hope y'all had a nice Eve and wish you the best holiday. If Santa is
welcome at your house, I hope he needs a tractor-trailer to bring you
good stuff (pretend to be sleeping then).
Now that things have settled down . . . .
One of the things I did today was to go hook up a new PC system,
integrated with the A/V stuff. The cable signal also goes to the PC, and
we ran SVHS and L/R back to the TV (Mits. 35MX1). I thought the TV-out
portion of the ATI All In Wonder wasn't working. Switching to a regular
input worked; I used the right audio cable for the composite video.
Now there is another issue. We could just run composite, but this is no
good: the signal is slightly off standard (it is), and the Mits's comb
filter doesn't work well with it.
Some frequency is set differently: when a saturated color comes on, the
dots are stationary. You see them on a lot of NTSC sets, and with
one-line comb filters the first line of a saturated block displays a
serration, or, if the comb isn't working correctly, a hatch pattern
throughout the colored block or object. However, on a "legal" NTSC signal
this pattern appears to move. This is of course because the designers of
the system chose the frequencies so they would cause "interlace". On the
ATI card output, the "hanging dots" are stationary on a stationary image.
The chroma frequency must be shifted slightly, and I believe
intentionally, to simplify encoding (ATI Radeon 7500 All In Wonder w/
64MB, remote), thus making the SVHS cable necessary.
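The "interlace" of the dot pattern falls straight out of the frequency
choice. A rough back-of-envelope sketch with the standard NTSC numbers
(the integer-ratio subcarrier at the end is purely my own hypothetical
illustration, not a claim about what the ATI encoder actually does):

```python
# Standard NTSC relationships between line rate and color subcarrier.
F_H = 4_500_000 / 286        # horizontal line rate, ~15734.266 Hz
F_SC = 455 / 2 * F_H         # color subcarrier, ~3579545.45 Hz

# 227.5 chroma cycles per line: the half-cycle remainder inverts the
# subcarrier (and thus the dot) phase on every successive line, and the
# odd 525-line frame count inverts it frame to frame as well, so the
# residual dots crawl and average out to the eye.
cycles_per_line = F_SC / F_H                  # = 227.5

# Hypothetical encoder that instead locks chroma to a whole number of
# cycles per line: the dots then repeat in the same place on every line
# and every frame, i.e. they sit stationary on a stationary image.
f_sc_integer = 227 * F_H                      # ~3571678.3 Hz
shift = F_SC - f_sc_integer                   # half a line rate, ~7.87 kHz
```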
We lack the original remote to the set and can't seem to get it into the
SVHS mode. Being a tech, I think I can force it into the SVHS mode no
matter what system control says, but that is some work, especially in
this installation.
Note that as a tech I absolutely refuse to try to align the comb filter
to this source. This would degrade all normal inputs.
Also, I wonder, will a VCR record that non-standard signal? Probably, I
think, but I wonder just how much it's degraded. Luminance will be
filtering out a ~3.58 MHz signal, which might not bother it too much. The
chroma noise reduction scheme will probably be quite compromised, though;
it might even record in black and white.
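On the VCR question, a back-of-envelope look at the standard VHS
color-under figures (just arithmetic on published numbers; the
227-cycles-per-line shifted subcarrier is my own hypothetical stand-in
for whatever the card actually does):

```python
# Standard NTSC / VHS color-under figures.
F_H = 4_500_000 / 286    # line rate, ~15734.27 Hz
F_SC = 227.5 * F_H       # broadcast color subcarrier, ~3.579545 MHz
F_CU = 40 * F_H          # VHS down-converted chroma carrier, ~629.37 kHz

# A VHS deck heterodynes incoming chroma down to F_CU with a conversion
# oscillator near F_SC + F_CU that is phase-locked to the incoming
# horizontal sync, so a subcarrier that is offset relative to its own
# line rate lands off-frequency on tape by the same amount.
f_conv = F_SC + F_CU

f_sc_card = 227 * F_H                  # hypothetical shifted subcarrier
tape_carrier = f_conv - f_sc_card      # where the chroma actually lands
error = tape_carrier - F_CU            # ~ +7.87 kHz off nominal
```

Whether the chroma circuits tolerate that offset would depend on the
deck, which is why it might drop to black and white rather than record
off-color.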
Recording VHS from this source is not the important point; image quality
is. This is way worse than it should be, and the comb filter is working,
but when the frequency is off . . . . I guess we need to get it switched
to the SVHS input.
Advice, comment, suggestion?
Thanx for anything on this and thanx for even reading this.
Merry, happy, joyful
Christmas, Chanukah, Kwanzaa, Ceremony of the Rebirth, whatever you
have, I wish you a good one.
JURB