Posted to sci.electronics.repair
Subject: TVs compatible, from one continent to the next??

"William Sommerwerck" wrote in message
...

> A DC synchronization, aka "sync," pulse was included to keep
> everything together, so if the signal got scrambled, the TV
> would bring it back together quickly.


Actually, the sync pulses keep the horizontal and vertical scanning
in the receiver at the same frequency and phase as the transmitted
signal.


> Those rates were chosen because the studio lights were arc
> lights and flashed on and off at the power-line rate, so the TV
> cameras had to be synchronized to them or you would get moving
> black stripes across the screen.


This might have been a consideration, but the principal concern was
"hum bars" in the receiver. Modern power supplies are sufficiently
well-filtered that this isn't a concern.


> The RCA system for compatible color TV (compatible with black
> and white) used 1/4 of the color information, based on the fact
> that your eye only sees about that much.


Actually, it's more like 1/3.


> The color information was encoded on a phase-modulated 3.57 MHz
> subcarrier, which at the time was beyond the picture information,
> but still within the transmitted signal.


Actually, it was within the picture (luminance) information. NTSC
has always had a potential video bandwidth of 4.2 MHz.
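As an aside, the "phase modulated" subcarrier is really quadrature
modulation: two signals, I and Q, ride on carriers 90 degrees apart
at the same frequency. A minimal numeric sketch (the function names
are my own, sampling at 4x the subcarrier, with no burst, blanking,
or filtering):

```python
import numpy as np

FSC = 3.579545e6   # NTSC color subcarrier frequency, Hz
FS = 4 * FSC       # sample at 4x the subcarrier for convenience

def ntsc_composite(y, i, q, fs=FS, fsc=FSC):
    """Quadrature-modulate I and Q onto the subcarrier and add the
    result to the luminance signal -- just the arithmetic, none of
    the broadcast details."""
    t = np.arange(len(y)) / fs
    chroma = i * np.cos(2 * np.pi * fsc * t) + q * np.sin(2 * np.pi * fsc * t)
    return y + chroma

def decode_iq(composite, y, fs=FS, fsc=FSC):
    """Synchronous demodulation: multiply by the two subcarrier
    phases and average over whole cycles (a crude low-pass)."""
    t = np.arange(len(composite)) / fs
    c = composite - y                   # chroma alone
    i = 2 * np.mean(c * np.cos(2 * np.pi * fsc * t))
    q = 2 * np.mean(c * np.sin(2 * np.pi * fsc * t))
    return i, q
```

Because the two carriers are 90 degrees apart, each demodulator
recovers its own component and rejects the other; that is what lets
two signals share one subcarrier inside the luminance band.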


> The original RCA system alternated the phase of the carrier
> every line, so that it would fix itself if there was a
> transmission or synchronization problem. To save money, the
> National Television System Committee (NTSC), which chose the
> standard, dropped the alternating phase.


Actually, it was dropped because it didn't seem possible at the time
to design a reasonably priced receiver that would take full advantage
of this feature (in particular, the elimination of the Hue control).
Also, the US distribution system didn't have problems with non-linear
phase, so PAL had little practical advantage.
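The self-correcting property of the alternating phase can be shown
with a toy calculation: a delay-line PAL receiver averages each line
with the previous one after undoing the V-axis switch, so a constant
phase error cancels out of the hue and appears only as reduced
saturation. A sketch, with hypothetical function names:

```python
import cmath
import math

def pal_average(u, v, phase_error_deg):
    """Model two successive PAL lines carrying the same color.
    The V (R-Y) component is sent with alternating sign; both
    lines pick up the same constant phase error; the receiver
    undoes the sign switch and averages, as a delay-line set does."""
    err = cmath.exp(1j * math.radians(phase_error_deg))
    line_a = (u + 1j * v) * err   # normal line, with phase error
    line_b = (u - 1j * v) * err   # V-switched line, same error
    avg = (line_a + line_b.conjugate()) / 2   # conjugate undoes the switch
    return avg.real, avg.imag     # recovered U, V

def hue_deg(u, v):
    """Hue angle of a (U, V) pair, in degrees."""
    return math.degrees(math.atan2(v, u))
```

For a 20-degree phase error the recovered hue comes back exact and
only the saturation drops, by cos(20 deg), about 6%. An NTSC set with
the same error shows a 20-degree hue shift, which is exactly what the
Hue control exists to correct.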

Also, the original proposal used red and blue color-difference
signals, rather than the more-efficient I and Q. The original NTSC
proposal was virtually identical to PAL. (If you don't believe this,
I have a copy of "Electronics" magazine that confirms it.)


> The French used a different color-encoding system called SECAM,
> which was also based on the RCA system (1/4 color, 4.43 MHz
> color carrier) but designed to be totally incompatible, so that
> you could not watch French TV in England and vice versa.


SECAM stands for "séquentiel couleur à mémoire" (sequential color
with memory).

SECAM was actually adopted because the French were idiots. They
wanted a system that was relatively easy to record on videotape.
Unfortunately, it made the receiver more complex and expensive. A
classic example of lousy engineering.

Actually, there are more differences between PAL and NTSC color
encoding than the alternation of the phase:

1) NTSC used I and Q color differences; PAL used R-Y and B-Y.
2) Different primaries, especially green; PAL had a smaller color
gamut.
3) Different color bandwidth for different colors: NTSC had 1.3 MHz
for I and 0.5 MHz for Q; PAL used equal bandwidth for R-Y and B-Y.
4) Excellent interleaving of chroma and luminance frequency
components, which was largely destroyed by the phase alternation.
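Point 1) amounts to rotating the color-difference axes by about 33
degrees, putting I along the orange-cyan direction the eye resolves
best. A sketch using the standard NTSC luminance weights (the
function names are mine):

```python
import math

def rgb_to_yuv(r, g, b):
    """PAL-style color differences: scaled B-Y and R-Y."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = 0.492 * (b - y)
    v = 0.877 * (r - y)
    return y, u, v

def rgb_to_yiq(r, g, b):
    """NTSC I and Q: the same color differences rotated 33 degrees,
    so the wide 1.3 MHz channel carries the axis that matters most."""
    y, u, v = rgb_to_yuv(r, g, b)
    s, c = math.sin(math.radians(33)), math.cos(math.radians(33))
    i = -u * s + v * c
    q = u * c + v * s
    return y, i, q
```

For pure red this reproduces the familiar textbook coefficients
(I about 0.596, Q about 0.211), and any gray maps to I = Q = 0, as
compatibility with black and white requires.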

As a note, much of the advantage of points 2), 3), and 4) was lost on
early sets, which used just 0.5 MHz of bandwidth for decoding both
chroma components and limited the luminance bandwidth to minimize
chroma-luma crosstalk. Also, most sets did not use the NTSC primary
phosphors, so a lot of the advantages of NTSC were lost for a few
decades. When integrated circuits became available, dual-bandwidth
chroma decoders started appearing, as well as comb filters to
separate the luminance and chroma signals. More accurate phosphors
were also gradually used in sets. The result was a major improvement
in picture quality with the original 1953 broadcast standards. No
such receiver improvement was possible with the PAL system. Regarding
VITS, that was introduced, but very few sets used it.
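The comb filters mentioned here rely on the interleaving of point 4):
the NTSC subcarrier completes 227.5 cycles per line, so static chroma
arrives phase-inverted on the next line while static luminance
repeats. A toy 1-H comb sketch, with an invented function name:

```python
import numpy as np

# At 4x-subcarrier sampling, one NTSC line is 910 samples and the
# subcarrier completes 227.5 cycles, so chroma from a static picture
# flips sign from one line to the next while luminance repeats.
SAMPLES_PER_LINE = 910

def comb_separate(line_a, line_b):
    """1-H comb filter: the line sum cancels chroma, and the line
    difference cancels luma (exact only where the picture has no
    vertical detail)."""
    luma = (np.asarray(line_a) + np.asarray(line_b)) / 2
    chroma = (np.asarray(line_a) - np.asarray(line_b)) / 2
    return luma, chroma
```

Real comb filters must fall back to simple notch filtering wherever
the two lines actually differ, which is why vertical color
transitions can still show artifacts.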

David