Posted to sci.electronics.repair,uk.rec.audio,uk.tech.broadcast
David Looser
Subject: Audio Precision System One Dual Domain Measurement Systems

"Arny Krueger" wrote in message
...

"William Sommerwerck" wrote in message
...
BTW AM sound was always used with +ve vision modulation. I'm not sure

that
there was a killer reason why FM could not have been used with +ve
vision
modulation, but intercarrier reception (the cheap'n'easy way to receive

FM
sound with TV) wouldn't work with +ve vision modulation unless there
was
significant carrier amplitude remaining at the sync tips. Normally with

+ve
vision modulation the carrier amplitude at sync tips was nominally
zero.


Early US TV sets used separate video and audio IFs -- intercarrier had
not
been thought of at that point.

My understanding is that "inverted" polarity was used to minimize the
effects of noise bursts on the sync pulses.


That's a good part of it. The net purpose of inverted polarity was to
improve subjective dynamic range. White flecks on a grey background are
far less obvious than black ones.

Umm... no. You've both got it the wrong way round. With -ve polarity, sync pulses are more affected by noise bursts than with +ve polarity, and white flecks are far more obvious than black ones. Part of the reason is that impulse interference could greatly exceed the 100% vision-carrier level, saturating the video amplifier and, with +ve modulation, the CRT.
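
To put rough numbers on it (a throwaway Python sketch, nothing more; the carrier percentages are only the nominal System A / System M figures and the impulse level is invented), map received carrier amplitude to displayed brightness under each polarity and feed in a burst well over 100% carrier:

# Toy model: received carrier amplitude (1.0 = 100%) -> displayed luminance
# (0 = black, 1 = peak white), clipped at the video amplifier / CRT limits.

def luminance_pos(carrier):
    # +ve modulation (UK System A): sync tips 0%, black ~30%, peak white 100%
    return min(max((carrier - 0.3) / 0.7, 0.0), 1.0)

def luminance_neg(carrier):
    # -ve modulation (US System M): peak white ~12.5%, black ~72.5%, sync tips 100%
    return min(max((0.725 - carrier) / 0.6, 0.0), 1.0)

impulse = 1.8   # interference burst far exceeding the 100% carrier level

print(luminance_pos(impulse))   # 1.0 -> a "whiter-than-white" fleck, amp and CRT saturated
print(luminance_neg(impulse))   # 0.0 -> a black fleck; the burst lands in the sync
                                #        region instead, which is the sync-pulse problem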

This was why US TVs, where -ve modulation was used from the beginning, employed flywheel sync very early on, whilst UK TVs didn't. On the other hand, UK TVs needed peak-white limiters to prevent the CRT defocusing on the "whiter-than-white" interference specks.

The real benefit of -ve modulation was AGC. With -ve modulation the sync tips correspond to 100% modulation and provide an easy source for the AGC bias. With +ve modulation the sync tips are at zero carrier, which is obviously useless for AGC. Instead the back porch has to be used, and many weird and wonderful circuits were devised to "gate out" the signal voltage during the back porch.

Due to the need to keep costs down, manufacturers increasingly turned to "mean-level AGC", in which the video signal itself was simply low-pass filtered to form the AGC bias. This led to the receiver gain being varied by the video content: the blacks in low-key scenes were boosted whilst the whites in high-key scenes were reduced, giving a general greyness to everything. To me it looked awful, but as the Great British Public kept buying these sets (and they were cheaper to build) mean-level AGC became the norm for UK B&W domestic TV receivers. One great advantage of colour was that mean-level AGC could not be used: to give correct colour values, colour sets *had* to display a picture with a stable black level.
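
The difference is easy to caricature in a few lines of (entirely anachronistic, purely illustrative) Python; the scene make-ups and carrier levels below are invented round numbers rather than anything measured:

# Mean-level AGC (the cheap option with +ve modulation): the bias is just the
# filtered mean of the video, so the gain follows the picture content.
def mean_level_agc(carrier_samples, target=0.5):
    mean = sum(carrier_samples) / len(carrier_samples)   # crude low-pass filter
    return target / mean                                 # gain applied to the receiver

# Sync-tip AGC (possible with -ve modulation): the sync tips are always at 100%
# carrier whatever the picture, so peak-detecting them gives a content-independent bias.
def sync_tip_agc(carrier_samples, target=1.0):
    return target / max(carrier_samples)                 # simple peak detector

# +ve modulation carrier levels: sync 0.0, black 0.3, peak white 1.0
dark_scene_pos   = [0.0]*5 + [0.3]*85 + [1.0]*10   # low-key scene
bright_scene_pos = [0.0]*5 + [0.3]*10 + [1.0]*85   # high-key scene
print(mean_level_agc(dark_scene_pos))    # ~1.4: gain up, blacks lifted towards grey
print(mean_level_agc(bright_scene_pos))  # ~0.6: gain down, whites pulled towards grey

# -ve modulation carrier levels: peak white 0.125, black 0.725, sync 1.0
dark_scene_neg   = [1.0]*5 + [0.725]*85 + [0.125]*10
bright_scene_neg = [1.0]*5 + [0.725]*10 + [0.125]*85
print(sync_tip_agc(dark_scene_neg))      # 1.0 for both scenes: the gain is set by
print(sync_tip_agc(bright_scene_neg))    # the sync tips, not by the picture content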

David.