Electronics Repair (sci.electronics.repair) Discussion of repairing electronic equipment. Topics include requests for assistance, where to obtain servicing information and parts, techniques for diagnosis and repair, and anecdotes about successes, failures and problems.

#42 - William Sommerwerck

I understand that prior to the expiry of the Telefunken PAL patent,
Sony Trinitron sets for the PAL market actually threw away the
chrominance signal on alternate scan lines, thus landing themselves
back in NTSC territory. Those sets had a tint control, and I know from
personal experience that they produced a perfectly satisfactory result
(I only learnt the other day why they had a tint control).


Depends on what you mean by "satisfactory". Passable, maybe.


When you discuss something at length, you become aware of those things you
thought you understood, but didn't. (Well, I do, anyway.)

I'd always read that one could construct a PAL receiver in such a way that
eliminated the need for a manual hue control. I never questioned this, but
now it makes little sense.

There are two reasons for having a manual hue control:

The user can adjust the color rendition to their personal (and usually
incorrect) taste. *
The user can correct for incorrect burst phase.


That seems to be "it". As we've seen, these errors can be corrected by
adjusting the hue control, whereas the other error -- differential phase
shift -- cannot be so-corrected, because the timing errors are not linear.

Here's where I get confused. The line-to-line polarity reversal ** causes
the differential phase errors to be equal and opposite, and thus cancel out
when added (at the cost of desaturation -- but that's another issue).

However... If the burst phase is wrong, then there is no cancellation of
errors, because there are no "errors" /in the signal itself/. (Right? (???))
Therefore, I don't see how line averaging can be used to eliminate the need
for a manual hue control.

If anyone knows of a reference with a non-tautological explanation, I'd
appreciate a pointer to it. Thanks.

* Left to their own devices, the average user generally sets the color for
greenish skin tones. I wonder if Vulcan viewers tended towards a pinkish
error.

** It's actually line-to-line+2, because the image is interlaced.


#43 - Dave Plowman

In article ,
William Sommerwerck wrote:
I understand that prior to the expiry of the Telefunken PAL patent,
Sony Trinitron sets for the PAL market actually threw away the
chrominance signal on alternate scan lines, thus landing themselves
back in NTSC territory. Those sets had a tint control, and I know from
personal experience that they produced a perfectly satisfactory result
(I only learnt the other day why they had a tint control).


Depends on what you mean by "satisfactory". Passable, maybe.


When you discuss something at length, you become aware of those things
you thought you understood, but didn't. (Well, I do, anyway.)


I'd always read that one could construct a PAL receiver in such a way
that eliminated the need for a manual hue control. I never questioned
this, but now it makes little sense.


I've never seen a set designed for the PAL market with a hue control. Only
ones modified from a basically NTSC design.

There are two reasons for having a manual hue control:


The user can adjust the color rendition to their personal (and
usually incorrect) taste. *
The user can correct for incorrect burst phase.


That seems to be "it". As we've seen, these errors can be corrected by
adjusting the hue control, whereas the other error -- differential phase
shift -- cannot be so-corrected, because the timing errors are not
linear.


You simply don't get hue errors on PAL sets - unless the grey scale is set
incorrectly. Of course some sets also used the incorrect phosphors to
provide a brighter picture - but a hue control couldn't compensate for
that.

--
*I'm already visualizing the duct tape over your mouth

Dave Plowman London SW
To e-mail, change noise into sound.
#44 - Sylvia Else

On 5/04/2010 10:56 PM, William Sommerwerck wrote:
I understand that prior to the expiry of the Telefunken PAL patent,
Sony Trinitron sets for the PAL market actually threw away the
chrominance signal on alternate scan lines, thus landing themselves
back in NTSC territory. Those sets had a tint control, and I know from
personal experience that they produced a perfectly satisfactory result
(I only learnt the other day why they had a tint control).


Depends on what you mean by "satisfactory". Passable, maybe.


When you discuss something at length, you become aware of those things you
thought you understood, but didn't. (Well, I do, anyway.)

I'd always read that one could construct a PAL receiver in such a way that
eliminated the need for a manual hue control. I never questioned this, but
now it makes little sense.

There are two reasons for having a manual hue control:

The user can adjust the color rendition to their personal (and usually
incorrect) taste. *
The user can correct for incorrect burst phase.


That seems to be "it". As we've seen, these errors can be corrected by
adjusting the hue control, whereas the other error -- differential phase
shift -- cannot be so-corrected, because the timing errors are not linear.

Here's where I get confused. The line-to-line polarity reversal ** causes
the differential phase errors to be equal and opposite, and thus cancel out
when added (at the cost of desaturation -- but that's another issue).

However... If the burst phase is wrong, then there is no cancellation of
errors, because there are no "errors" /in the signal itself/. (Right? (???))
Therefore, I don't see how line averaging can be used to eliminate the need
for a manual hue control.


Think of the chroma signal as a vector with its y coordinate equal to the
red difference component, and the x coordinate equal to the blue
difference component. A phase error rotates that vector about the z
axis. Effectively, the blue difference component receives a bit of the
red difference component, and vice versa.

On alternate lines the phase of the red difference component *only* is
inverted. In our view, this has the effect of reflecting the vector in
the x axis - what was a positive y value becomes negative.

The same phase error causes this vector to rotate in the same direction
about the z axis, but because of the reflection, the mixing of the
components has the opposite sign.

If you then negate the resulting red difference component of the second
line, and average with the red difference component of the first line,
the parts received from the blue difference component cancel out,
leaving a red difference component that equals the original, multiplied
by the cosine of the phase error. The same applies to the blue
component. The result is that the hues are correct, but not as saturated
as they should have been.

Sylvia.
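[A quick numeric check of the cancellation described above. This is only an
illustrative sketch in Python: the chroma values and the 20-degree error are
arbitrary, and ideal demodulation is assumed.]

import math

def rotate(u, v, phi):
    # Rotate the chroma vector (U, V) by a phase error phi (radians).
    return (u * math.cos(phi) - v * math.sin(phi),
            u * math.sin(phi) + v * math.cos(phi))

U, V = 0.3, 0.4                  # arbitrary B-Y and R-Y values
phi = math.radians(20)           # assumed differential phase error

u1, v1 = rotate(U,  V, phi)      # line n as received
u2, v2 = rotate(U, -V, phi)      # line n+1 as received (V axis switched at the transmitter)
v2 = -v2                         # receiver undoes the PAL switch

print((u1 + u2) / 2, (v1 + v2) / 2)           # averaged chroma
print(U * math.cos(phi), V * math.cos(phi))   # same thing (to rounding): hue intact, saturation down by cos(phi)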
#45 - William Sommerwerck

However... If the burst phase is wrong, then there is no cancellation of
errors, because there are no "errors" /in the signal itself/. (Right? (???))
Therefore, I don't see how line averaging can be used to eliminate the need
for a manual hue control.


Think of the chroma signal as a vector with its y coordinate equal to the
red difference component, and the x coordinate equal to the blue
difference component. A phase error rotates that vector about the z
axis. Effectively, the blue difference component receives a bit of the
red difference component, and vice versa.


On alternate lines the phase of the red difference component *only* is
inverted. In our view, this has the effect of reflecting the vector in
the x axis - what was a positive y value becomes negative.


The same phase error causes this vector to rotate in the same direction
about the z axis, but because of the reflection, the mixing of the
components has the opposite sign.


If you then negate the resulting red difference component of the second
line, and average with the red difference component of the first line,
the parts received from the blue difference component cancel out,
leaving a red difference component that equals the original, multiplied
by the cosine of the phase error. The same applies to the blue
component. The result is that the hues are correct, but not as saturated
as they should have been.


No argument. That's always been my understanding. But...

If the burst phase gets screwed up somewhere along the line, no amount of
line averaging will fix the problem, because there's nothing "wrong" with
the subcarrier to fix.

Granted, this problem hardly ever happens. But the argument that a fully
implemented PAL set is inherently immune to color errors is hard for me to
swallow.




#46 - Geoffrey S. Mendelson

Sylvia Else wrote:
If you then negate the resulting red difference component of the second
line, and average with the red difference component of the first line,
the parts received from the blue difference component cancel out,
leaving a red difference component that equals the original, multiplied
by the cosine of the phase error. The same applies to the blue
component. The result is that the hues are correct, but not as saturated
as they should have been.


Since PAL TV sets have a saturation (color level) control, isn't that
a "non-problem"? If it matters, you just adjust it to compensate.

My experience is that people set the color saturation too high; if I hold
my hand up to the screen, my skin looks pale in comparison to everyone
on it.

Geoff.

--
Geoffrey S. Mendelson, Jerusalem, Israel N3OWJ/4X1GM
New word I coined 12/13/09, "Sub-Wikipedia" adj, describing knowledge or
understanding, as in he has a sub-wikipedia understanding of the situation.
i.e possessing less facts or information than can be found in the Wikipedia.
#47 - Dave Plowman

In article ,
Geoffrey S. Mendelson wrote:
My experience is that people set the color saturation too high, if I hold
my hand up to the screen my skin looks pale in comparison to everyone
on it.


Especially CSI. ;-)

--
*If I throw a stick, will you leave?

Dave Plowman London SW
To e-mail, change noise into sound.
#48 - Geoffrey S. Mendelson

Dave Plowman (News) wrote:
In article ,
Geoffrey S. Mendelson wrote:
My experience is that people set the color saturation too high, if I hold
my hand up to the screen my skin looks pale in comparison to everyone
on it.


Especially CSI. ;-)


That's funny, I was thinking of last Thursday night's episode of CSI
when I wrote that.

Geoff.
--
Geoffrey S. Mendelson, Jerusalem, Israel N3OWJ/4X1GM
New word I coined 12/13/09, "Sub-Wikipedia" adj, describing knowledge or
understanding, as in he has a sub-wikipedia understanding of the situation.
i.e possessing less facts or information than can be found in the Wikipedia.
#49 - Sylvia Else

On 6/04/2010 12:53 AM, William Sommerwerck wrote:
However... If the burst phase is wrong, then there is no cancellation of
errors, because there are no "errors" /in the signal itself/. (Right? (???))
Therefore, I don't see how line averaging can be used to eliminate the need
for a manual hue control.


Think of the chroma signal as a vector with its y coordinate equal to the
red difference component, and the x coordinate equal to the blue
difference component. A phase error rotates that vector about the z
axis. Effectively, the blue difference component receives a bit of the
red difference component, and vice versa.


On alternate lines the phase of the red difference component *only* is
inverted. In our view, this has the effect of reflecting the vector in
the x axis - what was a positive y value becomes negative.


The same phase error causes this vector to rotate in the same direction
about the z axis, but because of the reflection, the mixing of the
components has the opposite sign.


If you then negate the resulting red difference component of the second
line, and average with the red difference component of the first line,
the parts received from the blue difference component cancel out,
leaving a red difference component that equals the original, multiplied
by the cosine of the phase error. The same applies to the blue
component. The result is that the hues are correct, but not as saturated
as they should have been.


No argument. That's always been my understanding. But...

If the burst phase gets screwed up somewhere along the line, no amount of
line averaging will fix the problem, because there's nothing "wrong" with
the subcarrier to fix.


If the burst has a random phase relationship to the colour subcarrier on
each line, then my analysis falls apart because the vectors would have
random orientations. In such a situation a PAL receiver would do no
better than NTSC, and they'd both perform awfully.

If the burst just has a fixed phase offset from the true colour
subcarrier, then the averaging will work.

Indeed it will work if the colour subcarrier drifts in a consistent way
relative to the burst - or if the receiver's oscillator similarly
drifts. The effect of such a drift on an NTSC picture would be a
variation of tint from left to right. However, a tint control wouldn't
be able to address that problem - it would simply move the horizontal
position on the screen where the colours are accurate - suggesting that
it doesn't occur in practice except in equipment that is recognisably
broken.


Granted, this problem hardly ever happens. But the argument that a fully
implemented PAL set is inherently immune to color errors is hard for me to
swallow.



I don't think there's a claim that it is inherently immune to all colour
errors, only those caused by consistent differences between the phase of
the subcarrier and the burst.

Sylvia.
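[Another illustrative sketch in Python, contrasting the two cases above: a
burst/reference offset that is the same on every line averages out to mere
desaturation, while an offset that differs from line to line leaves a residual
hue error. The angles are arbitrary and the demodulation model is idealised.]

import math

def decode(u, v, delta, switched):
    # Demodulate one line whose burst-derived reference is off by delta radians.
    if switched:
        v = -v                                       # PAL V-switch at the transmitter
    du = u * math.cos(delta) + v * math.sin(delta)   # chroma seen in the offset reference frame
    dv = -u * math.sin(delta) + v * math.cos(delta)
    if switched:
        dv = -dv                                     # receiver undoes the switch
    return du, dv

U, V = 0.3, 0.4

# Same offset on both lines: averaging leaves (U, V) * cos(offset) -- hue is correct.
a = decode(U, V, math.radians(15), False)
b = decode(U, V, math.radians(15), True)
print((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)

# Offsets that differ from line to line: the averaged vector is rotated -- a hue error survives.
a = decode(U, V, math.radians(15), False)
b = decode(U, V, math.radians(-15), True)
print((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)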
#50 - Sylvia Else

On 6/04/2010 1:09 AM, Geoffrey S. Mendelson wrote:
Sylvia Else wrote:
If you then negate the resulting red difference component of the second
line, and average with the red difference component of the first line,
the parts received from the blue difference component cancel out,
leaving a red difference component that equals the original, multiplied
by the cosine of the phase error. The same applies to the blue
component. The result is that the hues are correct, but not as saturated
as they should have been.


Since PAL TV sets have a saturation (color level) control, isn't that
a "non-problem". If it matters, you just adjust it to compensate.


If it's a fixed phase error, yes. If the phase error is changing slowly
over time, then the picture will have a saturation that varies over time,
which would be annoying if the effect were large enough.

However, I've never noticed such an effect.

Sylvia.


#51 - William Sommerwerck

If the burst just has a fixed phase offset from the true colour
subcarrier, then the averaging will work.


Right. I missed that.


I don't think there's a claim that [PAL] is inherently immune to
all colour errors, only those caused by consistent differences
between the phase of the subcarrier and the burst.



#52 - Arfa Daily


"Sylvia Else" wrote in message
...
[snip]
Indeed it will work if the colour subcarrier drifts in a consistent way
relative to the burst - or if the receiver's oscillator similarly drifts.
The effect of such a drift on an NTSC picture would be a variation of tint
from left to right. However, a tint control wouldn't be able to address
that problem - it would simply move the horizontal position on the screen
where the colours are accurate - suggesting that it doesn't occur in
practice except in equipment that is recognisably broken.


Many years back, Bush in the UK produced a colour decoder which was
'revolutionary' compared to other manufacturers' efforts, in that the
subcarrier was regenerated in the decoder directly from the burst, rather
than being a free-running oscillator just locked to the burst with a PLL.
They did this by deriving a phase-adjustable pulse from the H-flyback, and
using this to 'notch out' the burst from the back porch period. The 10
cycles of burst thus recovered were then applied directly to the 4.43MHz
crystal, which caused it to ring at exactly the same frequency and in
exactly the same phase as the original subcarrier. Always seemed to work
pretty well, and they continued to use this system over a period of probably
10 years or more, covering three chassis designs / revisions.

Arfa


Granted, this problem hardly ever happens. But the argument that a fully
implemented PAL set is inherently immune to color errors is hard for me
to
swallow.



I don't think there's a claim that it is inherently immune to all colour
errors, only those caused by consistent differences between the phase of
the subcarrier and the burst.

Sylvia.



#53 - Arfa Daily


"Sylvia Else" wrote in message
...
On 6/04/2010 1:09 AM, Geoffrey S. Mendelson wrote:
Sylvia Else wrote:
If you then negate the resulting red difference component of the second
line, and average with the red difference component of the first line,
the parts received from the blue difference component cancel out,
leaving a red difference component that equals the original, multiplied
by the cosine of the phase error. The same applies to the blue
component. The result is that the hues are correct, but not as saturated
as they should have been.


Since PAL TV sets have a saturation (color level) control, isn't that
a "non-problem". If it matters, you just adjust it to compensate.


If it's a fixed phase error, yes. If the phase error is changing slowly
over time, then the picture will have a saturation that varies over time,
which would be annoying if the effect were large enough.

However, I've never noticed such an effect.

Sylvia.


I would guess that you never would see such an effect, as all of the
decoders that I can remember working on had ACC circuits which worked very
well ...

Arfa


#54 - William Sommerwerck

Many years back, Bush in the UK produced a colour decoder which was
'revolutionary' compared to other manufacturers' efforts, in that the
subcarrier was regenerated in the decoder directly from the burst, rather
than being a free-running oscillator just locked to the burst with a PLL.
They did this by deriving a phase-adjustable pulse from the H-flyback, and
using this to 'notch out' the burst from the back porch period. The 10
cycles of burst thus recovered, were then applied directly to the 4.43MHz
crystal, which caused it to ring at exactly the same frequency and in
exactly the same phase as the original subcarrier. Always seemed to work
pretty well, and they continued to use this system over a period of
probably 10 years or more, covering three chassis designs / revisions.


This was first done by GE, circa 1966, in the Portacolor set, mostly because
it was cheaper.

Another way of looking at this system is that the crystal was an extremely
narrow-band filter that removed the "Fourier sidebands" around the
subcarrier frequency created by transmitting the 10-cycle burst only once on
each scanning line.
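[A rough way to see the "narrow-band filter" reading of the burst-ringing
scheme, as an illustrative Python sketch: a very high-Q resonator pinged by
ten cycles of burst each line keeps ringing between bursts at a phase locked
to the continuous subcarrier. The pole radius stands in for crystal Q; none
of the numbers are taken from the Bush or GE designs.]

import cmath, math

fsc  = 4_433_618.75                    # PAL colour subcarrier, Hz
fs   = 4 * fsc                         # sample rate (assumed: 4 x subcarrier)
spl  = 1135                            # roughly one 64 us line at this rate
burst_len = round(10 * fs / fsc)       # ten cycles of burst per line

r    = 0.99999                         # pole radius ~ crystal Q (assumed)
pole = r * cmath.exp(2j * math.pi * fsc / fs)

y = 0j
for n in range(20 * spl):
    in_burst = (n % spl) < burst_len
    x = math.cos(2 * math.pi * fsc * n / fs) if in_burst else 0.0
    y = pole * y + x                   # resonator rung by the gated burst
    if n % spl == spl - 1:             # look just before the next burst arrives
        ref = cmath.exp(2j * math.pi * fsc * n / fs)
        # Amplitude builds up over many lines, but the phase relative to the
        # continuous subcarrier settles almost immediately and then stays put.
        print(round(abs(y), 1), round(cmath.phase(y / ref), 4))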


#55 - Michael A. Terrell

Sylvia Else wrote:

On 3/04/2010 10:04 PM, Michael A. Terrell wrote:

isw wrote:

Sylvia Else wrote:

If we were building an analogue colour TV transmission infrastructure
now, then maybe we'd go the NTSC route, since it eliminates the delay
line.



NTSC? No delay line? Moron. The luminance data had to be delayed to
allow time to process the Chroma data. An open delay line in an NTSC
video display caused a very dark image with moving blotches of color. I
found and replaced several in NTSC TVs and Video Monitors.



In which case you'd know that a PAL TV contains two delay lines. One
provides a short delay and addresses the difference in delay between the
chroma path and the luminance path. The other provides a full scan line
delay to allow averaging of the chrominance signal.

It should be obvious from context that "the" delay line that I was
referring to was the latter.

But I suppose calling people morons is easier than doing your own thinking.



It is, for people who consider one as zero.


--
Service to my country? Been there, Done that, and I've got my DD214 to
prove it.
Member of DAV #85.

Michael A. Terrell
Central Florida

http://www.flickr.com/photos/materrell/


#56 - Sylvia Else

On 7/04/2010 10:08 AM, Arfa Daily wrote:
"Sylvia wrote in message
...
On 6/04/2010 12:53 AM, William Sommerwerck wrote:
However... If the burst phase is wrong, then there is no cancellation
of
errors, because there are no "errors" /in the signal itself/. (Right?
(???))
Therefore, I don't see how line averaging can be used to eliminate the
need
for a manual hue control.

Think of the chroma signal as a vector with its y coordinate equal the
red difference component, and the x coordinate equal to the blue
difference component. A phase error rotates that vector about the z
axis. Effectively, the blue difference component receives a bit of the
red difference component, and vice versa.

On alternate lines the phase of the red difference component *only* is
inverted. In our view, this has the effect of reflecting the vector in
the x axis - what was a positive y value becomes negative.

The same phase error causes this vector to rotate in the same direction
about the z axis, but because of the reflection, the mixing of the
components has the opposite sign.

If you then negate the resulting red difference component of the second
line, and average with the red difference component of the first line,
the parts received from the blue difference component cancel out,
leaving a red different component that equals the original, multiplied
by the cosine of the phase error. The same applies to the blue
component. The result is that the hues are correct, but not as saturated
as they shoud have been.

No argument. That's always been my understanding. But...

If the burst phase gets screwed up somewhere along the line, no amount of
line averaging will fix the problem, because there's nothing "wrong" with
the subcarrier to fix.


If the burst has a random phase relationship to the colour subcarrier on
each line, then my analysis falls apart because the vectors would have
random orientations. In such a situation a PAL receiver would do no better
than NTSC, and they'd both perform awfully.

If the burst just has a fixed phase offset from the true colour
subcarrier, then the averaging will work.

Indeed it will work if the colour subcarrier drifts in a consistent way
relative to the burst - or if the receiver's oscillator similarly drifts.
The effect of such a drift on an NSTC picture would be a variation of tint
from left to right. However, a tint control wouldn't be able to address
that problem - it would simply move the horizontal position on the screen
where the colours are accurate - suggesting that it doesn't occur in
practice except in equipment that is recognisably broken.


Many years back, Bush in the UK produced a colour decoder which was
'revolutionary' compared to other manufacturers' efforts, in that the
subcarrier was regenerated in the decoder directly from the burst, rather
than being a free-running oscillator just locked to the burst with a PLL.
They did this by deriving a phase-adjustable pulse from the H-flyback, and
using this to 'notch out' the burst from the back porch period. The 10
cycles of burst thus recovered, were then applied directly to the 4.43MHz
crystal, which caused it to ring at exactly the same frequency and in
exactly the same phase as the original subcarrier. Always seemed to work
pretty well, and they continued to use this system over a period of probably
10 years or more, covering three chassis designs / revisions.

Arfa


I'm left wondering what exactly was the *real* problem that PAL was
intended to fix. It appears that the NTSC tint control could only
address a fixed phase offset between the colour burst and the
subcarrier, with both transmitters and TV sets able to maintain that
offset sufficiently closely that the hue wouldn't vary from left to
right of the picture.

Other issues, such as non-linear phase shift, would have been a problem
for NTSC viewers, regardless of the tint control.

So were NTSC viewers tolerating colour pictures that couldn't be set
right even with the tint control? Or is there something else that I've
missed?

Sylvia.



#57 - isw

In article ,
Sylvia Else wrote:

--snippety-snip--

I'm left wondering what exactly was the *real* problem that PAL was
intended to fix.


Political. The Europeans didn't want US companies selling sets there.

Isaac
#58 - Dave Plowman

In article ],
isw wrote:
In article ,
Sylvia Else wrote:


--snippety-snip--


I'm left wondering what exactly was the *real* problem that PAL was
intended to fix.


Political. The Europeans didn't want US companies selling sets there.


Didn't stop the Japanese, etc. But US companies would have to do other
mods to their products for European sales anyway. Like mains voltage and
frequency. Most couldn't be bothered - even when that's all which had to
be changed.

--
*Letting a cat out of the bag is easier than putting it back in *

Dave Plowman London SW
To e-mail, change noise into sound.
#59 - William Sommerwerck

I'm left wondering what exactly was the *real* problem that PAL
was intended to fix. It appears that the NTSC tint control could
only address a fixed phase offset between the colour burst and
the subcarrier, with both transmitters and TV sets able to
maintain that offset sufficiently closely that the hue wouldn't
vary from left to right of the picture.


Correct.


Other issues, such as non-linear phase shift would have been
a problem for NTSC viewers, regardless of the tint control.


Also correct.


So were NTSC viewers tolerating colour pictures that couldn't
be set right even with the tint control? Or is there something
else that I've missed?


You /have/ missed something, which I explained "long ago and far away".
<grin>

The US TV-distribution system DID NOT generally suffer from non-linear
group-delay problems, whereas the European system DID. That's it.

Even without the extra delay line, there is some degree of visual color
averaging, which tends to mitigate the phase error.


#60 - William Sommerwerck

Political. The Europeans didn't want US companies selling
sets there.


Didn't stop the Japanese, etc. But US companies would have
to do other mods to their products for European sales anyway.
Like mains voltage and frequency. Most couldn't be bothered --
even when that's all which had to be changed.


I don't buy that. US sets would have been fairly expensive in Europe, even
in the mid-60s. Not to mention the strong competition from Thomson, Philips,
etc.




#61

On Wed, 07 Apr 2010 12:29:40 +1000, Sylvia Else
wrote:

[snip]
I'm left wondering what exactly was the *real* problem that PAL was
intended to fix. It appears that the NTSC tint control could only
address a fixed phase offset between the colour burst and the
subcarrier, with both transmitters and TV sets able to maintain that
offset sufficiently closely that the hue wouldn't vary from left to
right of the picture.

Other issues, such as non-linear phase shift would have been a problem
for NTSC viewers, regardless of the tint control.

So were NTSC viewers tolerating colour pictures that couldn't be set
right even with the tint control? Or is there something else that I've
missed?

Sylvia.


Part of the difficulty in understanding is that perhaps you don't
have experience with early American color televisions... I certainly
remember how in the 60s we had to adjust the tint control on a regular
(show by show) basis, because of lack of consistency.

Today, with predominantly digital systems, it has been so long since
I've touched a tint control that I wonder if they still exist!

Anyone who had one of those old, tube (valve) color sets, with the 21"
round color CRT, will remember seeing green skies, and blue grass
while having skin colors set to the proper shade. Get the sky blue,
and the skin turned red, or blue, or green!
#62 - William Sommerwerck

Part of the difficulty in understanding is that perhaps you
don't have experience with early American color televisions...
I certainly remember how in the 60s we had to adjust the tint
control on a regular (show by show) basis, because of lack
of consistency.


Yes -- a lack of consistency. That was not the fault of NTSC, but of the
broadcasters.


Anyone who had one of those old, tube (valve) color sets,
with the 21" round color CRT, will remember seeing green
skies, and blue grass while having skin colors set to the
proper shade. Get the sky blue, and the skin turned red,
or blue, or green!


I don't think that's correct. The cameras (and/or encoders) would have had
to have been very badly set up for that to happen.


On a related subject... I remember reading long, long ago that the first RCA
color TV had /four/ controls for adjusting the color, which the author
described as a "combination lock"! Anyone know anything about this?


#63 - Sylvia Else

On 7/04/2010 10:12 PM, William Sommerwerck wrote:
I'm left wondering what exactly was the *real* problem that PAL
was intended to fix. It appears that the NTSC tint control could
only address a fixed phase offset between the colour burst and
the subcarrier, with both transmitters and TV sets able to
maintain that offset sufficiently closely that the hue wouldn't
vary from left to right of the picture.


Correct.


Other issues, such as non-linear phase shift would have been
a problem for NTSC viewers, regardless of the tint control.


Also correct.


So were NTSC viewers tolerating colour pictures that couldn't
be set right even with the tint control? Or is there something
else that I've missed?


You /have/ missed something, which I explained "long ago and far away".
<grin>


OK, I vaguely remember your saying that now.

In the UK, colour was only transmitted on a new 625 line service
(newish, in the case of BBC2), in parallel for a long time with a
monochrome 405 line service (except BBC2), and I'd have thought the new
transmission infrastructure could have been built to obviate the
non-linear group-delay, given that it existed in the USA.

And, as I commented before, the Sony Trinitron sets, which didn't
implement PAL, performed acceptably according to my memory.

Sylvia.
#64 - Sylvia Else

On 7/04/2010 11:23 PM, William Sommerwerck wrote:
Part of the difficulity in understanding is that perhaps you
don't have experience with early American color televisions...
I certainly remember how in the 60s we had to adjust the tint
control on a regular (show by show) basis, because of lack
of consistancy.


Yes -- a lack of consistency. That was not the fault of NTSC, but of the
broadcasters.


I have to wonder what the broadcasters were doing to achieve that.
Contriving to get the colour burst phase consistent amongst cameras in a
studio (so that the tint stayed the same for a show), but inconsistent
with the actual colour subcarrier, would take some doing.

Sylvia.
#65 - William Sommerwerck

In the UK, colour was only transmitted on a new 625-line
service (newish, in the case of BBC2), in parallel for a long
time with a monochrome 405 line service (except BBC2),
and I'd have thought the new transmission infrastructure
could have been built to obviate the non-linear group-delay,
given that it existed in the USA.


You're probably correct.




#66 - William Sommerwerck

Yes -- a lack of consistency. That was not the fault
of NTSC, but of the broadcasters.


I have to wonder what the broadcasters were doing to achieve
that. Contriving to get the colour burst phase consistent amongst
cameras in a studio (so that the tint stayed the same for a show),
but inconsistent with the actual colour subcarrier, would take
some doing.


There is no subcarrier or burst signal in the cameras. They aren't needed at
that point, and are added during the encoding process.

Setting them up is another matter. The early episodes of "Barney Miller"
provide a good example of poor setup, with inconsistent color, and poor
convergence.


#67 - Sylvia Else

On 8/04/2010 12:21 AM, William Sommerwerck wrote:
Yes -- a lack of consistency. That was not the fault
of NTSC, but of the broadcasters.


I have to wonder what the broadcasters were doing to achieve
that. Contriving to get the colour burst phase consistent amongst
cameras in a studio (so that the tint stayed the same for a show),
but inconsistent with the actual colour subcarrier, would take
some doing.


There is no subcarrier or burst signal in the cameras. They aren't needed at
that point, and are added during the encoding process.


Ok, so the separate colour signals (and luminance?) are sent from the
cameras. Still, at some point the colour signals have to be encoded
using the colour subcarrier, and a bit of the latter has to be included
as the burst. Failing to keep them in phase would require a considerable
amount of indifference.

Which I think you've also said.



Setting them up is another matter. The early episodes of "Barney Miller"
provide a good example of poor setup, with inconsistent color, and poor
convergence.


Poor convergence? The mind boggles.

Sylvia.
#68 - Michael A. Terrell


William Sommerwerck wrote:

Part of the difficulity in understanding is that perhaps you
don't have experience with early American color televisions...
I certainly remember how in the 60s we had to adjust the tint
control on a regular (show by show) basis, because of lack
of consistancy.


Yes -- a lack of consistency. That was not the fault of NTSC, but of the
broadcasters.



And AT&T who provided the coaxial cables that fed the video to all
the stations on a network. The tint and chroma level could be adjusted
at every facility in the system. I knew someone who worked for AT&T at
the time, and he told me what a pain it was to compensate for the
cable. When the network switched to a different studio or city for a
show, it threw everything out of calibration.


Anyone who had one of those old, tube (valve) color sets,
with the 21" round color CRT, will remember seeing green
skies, and blue grass while having skin colors set to the
proper shade. Get the sky blue, and the skin turned red,
or blue, or green!


I don't think that's correct. The cameras (and/or encoders) would have had
to have been very badly set up for that to happen.

On a related subject... I remember reading long, long ago that the first RCA
color TV had /four/ controls for adjusting the color, which the author
described as a "combination lock"! Anyone know anything about this?



He may be talking about the three 'drive' controls that set the gain
for each channel. These are set up to provide equal gain to get a white
line during setup. They are service adjustments on TVs, but on an early
design they may have been easier to get to. Some TVs still had hollow
plastic shaft extenders that passed through the rear of floor model
cabinets to adjust these and other pots.

The fourth would be the actual color intensity control.


--
Lead free solder is Belgium's version of 'Hold my beer and watch this!'
#69 - William Sommerwerck

Setting them up is another matter. The early episodes of
"Barney Miller" provide a good example of poor setup, with
inconsistent color, and poor convergence.


Poor convergence? The mind boggles.


Oh, yes. The pickups had to be aligned. The "modern" system, in which
solid-state sensors are attached to a prism/beamsplitter, was not practical
with vidicons and Plumbicons.


#70 - William Sommerwerck

On a related subject... I remember reading long, long ago
that the first RCA color TV had /four/ controls for adjusting
the color, which the author described as a "combination lock"!
Anyone know anything about this?


He may be talking about the three 'drive' controls that set the
gain for each channel. These are set up to provide equal gain
to get a white line during setup. They are service adjustments
on TVs, but on an early design they may have been easier to get to.


No, these were supposedly user controls. Anybody got a photo of the user
controls for a CT-100?




#71 - Dave Plowman

In article ,
William Sommerwerck wrote:
There is no subcarrier or burst signal in the cameras. They aren't
needed at that point, and are added during the encoding process.


Setting them up is another matter. The early episodes of "Barney Miller"
provide a good example of poor setup, with inconsistent color, and poor
convergence.


So camera setup was poor - as were the later stages of transmission?

This certainly wasn't the case in the UK - despite the transmitters being
fed with land lines.

--
*Where do forest rangers go to "get away from it all?"

Dave Plowman London SW
To e-mail, change noise into sound.
#72 - Dave Plowman

In article ,
William Sommerwerck wrote:
Setting them up is another matter. The early episodes of
"Barney Miller" provide a good example of poor setup, with
inconsistent color, and poor convergence.


Poor convergence? The mind boggles.


Oh, yes. The pickups had to be aligned. The "modern" system, in which
solid-state sensors are attached to a prism/beamsplitter was not
practical with vidicons and Plumbicons.


Registration on cameras. Convergence on monitors?

Did you have vidicon colour cameras? First UK ones were Plumbicon. Apart
from the ancient IO RCA ones used for tests.

--
*24 hours in a day ... 24 beers in a case ... coincidence? *

Dave Plowman London SW
To e-mail, change noise into sound.
#73 - William Sommerwerck

Registration on cameras. Convergence on monitors?

Yes. Thanks for the correction.


Did you have videcon colour cameras? First UK ones
were Plumbicon.


Yes, because you started so late.

The first RCA cameras used vidicons (I think) -- though they might have used
image orthicons.

They later had a four-pickup camera that used an image orthicon to generate
a perfectly registered (by definition) luminance signal, plus three
vidicons.


#74 - Michael A. Terrell


William Sommerwerck wrote:

Setting them up is another matter. The early episodes of
"Barney Miller" provide a good example of poor setup, with
inconsistent color, and poor convergence.


Poor convergence? The mind boggles.


Oh, yes. The pickups had to be aligned. The "modern" system, in which
solid-state sensors are attached to a prism/beamsplitter was not practical
with vidicons and Plumbicons.



Local stations weren't immune, either. Some locally produced shows
in Dayton, Ohio aired from poorly converged cameras in the '70s & '80s.


--
Lead free solder is Belgium's version of 'Hold my beer and watch this!'
#75 - Dave Plowman

In article ,
William Sommerwerck wrote:
Registration on cameras. Convergence on monitors?


Yes. Thanks for the correction.



Did you have videcon colour cameras? First UK ones
were Plumbicon.


Yes, because you started so late.


The first RCA cameras used vidicons (I think) -- though they might have
used image orthicons.


Three 3-inch IOs were the ones I remember, being used for tests long before
colour broadcasting started in the UK.

They later had a four-pickup camera that used an image orthicon to
generate a perfectly registered (by definition) luminance signal, plus
three vidicons.


That's a configuration I never saw. The first colour cameras here were all
four-tube Plumbicons. I was taught the colour response of a vidicon
wasn't suitable.

BTW I'm not surprised your setup engineers had problems - with a mixture
of IO and vidicon. ;-)

--
*Santa's helpers are subordinate clauses*

Dave Plowman London SW
To e-mail, change noise into sound.


#76 - Michael A. Terrell


"Dave Plowman (News)" wrote:

In article ,
William Sommerwerck wrote:
Registration on cameras. Convergence on monitors?


Yes. Thanks for the correction.


Did you have videcon colour cameras? First UK ones
were Plumbicon.


Yes, because you started so late.


The first RCA cameras used vidicons (I think) -- though they might have
used image orthicons.


Three 3 inch IO were the ones I remember. Being used for tests long before
colour broadcasting started in the UK.

They later had a four-pickup camera that used an image orthicon to
generate a perfectly registered (by definition) luminance signal, plus
three vidicons.


That's a configuration I never saw. The first colour cameras here were all
four-tube Plumbicons. I was taught the colour response of a vidicon
wasn't suitable.



RCA built their TK44 color studio cameras with Vidicons. They
changed the model number to TK46 when they switched to Plumbicons.  Most
of the parts were interchangeable, so I used a pair of TK44 cameras for
spare modules & as a test jig to keep three TK46 cameras working the way
we wanted. The TK44s were used by TV stations for years, but needed
brighter studio lighting.


--
Lead free solder is Belgium's version of 'Hold my beer and watch this!'
#77 - isw

In article ,
"Dave Plowman (News)" wrote:

In article ],
isw wrote:
In article ,
Sylvia Else wrote:


--snippety-snip--


I'm left wondering what exactly was the *real* problem that PAL was
intended to fix.


Political. The Europeans didn't want US companies selling sets there.


Didn't stop the Japanese, etc.


But *they* wanted to sell sets *here*.

Isaac
#78 - isw

In article ,
"William Sommerwerck" wrote:

Registration on cameras. Convergence on monitors?


Yes. Thanks for the correction.


Did you have videcon colour cameras? First UK ones
were Plumbicon.


Yes, because you started so late.

The first RCA cameras used vidicons (I think) -- though they might have used
image orthicons.


Iconoscope first, then orthicon, then image orthicon. Vidicons were
first used for film chains, and later as the color (as opposed to
luminance) pickups in *some* cameras.

Isaac
#79

On Apr 7, 5:26 pm, "Michael A. Terrell" wrote:
"Dave Plowman (News)" wrote:

In article ,
* *William Sommerwerck wrote:
Registration on cameras. Convergence on monitors?


Yes. Thanks for the correction.


Did you have videcon colour cameras? First UK ones
were Plumbicon.


Yes, because you started so late.


The first RCA cameras used vidicons (I think) -- though they might have
used image orhticons.


Three 3 inch IO were the ones I remember. Being used for tests long before
colour broadcasting started in the UK.


They later had a four-pickup camera that used an image orthicon to
generate a perfectly registered (by definition) luminance signal, plus
three vidicons.


That's a configuration I never saw. The first colour cameras here were all
four tube plumblicons. I was taught the colour response of a

videcon
wasn't suitable.


RCA built their TK44 color studio cameras with Vidicons. They
changed the model number to TK46 when they switched to Plumbicons. Most
of the parts were interchangeable, so I used a pair of TK44 cameras for
spare modules & as a test jig to keep three TK46 cameras working the way
we wanted. The TK44s were used by TV stations for years, but needed
brighter studio lighting.

I thought TK-44s had plumbs. I _know_ that TK-45s had plumbs as I have
a used one from a TK-45. The TK-28 film camera had vidicons but AIUI,
the vidicon had its own level non-linearity that was not present in
plumbicons (Leddicons for you EEV fans) or Saticons. Vidicons required
different electronic gamma to achieve an overall gamma of 2 to 2.2.
For a film camera the vidicon issue wasn't as bad, as the light levels
were much more predictable. You can look at some of the dinosaurs here.

http://www.oldradio.com/archives/hardware/TV/RCA-TV.htm
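[Toy arithmetic only, to illustrate the gamma point; the exponents below are
assumed round numbers, not RCA's figures. Transfer exponents in a cascade
multiply, so the electronic correction has to be retuned when the pickup
tube's own exponent changes.]

def electronic_gamma(camera_target, tube_gamma):
    # overall camera-channel exponent = tube_gamma * electronic_gamma
    return camera_target / tube_gamma

for tube in (1.0, 0.65):        # Plumbicon ~ linear; a vidicon exponent of ~0.65 is assumed
    print(tube, electronic_gamma(0.45, tube))   # 0.45 is an assumed correction target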


#80 - Michael A. Terrell

wrote:

[snip]

I thought TK-44s had plumbs.



There were conversion kits, according to the manuals I had for the 44s.


I _know_ that TK-45s had plumbs as I have
a used one from a TK-45. The TK-28 film camera had vidicons but AIUI,
the vidicon had its own level non-linearity that was not present in
plumbicons (Leddicons for you EEV fans) or Saticons. Vidicons required
different electronic gamma to achieve an overall gamma of 2 to 2.2.
For a film camera the vidicon issue wasn't as bad as the light levels
were much more predictable.



That was why you needed more light in the studio for the Vidicons. It
pushed them into a more linear area of operation. The savings on
lighting costs and air conditioning quickly paid the conversion costs.
The ones we had were from a private studio used a few times a year, to
make commercials by an eccentric old man.


You can look at some of the dinosaurs here.

http://www.oldradio.com/archives/hardware/TV/RCA-TV.htm


--
Service to my country? Been there, Done that, and I've got my DD214 to
prove it.
Member of DAV #85.

Michael A. Terrell
Central Florida

http://www.flickr.com/photos/materrell/