Posted to sci.electronics.repair
Sylvia Else
120hz versus 240hz

On 1/03/2010 12:17 PM, Arfa Daily wrote:
"Sylvia wrote in message
...
On 28/02/2010 12:12 AM, Arfa Daily wrote:
snip


"Sylvia wrote in message
...

Many years ago (using a Sinclair Spectrum no less) I noticed an effect
whereby if a small character-sized square was moved across the screen in
character-size steps, the eye perceived phantom squares at intervening
positions. Since the computer was not displaying these intermediate
squares, their appearance must have been due to the eye. The likely
explanation was that the eye was traversing the screen smoothly to follow
the square, but the square itself was moving in discrete steps. So the
eye was causing the image of the square to be smeared across the retina.
I was seeing this effect on a CRT screen, but the longer the persistence
of the image on the screen the worse the effect would be. Interpolating
the position of the image on the screen would reduce that effect.

However, I can't explain why this would be less pronounced on a plasma
screen.
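A toy calculation can illustrate the geometry described above. This is my own sketch, not from the original post; the 8-pixel step, the four refreshes per step, and the assumption that the eye tracks the average motion are all illustrative figures:

```python
# Sketch of a smoothly tracking eye watching a square that moves in steps.
# Assumptions (mine, for illustration): the square jumps 8 px once every
# 4 screen refreshes, and the eye tracks the average motion of 2 px per
# refresh. Each refresh flashes the square at its stepped screen position.
step_px = 8          # one character-sized jump
frames_per_step = 4  # refreshes between jumps
eye_v = step_px / frames_per_step  # smooth eye velocity, px per refresh

retinal_positions = set()
for frame in range(8):
    square_x = (frame // frames_per_step) * step_px  # stepped screen position
    eye_x = frame * eye_v                            # smooth eye position
    retinal_positions.add(square_x - eye_x)          # landing spot on retina

# The square is flashed at four distinct spots relative to the moving eye,
# which visual persistence fuses into phantom squares at intervening
# positions.
print(sorted(retinal_positions))  # [-6.0, -4.0, -2.0, 0.0]
```

Four distinct retinal positions per step cycle is consistent with seeing phantom copies between the square's displayed positions.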



Because LCD cells are painfully slow at switching, which equates to a long
persistence phosphor on a CRT, which as you say yourself, makes the effect
worse. Plasma cells are very fast, particularly now that 'pre-fire'
techniques are used to 'ready' them for switching. This is the equivalent
of having a very short persistence phosphor on a CRT. If you arrange for
the drive electronics to be able to deliver the cell drives with no more
delay than the cells themselves are contributing, then the result will be
smooth motion without any perceivable blur, which is pretty much how it
was with a standard domestic CRT based CTV.

Arfa


It seems to me that the effect would be visible on any display that has
any degree of persistence. Even if LCDs switched instantaneously, they'd
still be displaying the image for the full frame time and then
instantaneously switching to the next image. This would produce the
smearing effect in the way I've described. To avoid it, one needs a
display that produces a short bright flash at, say, the beginning of the
display period, and remains dark for the rest of the time. As I understand
plasma displays, that's not how they work.
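A back-of-the-envelope sketch of the two regimes described above: a sample-and-hold panel that keeps each image lit for the whole frame, versus a display that flashes each image briefly. The figures are my own illustrative numbers, not from the thread; the model is simply that the smear width on the retina is the object's speed multiplied by the time the frame is lit, for an eye tracking the object smoothly:

```python
# Assumption (mine): retinal smear ~ object speed * time the frame is lit,
# for an eye that tracks the object smoothly while the image stands still.

def smear_px(speed_px_per_s, lit_time_s):
    """Smear width the tracking eye integrates per frame, in pixels."""
    return speed_px_per_s * lit_time_s

speed = 600.0          # object moving 600 px/s (10 px per 60 Hz frame)
frame_period = 1 / 60  # sample-and-hold: image lit for the whole frame
flash_time = 0.001     # impulse display: 1 ms flash, dark otherwise

print(round(smear_px(speed, frame_period), 3))  # 10.0 px of smear
print(round(smear_px(speed, flash_time), 3))    # 0.6 px
```

On these numbers the hold time, not the switching speed, sets the smear width, which is the point being argued.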

Sylvia.



I think you are misunderstanding the principles involved here in producing
a picture perceived to have smooth, smear-free movement from a sequence of
still images. Any medium which does this needs to get the image in place
as quickly as possible, and for a time shorter than the period required to
get the next picture in place. This is true of a cinema picture, a CRT
television picture, an LCD television picture, or a plasma or OLED
picture. Making these still images into a perceived moving image has
nothing to do with the persistence of the phosphor, but is a function of
retinal retention, or 'persistence of vision'.


The fact that a sequence of still images is perceived as a moving
picture is clearly a consequence of visual persistence. And it's obvious
that things will look bad if the images actually overlap. But that's not
what we're discussing.

We're discussing why certain types of display don't do such a good job
despite having a reasonably sharp transition from one image to the next.

The Wikipedia article you cited said that even LCD switching times of
2ms are not good enough "because the pixel will still be switching while
the frame is being displayed." I find this less than convincing as an
explanation. So what if the pixel is switching while the frame is being
displayed? It's not as if the eye has a shutter, and the transition time
is much less than the eye's persistence time anyway.
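To put numbers on that objection (my own arithmetic, not from the article): at 60 Hz a frame is held for roughly 16.7 ms, so a 2 ms transition is a small fraction of the time the static image sits in front of a tracking eye:

```python
# Illustrative figures (assumptions, not from the thread): 60 Hz frame
# rate, 2 ms LCD response time as quoted from the Wikipedia article.
frame_ms = 1000 / 60   # ~16.7 ms hold per frame on a sample-and-hold panel
switch_ms = 2.0        # quoted LCD transition time

# Fraction of each frame spent transitioning rather than holding a
# static image in front of the smoothly moving eye:
print(round(switch_ms / frame_ms, 2))  # 0.12
```

On this arithmetic the transition occupies about an eighth of the frame; the other seven eighths of static hold would dominate any smearing the tracking eye integrates.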

Sylvia.