Posted to sci.electronics.repair
Sylvia Else
120Hz versus 240Hz

On 27/02/2010 1:22 AM, William Sommerwerck wrote:
>> LCDs don't flicker anyway, regardless of their frame rate. The frame
>> rate issue relates to addressing the judder you get as a result of
>> the image consisting of a sequence of discrete images, rather than
>> one that continuously varies.


> Not quite, otherwise the issue would occur with plasma displays. Indeed, it
> would with any moving-image recording system.
>
> The problem is that LCDs don't respond "instantaneously". They take a finite
> time to go from opaque to the desired transmission level, and then back
> again. The result is that the image can lag and "smear". (25 years ago, the
> first pocket LCD color TVs from Casio had terrible smear, which added an
> oddly "artistic" quality to sports.)
>
> For reasons not clear to me, adding interpolated images reduces the smear.
> This makes absolutely no sense whatever, as the LCD now has /less/ time to
> switch. I've never gotten an answer on this.


Many years ago (using a Sinclair Spectrum, no less) I noticed an effect
whereby, if a small character-sized square was moved across the screen in
character-sized steps, the eye perceived phantom squares at intervening
positions. Since the computer was not displaying these intermediate
squares, their appearance must have been due to the eye. The likely
explanation was that the eye was traversing the screen smoothly to
follow the square, while the square itself was moving in discrete steps,
so the image of the square was smeared across the retina. I was seeing
this effect on a CRT screen, but the longer the persistence of the image
on the screen, the worse the effect would be. Interpolating the position
of the image on the screen would reduce that effect.

However, I can't explain why this would be less pronounced on a plasma
screen.
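
To put rough numbers on that mechanism, here's a minimal sketch in Python
(the screen width, crossing speed and frame rates are assumptions chosen
purely for illustration) of how far a smoothly tracking eye drags each
held image across the retina, and how adding interpolated frames shrinks
that distance:

    # Minimal sketch of the "retinal smear" described above, assuming the eye
    # tracks the moving square perfectly smoothly while the display holds each
    # discrete image for a full frame period (sample-and-hold). All numbers
    # are illustrative assumptions, not measurements.

    def smear_width_px(speed_px_per_s: float, hold_time_s: float) -> float:
        """Distance the eye moves while one held image stays put on screen."""
        return speed_px_per_s * hold_time_s

    speed = 960.0  # assumed: square crosses a 1920-pixel-wide screen in 2 s

    for rate_hz in (50, 100, 200):  # base rate, then 2x and 4x interpolation
        hold = 1.0 / rate_hz
        print(f"{rate_hz:3d} Hz: image dragged ~{smear_width_px(speed, hold):5.1f} px across the retina")

On the same reasoning, a display that lights each image only briefly
within the frame period, rather than holding it, smears less; that may be
part of why the effect is less pronounced on plasma screens.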


>> It doesn't help that much TV material that was recorded on film is
>> transmitted with odd and even interlaced fields that are scans of
>> the same underlying image (or some variation thereon), so that the
>> effective refresh rate is considerably lower than the interlaced field rate.


> Interlaced images can be de-interlaced. Note that most product reviews
> test displays for how well they do this.


They have to be deinterlaced for display on any screen with significant
persistence, but deinterlacing doesn't increase the underlying frame rate.
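
A rough sketch in Python makes the point concrete (2:2 pulldown of 25 fps
film into 50 fields per second is assumed here purely as an example):

    # Rough sketch: 25 fps film carried as 50 interlaced fields per second
    # (2:2 pulldown). Consecutive odd/even fields are scans of the same film
    # frame, so deinterlacing reconstructs the original pictures but cannot
    # raise the underlying image rate.

    FILM_FPS = 25    # assumed film frame rate
    FIELD_RATE = 50  # interlaced fields per second

    def source_frame(field_index: int) -> int:
        """Film frame that a given field was scanned from (2:2 pulldown)."""
        return field_index // (FIELD_RATE // FILM_FPS)

    distinct_images = {source_frame(i) for i in range(FIELD_RATE)}
    print(f"{FIELD_RATE} fields/s, but only {len(distinct_images)} distinct images/s")

With 24 fps film carried in a 60-field system (3:2 pulldown) the mismatch
is larger still: 60 fields per second but only 24 distinct images.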

Sylvia.