Posted to sci.electronics.repair
Geoffrey S. Mendelson
Subject: 120hz versus 240hz

Arfa Daily wrote:
> Well, I dunno how else to put it. I'm just telling it as I was taught it
> many years ago at college. I was taught that short-persistence phosphors
> were used on TV display CRTs to prevent motion blur, and that the
> individual images were integrated into a moving image by persistence of
> vision, not by the phosphor. I was also taught that the combination of POV
> and the short decay time of the phosphor led to a perceived flicker in the
> low frame-rate images, so the technique of splitting each frame into two
> interlaced fields, transmitted sequentially, was then born, which totally
> overcame this shortcoming of the system. Always made sense to me. Always
> made sense also that any replacement technology had to obey the same 'rule'
> of putting the still image up very quickly, and not leaving it there long,
> to achieve the same result.
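
To make the interlacing idea concrete, here is a minimal Python sketch (the toy six-line "frame" and the function name are invented for illustration): one progressive frame is split into two fields, each carrying half the lines, which are then transmitted one after the other.

```python
# Minimal sketch: split a progressive frame into two interlaced fields.
# The 6-line "frame" and the function name are made up for illustration.

def split_into_fields(frame):
    """Return (odd_field, even_field): the odd- and even-numbered lines."""
    odd_field = frame[0::2]   # lines 1, 3, 5, ... (counting from 1)
    even_field = frame[1::2]  # lines 2, 4, 6, ...
    return odd_field, even_field

frame = ["line1", "line2", "line3", "line4", "line5", "line6"]
odd, even = split_into_fields(frame)

# 25 frames/s sent as 50 fields/s: the screen is refreshed twice as often
# without doubling the amount of picture information transmitted.
print(odd)   # ['line1', 'line3', 'line5']
print(even)  # ['line2', 'line4', 'line6']
```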


It's more complicated than that. You only ever see one image, which is
assembled in your brain from several sources. Most of the information comes
from the rods in your eyes; they are, as it were, light-level (monochrome)
sensors, and they are by far the most numerous. This means most of what you
see comes from the combination of two sets of monochrome images containing
slightly to wildly different information.

Then there are the cones, the color sensors. There are far fewer of them,
and they are less sensitive to light, which is why night vision is black
and white.

There is also a blind spot in each eye, where the optic nerve attaches to the retina.

None of these shows up on its own; they are all integrated into the one
image you see. You never notice that you have two blind spots, you don't
notice the lower resolution of the color information (due to the smaller
number of cones), and you rarely, if ever, notice the difference between
your eyes.

If, for example, you needed glasses in one eye but not the other, or had a
lens that was not quite properly prescribed, the combined image would still
appear sharp, rather than blurred on one side and sharp on the other.

Lots of tricks have been used over the years to take advantage of the
limitations of this "equipment" and of the process. For example, anything
faster than about 24 frames per second is not perceived as a series of
discrete images, but as one smoothly moving image.

The 50 and 60 fields per second rates (a field being half an interlaced
frame) were chosen not because they needed to be that fast (48 would have
done), but to match the local mains frequency and so eliminate interference
and beat effects from electric lighting.
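
A rough back-of-the-envelope way to see why matching the mains frequency matters; the numbers below are purely illustrative, not measurements:

```python
# Rough sketch: the visible "beat" between the field rate and mains-related
# interference (hum bars, lighting flicker) is the difference between the
# two frequencies. The values are illustrative only.

def beat_hz(field_rate_hz, mains_hz):
    return abs(field_rate_hz - mains_hz)

print(beat_hz(48, 50))  # 2 Hz  -> a slow, very visible rolling pattern
print(beat_hz(50, 50))  # 0 Hz  -> any interference pattern stands still
print(beat_hz(60, 60))  # 0 Hz  -> the same idea on 60 Hz mains
```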

Color is another issue. The NTSC (an approach later adopted by the BBC for
PAL) determined that a 4:1 color system was good enough, i.e. the color
information only needed to change (and be recorded) at a quarter of the
rate of the light level.

In modern terms, it means that for every four pixels you only have to carry
the color information once. Your eye can resolve the difference in light
levels at that scale, but not the difference in colors.

This persists to this day: MPEG-type encoding is based on it. It is not the
red-green-blue, red-green-blue, red-green-blue, red-green-blue of a still
picture or a computer screen; it is the light-level, light-level,
light-level, light-level, one-color-for-all-four encoding that was used by
NTSC and PAL.
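
A minimal sketch of that "four light levels, one color" idea, using the standard BT.601 luma weights; the pixel values, the 4-pixel grouping, and the function name are made up for illustration (real MPEG subsampling patterns such as 4:2:0 differ in detail):

```python
# Minimal sketch of "light level per pixel, one color for every four pixels".
# Uses approximate BT.601 weights; pixel data and names are illustrative.

def encode_group_of_four(rgb_pixels):
    """Encode 4 RGB pixels as 4 luma values plus one shared chroma pair."""
    assert len(rgb_pixels) == 4
    lumas = []
    for r, g, b in rgb_pixels:
        lumas.append(0.299 * r + 0.587 * g + 0.114 * b)  # light level (Y)
    # One color-difference pair (Cb, Cr), averaged over the whole group.
    avg_r = sum(p[0] for p in rgb_pixels) / 4
    avg_g = sum(p[1] for p in rgb_pixels) / 4
    avg_b = sum(p[2] for p in rgb_pixels) / 4
    avg_y = 0.299 * avg_r + 0.587 * avg_g + 0.114 * avg_b
    cb = 0.564 * (avg_b - avg_y)
    cr = 0.713 * (avg_r - avg_y)
    return lumas, (cb, cr)

pixels = [(200, 30, 30), (190, 35, 28), (180, 40, 25), (170, 45, 22)]
lumas, chroma = encode_group_of_four(pixels)
print(lumas)   # four light-level samples, one per pixel
print(chroma)  # a single (Cb, Cr) pair shared by all four pixels
# 4 RGB pixels = 12 samples; this stores 4 + 2 = 6, i.e. half the data.
```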

In the end, IMHO, it's not the frame rates or the color encoding methods at
all, as they were fixed around 1960 and have not changed; it is the display
technology as your brain perceives it.

No matter what anyone says here, it is the combination of the exact
implementation of the display technology and your brain that matters. If
the combination looks good and you are comfortable watching it, then a
25 fps CRT, a 100 fps LED screen, or even a 1000 fps display (if there were
such a thing) will look good, provided everything combined produces good
images in YOUR brain, and bad if some part of the combination produces
something "wrong".



> If you think about it, the only 'real' difference between an LCD panel and
> a plasma panel is the switching time of the individual elements. On the LCD
> panel this is relatively long, whereas on the plasma panel it is short.
> The LCD panel suffers from motion blur, but the plasma panel doesn't. Ergo,
> it's the element switching time which causes this effect ... ??


There is more to it than that, too. An LCD cell is like a shutter: the
crystal pivots on its axis and is either open or closed. Except not really:
there is a finite time to go from closed (black) to open (lit), and
therefore a gradual build-up of brightness.

Plasma displays are gas-discharge devices: a cell glows only while there is
enough voltage to "fire" it, and goes dark once the voltage drops below the
level needed to sustain the glow. Their switching speed therefore depends
mostly on the speed of the control electronics, rather than on the (other)
physics that limit an LCD, such as the viscosity of the medium the crystals
sit in, temperature, etc.
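
A crude numerical sketch of that difference; the 4 ms LCD response time and the instantaneous plasma model are invented round numbers, not measurements:

```python
import math

# Crude sketch: brightness of one pixel during the first few milliseconds
# after it is told to go from black to full white. The 4 ms LCD time
# constant and the step-function plasma model are illustrative only.

LCD_TAU_MS = 4.0  # assumed liquid-crystal response time constant

def lcd_brightness(t_ms):
    # Exponential approach to full brightness as the crystal rotates open.
    return 1.0 - math.exp(-t_ms / LCD_TAU_MS)

def plasma_brightness(t_ms):
    # Gas-discharge cell modelled as simply firing or not firing.
    return 1.0 if t_ms >= 0 else 0.0

for t in (0, 1, 2, 4, 8, 16):
    print(f"t={t:2d} ms  LCD={lcd_brightness(t):.2f}  plasma={plasma_brightness(t):.2f}")
# The LCD spends several milliseconds partially lit; the plasma does not,
# which is the "element switching time" difference discussed above.
```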

That is the aim of LED-backlit TV screens (besides lower power consumption,
less heat, etc.): the backlight is only lit when the crystals are "open",
so there is no time during which you see partially lit "pixels".
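
Continuing the same toy model: if the backlight is strobed so that it only comes on once the crystal has finished opening, the partially open states never reach the viewer. The 4 ms time constant and the 90% "fully open" threshold below are arbitrary illustrative values.

```python
import math

# Toy model of a strobed LED backlight: the light reaching the viewer is
# backlight_on * lcd_transmission. The 4 ms time constant and the 90%
# threshold are arbitrary illustrative values, not real panel figures.

LCD_TAU_MS = 4.0

def lcd_transmission(t_ms):
    return 1.0 - math.exp(-t_ms / LCD_TAU_MS)

def backlight_on(t_ms, threshold=0.9):
    # Strobe: keep the backlight off until the crystal is essentially open.
    return 1.0 if lcd_transmission(t_ms) >= threshold else 0.0

for t in (0, 2, 4, 8, 10, 12):
    seen = backlight_on(t) * lcd_transmission(t)
    print(f"t={t:2d} ms  transmission={lcd_transmission(t):.2f}  viewer sees={seen:.2f}")
# With the backlight gated, the visible output jumps from 0 straight to
# ~0.9+, so the half-open states are never shown to the eye.
```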

Geoff.


--
Geoffrey S. Mendelson, Jerusalem, Israel N3OWJ/4X1GM
New word I coined 12/13/09, "Sub-Wikipedia", adj., describing knowledge or
understanding, as in "he has a sub-Wikipedia understanding of the situation",
i.e. possessing fewer facts or less information than can be found in Wikipedia.