Electronics Repair (sci.electronics.repair) Discussion of repairing electronic equipment. Topics include requests for assistance, where to obtain servicing information and parts, techniques for diagnosis and repair, and anecdotes about successes, failures and problems.

#41 Posted to sci.electronics.repair

On 2/26/2010 8:46 AM Meat Plow spake thus:

On Thu, 25 Feb 2010 15:47:16 -0800, "William Sommerwerck"
wrote:

Ignoring the fact that colour displays are finely tuned
to the way that human colour vision works, and an alien
would likely wonder what we'd been smoking.


This has nothing whatever to do with color rendition.

Who is Sylvia, anyway?


A troll?


Why would you jump to that conclusion?

Oh, forgot; that's what you do.


--
You were wrong, and I'm man enough to admit it.

- a Usenet "apology"
#42 Posted to sci.electronics.repair

"Phil Allison" wrote in message
...

"Arfa Daily"
"Phil Allison"
"William Sommer****** IDIOT "

First, the only televisions that use LEDs use OLEDs. There are none
using conventional LEDs.


** Fraid " LED TVs " are on sale all over the world right now.

****WIT !!

http://en.wikipedia.org/wiki/LED-backlit_LCD_television



Your Wiki reference says it all. These are NOT LED televisions,


** But they are called " LED TVs " by their makers and so are

*KNOWN BY THAT NAME* to members of the public.


Fools like YOU and Sommer****** would complain that a bottle of "Steak
Sauce" contained no steak.



.... Phil


I guess we should refer to all LCD sets by their backlight type. That makes
the one on my wall a CCFL TV. And I guess all of those DLP, LCD, D-ILA,
SXRD, and LCoS projection sets should be called mercury vapor or whatever
type of lamp they use. And the new projectors could be called LED
projectors as well, even if they are DLP devices.

The point is that referring to the set by the type of backlight it uses is
very misleading and is causing much confusion in the marketplace.

Leonard

#43 Posted to sci.electronics.repair


Meat Plow


** And if you put the remark back into its context - what it IS relevant
to becomes obvious.


No it doesn't.



** Yes it does - you ASD ****ED TENTH WIT !




#44 Posted to sci.electronics.repair


Meat Plow wrote in message ...
On Fri, 26 Feb 2010 02:32:38 -0000, "Arfa Daily"
wrote:


"William Sommerwerck" wrote in message
...
First, the only televisions that use LEDs use OLEDs. There are none
using conventional LEDs.

Second, there are no strict definitions of what these refresh rates mean.
In some cases, the set generates an interpolated image at that rate; in
others, a blank (black) raster is inserted. Some sets combine both.
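A minimal sketch (Python, illustrative only) of the two schemes just
described, with frames reduced to plain brightness numbers; real sets
interpolate whole images, usually with motion estimation rather than the
simple blend used here:

# Sketch: two ways a TV can turn 60 source frames/s into 120 display
# frames/s. Frames are modelled as plain brightness values.

def interpolate_120hz(frames):
    """Insert a blended in-between frame after each source frame."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        out.append((a + b) / 2)   # stand-in for a motion-compensated frame
    out.append(frames[-1])
    return out

def black_frame_insertion_120hz(frames):
    """Insert a black frame after each source frame (impulse-like display)."""
    out = []
    for f in frames:
        out.extend([f, 0.0])      # 0.0 = blank (black) raster
    return out

source = [0.2, 0.4, 0.6, 0.8]                # 60 Hz source frames
print(interpolate_120hz(source))             # 0.2, 0.3, 0.4, 0.5, ...
print(black_frame_insertion_120hz(source))   # 0.2, 0.0, 0.4, 0.0, ...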

I don't like this enhancement (which was one of the reasons I bought a
plasma set). It has a nasty side-effect -- it makes motion pictures look
like video. This might be fine for a TV show; it isn't when you're watching
movies. Be sure that whatever set you purchase has some way of defeating
the enhancement.

You need to actually look at the sets you're considering with program
material you're familiar with.



Seconded on all counts, and also the reason that I recently bought a
plasma TV (Panasonic, 50" full HD panel, 400 Hz). I have not seen a single
thing about this TV that I don't like so far, unlike the LCD TVs that I
have in the house, and the LCDs that cross my bench for repair, all of
which suffer from motion artifacts, scaling artifacts, and motion blur ...

This plasma TV has produced absolutely stunning HD pictures from the
Winter Olympics, with not the slightest sign of motion artifacts of any
description, even on the fastest content like downhill skiing, bobsleigh
etc. In contrast, the same content that I have seen on LCDs has been
perfectly dreadful.

Arfa


Maybe I'm not picky, but those motion artifacts just aren't important
enough for me to want to spend thousands on something that doesn't
produce them. I have a fairly cheap 32" and while it does produce some
artifacts, they are insignificant to the overall performance.



But the point is that you no longer have to pay thousands to get that
performance. The plasma that I recently bought cost little more than a
'good' LCD, but the performance is easily a whole order of magnitude
higher. CRT sets did not suffer from motion artifacts, and I wasn't
prepared to 'downgrade' my viewing experience by buying something which
did. The LCD that I have in my kitchen, also 32" and also 'fairly cheap',
does suffer from motion artifacts, which are particularly noticeable on
high-speed stuff like the Winter Olympics. I actually do find these
significant and annoying, and I would not consider having to put up with
such a picture on my main TV. Fortunately, the latest generation of
affordable plasmas means that I don't have to :-)

Arfa


#45 Posted to sci.electronics.repair

On 27/02/2010 1:22 AM, William Sommerwerck wrote:
LCDs don't flicker anyway, regardless of their frame rate. The frame
rate issue relates to addressing the judder you get as a result of
the image consisting of a sequence of discrete images, rather than
one that continuously varies.


Not quite, otherwise the issue would occur with plasma displays. Indeed, it
would with any moving-image recording system.

The problem is that LCDs don't respond "instantaneously". They take a finite
time to go from opaque to the desired transmission level, and then back
again. The result is that the image can lag and "smear". (25 years ago, the
first pocket LCD color TVs from Casio had terrible smear, which added an
oddly "artistic" quality to sports.)

For reasons not clear to me, adding interpolated images reduces the smear.
This makes absolutely no sense whatever, as the LCD now has /less/ time to
switch. I've never gotten an answer on this.


Many years ago (using a Sinclair Spectrum no less) I noticed an effect
whereby if a small character-sized square was moved across the screen in
character-sized steps, the eye perceived phantom squares at intervening
positions. Since the computer was not displaying these intermediate
squares, their appearance must have been due to the eye. The likely
explanation was that the eye was traversing the screen smoothly to
follow the square, but the square itself was moving in discrete steps.
So the eye was causing the image of the square to be smeared across the
retina. I was seeing this effect on a CRT screen, but the longer the
persistence of the image on the screen the worse the effect would be.
Interpolating the position of the image on the screen would reduce that
effect.

However, I can't explain why this would be less pronounced on a plasma
screen.
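The effect described above can be put into rough numbers. A minimal
sketch, assuming the eye tracks the moving square smoothly while the
screen holds each position for a full frame (illustrative figures only):

# Sketch: retinal smear when the eye tracks smoothly but the image
# jumps once per frame. Smear ~ tracking speed x time the frame is lit.

def smear_px(speed_px_per_s, frame_rate_hz, duty=1.0):
    """Blur width in pixels; duty is the lit fraction of the frame period
    (1.0 for a hold-type display, much less for a short-persistence
    phosphor or a short flash per frame)."""
    return speed_px_per_s * duty / frame_rate_hz

speed = 600                      # tracked motion, pixels per second
print(smear_px(speed, 50))       # 50 Hz, held all frame: 12.0 px of smear
print(smear_px(speed, 100))      # interpolated to 100 Hz: 6.0 px
print(smear_px(speed, 50, 0.1))  # short flash per frame: 1.2 px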


It doesn't help that much TV material that was recorded on film is
transmitted with odd and even interlaced fields that are scans of
the same underlying image (or some variation thereon), so that the
effective refresh rate is considerably lower than the interlaced rate.


Interlaced images can be de-interlaced. Note that most product reviews test
displays for how well they do this.


They have to be deinterlaced for display on any screen with significant
persistence, but deinterlacing doesn't increase the underlying frame rate.
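A minimal sketch of the two classic de-interlacing strategies, with
illustrative field labels, showing why neither adds new source images:

# Sketch: "weave" pairs fields that came from the same film frame;
# "bob" shows each field on its own, doubling the display rate but
# not the number of source images.

def weave(fields):
    return list(zip(fields[0::2], fields[1::2]))

def bob(fields):
    return [(f,) for f in fields]

fields = ["A-odd", "A-even", "B-odd", "B-even"]  # two film frames, A and B
print(weave(fields))   # [('A-odd', 'A-even'), ('B-odd', 'B-even')]
print(bob(fields))     # four display frames, still only two source images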

Sylvia.



#46 Posted to sci.electronics.repair

snip


"Sylvia Else" wrote in message
...

Many years ago (using a Sinclair Spectrum no less) I noticed an effect
whereby if a small character-sized square was moved across the screen in
character-sized steps, the eye perceived phantom squares at intervening
positions. Since the computer was not displaying these intermediate
squares, their appearance must have been due to the eye. The likely
explanation was that the eye was traversing the screen smoothly to follow
the square, but the square itself was moving in discrete steps. So the eye
was causing the image of the square to be smeared across the retina. I was
seeing this effect on a CRT screen, but the longer the persistence of the
image on the screen the worse the effect would be. Interpolating the
position of the image on the screen would reduce that effect.

However, I can't explain why this would be less pronounced on a plasma
screen.




Because LCD cells are painfully slow at switching, which equates to a long
persistence phosphor on a CRT, which, as you say yourself, makes the effect
worse. Plasma cells are very fast, particularly now that 'pre-fire'
techniques are used to 'ready' them for switching. This is the equivalent of
having a very short persistence phosphor on a CRT. If you arrange for the
drive electronics to be able to deliver the cell drives with no more delay
than the cells themselves are contributing, then the result will be smooth
motion without any perceivable blur, which is pretty much how it was with a
standard domestic CRT-based CTV.

Arfa




It doesn't help that much TV material that was recorded on film is
transmitted with odd and even interlaced fields that are scans of
the same underlying image (or some variation thereon), so that the
effective refresh rate is considerably lower than the interlaced rate.


Interlaced images can be de-interlaced. Note that most product reviews
test
displays for how well they do this.


They have to be deinterlaced for display on any screen with significant
persistence, but deinterlacing doesn't increase the underlying frame rate.

Sylvia.



#47 Posted to sci.electronics.repair

On 28/02/2010 12:12 AM, Arfa Daily wrote:
snip


"Sylvia Else" wrote in message
...

Many years ago (using a Sinclair Spectrum no less) I noticed an effect
whereby if a small character-sized square was moved across the screen in
character-sized steps, the eye perceived phantom squares at intervening
positions. Since the computer was not displaying these intermediate
squares, their appearance must have been due to the eye. The likely
explanation was that the eye was traversing the screen smoothly to follow
the square, but the square itself was moving in discrete steps. So the eye
was causing the image of the square to be smeared across the retina. I was
seeing this effect on a CRT screen, but the longer the persistence of the
image on the screen the worse the effect would be. Interpolating the
position of the image on the screen would reduce that effect.

However, I can't explain why this would be less pronounced on a plasma
screen.




Because LCD cells are painfully slow at switching, which equates to a long
persistence phosphor on a CRT, which, as you say yourself, makes the effect
worse. Plasma cells are very fast, particularly now that 'pre-fire'
techniques are used to 'ready' them for switching. This is the equivalent of
having a very short persistence phosphor on a CRT. If you arrange for the
drive electronics to be able to deliver the cell drives with no more delay
than the cells themselves are contributing, then the result will be smooth
motion without any perceivable blur, which is pretty much how it was with a
standard domestic CRT-based CTV.

Arfa


It seems to me that the effect would be visible on any display that has
any degree of persistence. Even if LCDs switched instantaneously, they'd
still be displaying the image for the full frame time and then
instantaneously switching to the next image. This would produce the
smearing effect in the way I've described. To avoid it, one needs a
display that produces a short bright flash at, say, the beginning of the
display period, and remains dark for the rest of the time. As I
understand plasma displays, that's not how they work.

Sylvia.
#48 Posted to sci.electronics.repair

In article ,
Arfa Daily wrote:
You should also be aware that there are several 'resolutions' of screen and
drive to take into consideration. Almost all TV showrooms, both here and in
the US, tend to have the sets running on at least an HD picture, and often a
Blu-ray picture. This makes them look very good at first glance. The problem
is that in normal day-to-day use, when you get it back home, you are going to
be watching standard-resolution terrestrial broadcasts on it, and on many
sets these look pretty dreadful; it is the reason that so many people are
disappointed with their purchase when they get it home, and think that it is
not what they saw in the store.


At least in my area (Seattle), all the mainstream over-the-air stations
are HD now. (Each station has its own transmitter, so they don't have
shared "multiplexes" like in the UK.) The typical situation is an
HD subchannel with the main program and one SD subchannel with some
secondary service (sports, weather, old movies, old TV shows, or an
SD copy of the main signal to feed analog cable).

There's a smaller number of stations that trade selection for resolution,
running four or five SD subchannels. Christian broadcasting and
speciality stuff like the secondary channels on the other stations.

The only way you'd get stuck with standard definition on the national
networks is to still be using analog cable. (I'm not familiar with
what you get with the two (subscription only) national direct broadcast
satellite providers).

Mark Zenier
Googleproofaddress(account:mzenier provider:eskimo domain:com)




#49 Posted to sci.electronics.repair


"Sylvia Else" wrote in message
...
On 28/02/2010 12:12 AM, Arfa Daily wrote:
snip


"Sylvia Else" wrote in message
...

Many years ago (using a Sinclair Spectrum no less) I noticed an effect
whereby if a small character-sized square was moved across the screen in
character-sized steps, the eye perceived phantom squares at intervening
positions. Since the computer was not displaying these intermediate
squares, their appearance must have been due to the eye. The likely
explanation was that the eye was traversing the screen smoothly to
follow the square, but the square itself was moving in discrete steps.
So the eye was causing the image of the square to be smeared across the
retina. I was seeing this effect on a CRT screen, but the longer the
persistence of the image on the screen the worse the effect would be.
Interpolating the position of the image on the screen would reduce that
effect.

However, I can't explain why this would be less pronounced on a plasma
screen.




Because LCD cells are painfully slow at switching, which equates to a long
persistence phosphor on a CRT, which, as you say yourself, makes the effect
worse. Plasma cells are very fast, particularly now that 'pre-fire'
techniques are used to 'ready' them for switching. This is the equivalent of
having a very short persistence phosphor on a CRT. If you arrange for the
drive electronics to be able to deliver the cell drives with no more delay
than the cells themselves are contributing, then the result will be smooth
motion without any perceivable blur, which is pretty much how it was with a
standard domestic CRT-based CTV.

Arfa


It seems to me that the effect would be visible on any display that has
any degree of persistence. Even if LCDs switched instantaneously, they'd
still be displaying the image for the full frame time and then
instantaneously switching to the next image. This would produce the
smearing effect in the way I've described. To avoid it, one needs a
display that produces a short bright flash at, say, the beginning of the
display period, and remains dark for the rest of the time. As I
understand plasma displays, that's not how they work.

Sylvia.



I think you are misunderstanding the principles involved here in producing
a picture perceived to have smooth, smear-free movement from a sequence of
still images. Any medium which does this needs to get the image in place as
quickly as possible, and leave it there for a time shorter than the period
required to get the next picture in place. This is true of a cinema picture,
a CRT television picture, an LCD television picture, or a plasma or OLED
picture. Making these still images into a perceived moving image has nothing
to do with the persistence of the phosphor, but is a function of retinal
retention, or 'persistence of vision'. Black-and-white TV CRTs used a
phosphor blend known as 'P4', and tricolour CRTs typically used 'P22'. Both
of these are designated as short persistence types. The green and blue
phosphors used in a colour CRT have persistence times of typically less
than 100 µs, and the red around 200 - 300 µs.

The switching time of modern LCD cells is around 1 - 2 ms, and plasma cells
can switch in around 1 µs. This means that the plasma cell can be switched
very quickly, and then allowed to 'burn' for as long or short a period as
the designer of the TV decides is appropriate - typically, I would think, of
the same order of time as the persistence of a P22 phosphor, thus allowing
the plasma panel to closely match the fundamental display characteristics of
a typical P22 CRT.
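A minimal sketch putting those timings side by side, using the same
speed-times-persistence approximation discussed earlier in the thread (the
LCD's full-frame hold is included, since the cell stays lit until it is
rewritten):

# Sketch: light-emission time per frame for each technology, using the
# figures quoted above, and the smear each implies at 600 px/s tracking.

timings_s = {
    "P22 phosphor (red, worst case)": 300e-6,
    "plasma cell switching":          1e-6,
    "LCD cell switching":             2e-3,
    "LCD full-frame hold (50 Hz)":    1 / 50,   # cell stays lit all frame
}

speed = 600  # tracked motion, pixels per second
for name, t in timings_s.items():
    print(f"{name:32s} {t * 1e3:8.3f} ms  ~{speed * t:6.2f} px smear")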

A good description of why the slow switching time of LCD cells is still a
problem in terms of motion blur, and what the manufacturers do to try to
overcome this, can be found at

http://en.wikipedia.org/wiki/LCD_tel...#Response_time

Arfa


#50 Posted to sci.electronics.repair

On 1/03/2010 12:17 PM, Arfa Daily wrote:
"Sylvia Else" wrote in message
...
On 28/02/2010 12:12 AM, Arfa Daily wrote:
snip


"Sylvia Else" wrote in message
...

Many years ago (using a Sinclair Spectrum no less) I noticed an effect
whereby if a small character-sized square was moved across the screen in
character-sized steps, the eye perceived phantom squares at intervening
positions. Since the computer was not displaying these intermediate
squares, their appearance must have been due to the eye. The likely
explanation was that the eye was traversing the screen smoothly to
follow the square, but the square itself was moving in discrete steps.
So the eye was causing the image of the square to be smeared across the
retina. I was seeing this effect on a CRT screen, but the longer the
persistence of the image on the screen the worse the effect would be.
Interpolating the position of the image on the screen would reduce that
effect.

However, I can't explain why this would be less pronounced on a plasma
screen.



Because LCD cells are painfully slow at switching, which equates to a long
persistence phosphor on a CRT, which, as you say yourself, makes the effect
worse. Plasma cells are very fast, particularly now that 'pre-fire'
techniques are used to 'ready' them for switching. This is the equivalent of
having a very short persistence phosphor on a CRT. If you arrange for the
drive electronics to be able to deliver the cell drives with no more delay
than the cells themselves are contributing, then the result will be smooth
motion without any perceivable blur, which is pretty much how it was with a
standard domestic CRT-based CTV.

Arfa


It seems to me that the effect would be visible on any display that has
any degree of persistence. Even if LCDs switched instantaneously, they'd
still be displaying the image for the full frame time and then
instantaneously switching to the next image. This would produce the
smearing effect in the way I've described. To avoid it, one needs a
display that produces a short bright flash at, say, the beginning of the
display period, and remains dark for the rest of the time. As I understand
plasma displays, that's not how they work.

Sylvia.



I think you are misunderstanding the principles involved here in producing
a picture perceived to have smooth, smear-free movement from a sequence of
still images. Any medium which does this needs to get the image in place as
quickly as possible, and leave it there for a time shorter than the period
required to get the next picture in place. This is true of a cinema picture,
a CRT television picture, an LCD television picture, or a plasma or OLED
picture. Making these still images into a perceived moving image has nothing
to do with the persistence of the phosphor, but is a function of retinal
retention, or 'persistence of vision'.


The fact that a sequence of still images is perceived as a moving
picture is clearly a consequence of visual persistence. And it's obvious
that things will look bad if the images actually overlap. But that's not
what we're discussing.

We're discussing why certain types of display don't do such a good job
despite having a reasonably sharp transition from one image to the next.

The Wikipedia article you cited said that even LCD switching times of
2 ms are not good enough "because the pixel will still be switching while
the frame is being displayed." I find this less than convincing as an
explanation. So what if the pixel is switching while the frame is being
displayed? It's not as if the eye has a shutter, and the transition time
is much less than the eye's persistence time anyway.
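For comparison, a minimal sketch of the two contributions in question,
assuming a hold-type display and a tracked motion of 600 pixels per second
(illustrative figures):

# Sketch: blur contributions on a 50 Hz hold-type LCD for an eye
# tracking motion at 600 pixels per second. Blur ~ speed x time.

speed = 600            # pixels per second
frame_hold = 1 / 50    # image held for the whole 20 ms frame period
transition = 2e-3      # the quoted 2 ms pixel switching time

print(f"full-frame hold: {speed * frame_hold:5.1f} px of smear")  # 12.0
print(f"2 ms transition: {speed * transition:5.1f} px of smear")  #  1.2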

Sylvia.


#51 Posted to sci.electronics.repair

Arfa Daily wrote:
Making these still images into a perceived moving image has nothing to do
with the persistence of the phosphor, but is a function of retinal
retention, or 'persistence of vision'. Black-and-white TV CRTs used a
phosphor blend known as 'P4', and tricolour CRTs typically used 'P22'. Both
of these are designated as short persistence types. The green and blue
phosphors used in a colour CRT have persistence times of typically less
than 100 µs, and the red around 200 - 300 µs.


Short and long persistence are relative terms. Compared to the P1 phosphors
of radar screens and oscilloscopes, P4 phosphors are relatively short
persistence. Compared to an LED they are long persistence.

Note that there is a lot of "wiggle room" in there; supposedly the human
eye can only see at 24 frames per second, which is about 42 ms per frame.

Also note that there are relatively few frame rates in source material.
NTSC TV is 30000/1001 (about 29.97) frames per second, PAL TV is 25. Film is
24, which was sped up to 25 for PAL TV and slowed to 24000/1001 (about
23.976) for NTSC TV.

Film shot for direct TV distribution (MTV really did have some technological
impact) was shot at 30000/1001 frames per second.

Digital TV could be any frame rate, but they have stuck with the old
standards: US digital TV is still the same frame rate as NTSC, and EU etc.
digital TV is still 25 fps.

Lots of video files online are compressed at lower frame rates because of
the way they are shown. The screens still operate at their regular frame
rate; the computer decoding them just repeats frames as necessary.
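A minimal sketch of the 3:2 pulldown used to fit 24 fps film into the
roughly 60 field/s NTSC raster (the extra 0.1% slowdown to 23.976 is
ignored here):

# Sketch: 3:2 pulldown. Four film frames become ten video fields, so
# 24 film frames/s map onto ~60 fields/s.

def pulldown_32(frames):
    fields = []
    for i, frame in enumerate(frames):
        fields.extend([frame] * (3 if i % 2 == 0 else 2))
    return fields

print(pulldown_32(["A", "B", "C", "D"]))
# ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']  (10 fields / 4 frames)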

Geoff.

--
Geoffrey S. Mendelson, Jerusalem, Israel N3OWJ/4X1GM
New word I coined 12/13/09, "Sub-Wikipedia" adj, describing knowledge or
understanding, as in he has a sub-wikipedia understanding of the situation.
i.e possessing less facts or information than can be found in the Wikipedia.
#52 Posted to sci.electronics.repair


"Sylvia Else" wrote in message
...
On 1/03/2010 12:17 PM, Arfa Daily wrote:
"Sylvia Else" wrote in message
...
On 28/02/2010 12:12 AM, Arfa Daily wrote:
snip


"Sylvia Else" wrote in message
...

Many years ago (using a Sinclair Spectrum no less) I noticed an effect
whereby if a small character-sized square was moved across the screen in
character-sized steps, the eye perceived phantom squares at intervening
positions. Since the computer was not displaying these intermediate
squares, their appearance must have been due to the eye. The likely
explanation was that the eye was traversing the screen smoothly to
follow the square, but the square itself was moving in discrete steps.
So the eye was causing the image of the square to be smeared across the
retina. I was seeing this effect on a CRT screen, but the longer the
persistence of the image on the screen the worse the effect would be.
Interpolating the position of the image on the screen would reduce that
effect.

However, I can't explain why this would be less pronounced on a plasma
screen.



Because LCD cells are painfully slow at switching, which equates to a long
persistence phosphor on a CRT, which, as you say yourself, makes the effect
worse. Plasma cells are very fast, particularly now that 'pre-fire'
techniques are used to 'ready' them for switching. This is the equivalent of
having a very short persistence phosphor on a CRT. If you arrange for the
drive electronics to be able to deliver the cell drives with no more delay
than the cells themselves are contributing, then the result will be smooth
motion without any perceivable blur, which is pretty much how it was with a
standard domestic CRT-based CTV.

Arfa

It seems to me that the effect would be visible on any display that has
any degree of persistence. Even if LCDs switched instantaneously, they'd
still be displaying the image for the full frame time and then
instantaneously switching to the next image. This would produce the
smearing effect in the way I've described. To avoid it, one needs a
display that produces a short bright flash at, say, the beginning of the
display period, and remains dark for the rest of the time. As I
understand plasma displays, that's not how they work.

Sylvia.



I think you are misunderstanding the principles involved here in producing
a picture perceived to have smooth, smear-free movement from a sequence of
still images. Any medium which does this needs to get the image in place as
quickly as possible, and leave it there for a time shorter than the period
required to get the next picture in place. This is true of a cinema picture,
a CRT television picture, an LCD television picture, or a plasma or OLED
picture. Making these still images into a perceived moving image has nothing
to do with the persistence of the phosphor, but is a function of retinal
retention, or 'persistence of vision'.


The fact that a sequence of still images is perceived as a moving picture
is clearly a consequence of visual persistence. And it's obvious that
things will look bad if the images actually overlap. But that's not what
we're discussing.

We're discussing why certain types of display don't do such a good job
despite having a reasonably sharp transition from one image to the next.

The Wikipedia article you cited said that even LCD switching times of 2 ms
are not good enough "because the pixel will still be switching while the
frame is being displayed." I find this less than convincing as an
explanation. So what if the pixel is switching while the frame is being
displayed? It's not as if the eye has a shutter, and the transition time
is much less than the eye's persistence time anyway.

Sylvia.


Well, I dunno how else to put it. I'm just telling it as I was taught it
many years ago at college. I was taught that short persistence phosphors
were used on TV display CRTs to prevent motion blur, and that the
individual images were integrated into a moving image by persistence of
vision, not phosphor. I was also taught that the combination of POV and the
short decay time of the phosphor led to a perceived flicker in the low
frame-rate images, so the technique of splitting each frame into two
interlaced fields, transmitted sequentially, was then born, which totally
overcame this shortcoming of the system. Always made sense to me. Always
made sense also that any replacement technology had to obey the same 'rule'
of putting the still image up very quickly, and not leaving it there long,
to achieve the same result.

If you think about it, the only 'real' difference between an LCD panel and
a plasma panel is the switching time of the individual elements. On the LCD
panel this is relatively long, whereas on the plasma panel it is short.
The LCD panel suffers from motion blur, but the plasma panel doesn't. Ergo,
it's the element switching time which causes this effect ... ??

Actually, looking into this a bit further, it seems that POV is nothing like
as simple as it would seem. I've just found another Wiki article

http://en.wikipedia.org/wiki/Persistence_of_vision

which would seem to imply that flat panel display techniques leave the first
image in place until just before the second image is ready to be put up, and
then the third, fourth and so on. If this is the case, and this is the
reasoning behind the fast 'frame rates' that are now being used by
manufacturers, then it would seem reasonable (to me at least) that in order
for this technique to work correctly, the element switching time would have
to be as short as possible, which would explain why a device with a
switching time 1000 times faster than another would produce 'sharper'
pictures with smoother, blur-free motion.

Arfa


#53 Posted to sci.electronics.repair

Arfa Daily wrote:
Well, I dunno how else to put it. I'm just telling it as I was taught it
many years ago at college. I was taught that short persistence phosphors
were used on TV display CRTs to prevent motion blur, and that the
individual images were integrated into a moving image by persistence of
vision, not phosphor. I was also taught that the combination of POV and the
short decay time of the phosphor led to a perceived flicker in the low
frame-rate images, so the technique of splitting each frame into two
interlaced fields, transmitted sequentially, was then born, which totally
overcame this shortcoming of the system. Always made sense to me. Always
made sense also that any replacement technology had to obey the same 'rule'
of putting the still image up very quickly, and not leaving it there long,
to achieve the same result.


It's more complicated than that. You only see one image, which has been
created in your brain from several sources. The most information comes
from the rods in your eyes; they are light-level (monochromatic) sensors,
as it were, and they are the most prevalent. This means most of what you see
is from the combination of two sets of monochrome images with slightly
to wildly different information.

Then there are the cones, or color sensors. There are far fewer of them,
and they are less sensitive to light, which is why night vision is black
and white.

There is also a blind spot in each eye, where the optic nerve attaches to the retina.

None of these show up on their own; they are all integrated into the one
image you see. You never notice that you have two blind spots, you don't
notice the lack of clarity in colors (due to the smaller number of cones),
and rarely, if ever, do you notice the difference between your eyes.

If you were, for example, to need glasses in one eye and not the other, or
have not quite properly prescribed lenses, your image will appear uniformly
sharp, not blurred on one side and sharp on the other.

Lots of tricks have been used over the years to take advantage of the
limitations of the "equipment" and the process. For example, anything faster
than 24 frames a second is not perceived as being discrete images, but one
smooth image.

The 50 and 60 fields per second (a field being half an interlaced frame) were
chosen not because they needed to be that fast (48 would have done), but to
eliminate interference effects from electrical lights.

Color is another issue. The NTSC (whose approach was later adopted for PAL)
determined that a 4:1 color system was good enough, i.e. color information
only needed to change (and be recorded) at 1/4 the rate of the light level.

In modern terms, it means that for every 4 pixels, you only have to have
color information once. Your eye can resolve the difference in light levels,
but not in colors.

This persists to this day; MPEG-type encoding is based on it. It's not the
red-green-blue, red-green-blue, red-green-blue, red-green-blue of a still
picture or a computer screen; it's the light-level, light-level,
light-level, light-level, color-for-all-four encoding that was used by NTSC
and PAL.
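A minimal sketch of the idea, using the simple 4:1 horizontal grouping
described above; real broadcast and MPEG chains use Y'CbCr with 4:2:2 or
4:2:0 sampling, but the principle of full-rate luma and reduced-rate
chroma is the same:

# Sketch: full-rate luma, one chroma sample shared by each group of
# four pixels. Pixels are (luma, chroma) pairs.

def subsample_4to1(pixels):
    lumas = [y for y, _ in pixels]
    chromas = [pixels[i][1] for i in range(0, len(pixels), 4)]
    return lumas, chromas

def reconstruct(lumas, chromas):
    # every pixel keeps its own luma; chroma is repeated across the group
    return [(y, chromas[i // 4]) for i, y in enumerate(lumas)]

pixels = [(10, "red"), (12, "red"), (11, "orange"), (13, "orange"),
          (40, "blue"), (42, "blue"), (41, "blue"), (43, "blue")]
lumas, chromas = subsample_4to1(pixels)
print(chromas)                      # ['red', 'blue'] - 2 samples for 8 px
print(reconstruct(lumas, chromas))  # the 'orange' detail comes back 'red'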

In the end, IMHO, it's not frame rates or color encoding methods at all, as
they were fixed around 1960 and not changed since, but the display
technology as your brain perceives it.

No matter what anyone says here, it's the combination of the exact
implementation of the display technology and your brain that matters. If the
combination looks good, and you are comfortable watching it, a 25 fps CRT, a
100 fps LED screen, or even a 1000 fps display, if there were such a thing,
would look good if everything combined produces good images in YOUR brain,
and bad if some combination produces something "wrong".



If you think about it, the only 'real' difference between an LCD panel and
a plasma panel is the switching time of the individual elements. On the LCD
panel this is relatively long, whereas on the plasma panel it is short.
The LCD panel suffers from motion blur, but the plasma panel doesn't. Ergo,
it's the element switching time which causes this effect ... ??


There is more to it than that. An LCD is like a shutter: it pivots on its
axis and is either open or closed. Well, not quite; there is a finite
time from closed (black) to open (lit), and therefore a build-up of
brightness.

Plasma displays are gas discharge devices; they glow only when there is
enough voltage to "fire" them, and until it drops below the level needed to
sustain the glow. Their speed depends more upon the control electronics than
on the (other) laws of physics - the viscosity of the medium the liquid
crystals sit in, temperature, etc. - that govern an LCD.

That's the aim of LED-backlit TV screens (besides less power consumption,
heat, etc.). They are lit only when the crystals are "open", so there is no
time where you see partially lit "pixels".

Geoff.


--
Geoffrey S. Mendelson, Jerusalem, Israel N3OWJ/4X1GM
New word I coined 12/13/09, "Sub-Wikipedia" adj, describing knowledge or
understanding, as in he has a sub-wikipedia understanding of the situation.
i.e possessing less facts or information than can be found in the Wikipedia.
#54 Posted to sci.electronics.repair

Lots of tricks have been used over the years to take advantage
of the limitations of the "equipment" and the process. For example,
anything faster than 24 frames a second is not perceived as being
discrete images, but one smooth image.


Actually, it's 16 frames a second. However, that rate is not fast enough to
prevent flicker -- which is why silent films were sometimes called
"flickers". This is one of the reasons the frame rate was increased to 24
with the introduction of sound.


The 50 and 60 fields per second (a field being half an interlaced frame)
were chosen not because they needed to be that fast (48 would have
done), but to eliminate interference effects from electrical lights.


That's new to me.


Color is another issue. The NTSC (whose approach was later adopted for PAL)
determined that a 4:1 color system was good enough, i.e. color information
only needed to change (and be recorded) at 1/4 the rate of the light level.

NTSC is actually 4.2 MHz to 1.5 MHz, or roughly 2.8 to 1. PAL is closer to 5:1.


That's the aim of LED-backlit TV screens (besides less power consumption,
heat, etc.). They are lit only when the crystals are "open", so there is no
time where you see partially lit "pixels".


I hate to spoil things, Geoff, but liquid crystals are quite capable of
taking intermediate positions -- that is, forming a continuous gray scale.


#55 Posted to sci.electronics.repair

On 1/03/2010 11:13 PM, William Sommerwerck wrote:
Lots of tricks have been used over the years to take advantage
of the limitations of the "equipment" and the process. For example,
anything faster than 24 frames a second is not perceived as being
discrete images, but one smooth image.


Actually, it's 16 frames a second. However, that rate is not fast enough to
prevent flicker -- which is why silent films were sometimes called
"flickers". This is one of the reasons the frame rate was increased to 24
with the introduction of sound.


The 50 and 60 fields per second (a field being half an interlaced frame)
were chosen not because they needed to be that fast (48 would have
done), but to eliminate interference effects from electrical lights.


That's new to me.


Well, the story I heard way back when is that it was to synchronise the
picture's vertical frequency with the mains frequency, so that
inadequacies in power smoothing produced static distortions in the
picture rather than much more noticeable rolling distortions.

Sylvia.


#56 Posted to sci.electronics.repair

The 50 and 60 fields per second (a field being half an interlaced frame)
were chosen not because they needed to be that fast (48 would have
done), but to eliminate interference effects from electrical lights.


That's new to me.


Well, the story I heard way back when is that it was to synchronise
the picture's vertical frequency with the mains frequency, so that
inadequacies in power smoothing produced static distortions in the
picture rather than much more noticeable rolling distortions.


That's what I heard, too. But that's not "interference effects from
electrical lights".


#57 Posted to sci.electronics.repair

William Sommerwerck wrote:
Well, the story I heard way back when is that it was to synchronise
the picture's vertical frequency with the mains frequency, so that
inadequacies in power smoothing produced static distortions in the
picture rather than much more noticeable rolling distortions.


That's what I heard, too. But that's not "interference effects from
electrical lights".


You are assuming that all interference would be on the screen itself and none
would be visual. Since fluorescent and to some extent incandescent lights
blink (what is the persistence of an incandescent light?) at 60 Hz, there is
a strobing effect if there are lights on in the room with the TV.

While some people (me) like to watch TV in the dark, many people watch TVs
with lights on. Some manufacturers went as far as to include light sensors
in their TV sets, automatically adjusting the brightness to compensate for
room lighting as it changes.

Since some people live in places where only fluorescent lights are allowed,
they have no choice if there is interference: either turn off the lights
entirely, or live with it.

I guess that could be a new tourism slogan for this summer, "Visit Israel,
and bring home real light bulbs." :-)

Geoff.

--
Geoffrey S. Mendelson, Jerusalem, Israel N3OWJ/4X1GM
New word I coined 12/13/09, "Sub-Wikipedia" adj, describing knowledge or
understanding, as in he has a sub-wikipedia understanding of the situation.
i.e possessing less facts or information than can be found in the Wikipedia.
#58 Posted to sci.electronics.repair

That's what I heard, too. But that's not "interference effects from
electrical lights".


You are assuming that all interference would be on the screen
itself and none would be visual. Since fluorescent and to some
extent incandescent lights blink (what is the persistence of an
incandescent light?) at 60 Hz, there is a strobing effect if there
are lights on in the room with the TV.


Incandescent lights have almost no flicker, due to the thermal inertia of
the filament. Fluorescent lighting was not common in living rooms at the
time the standards were set.


#59 Posted to sci.electronics.repair


"Geoffrey S. Mendelson" wrote in message
...
Arfa Daily wrote:
Well, I dunno how else to put it. I'm just telling it as I was taught it
many years ago at college. I was taught that short persistence phosphors
were used on TV display CRTs to prevent motion blur, and that the
individual images were integrated into a moving image by persistence of
vision, not phosphor. I was also taught that the combination of POV and the
short decay time of the phosphor led to a perceived flicker in the low
frame-rate images, so the technique of splitting each frame into two
interlaced fields, transmitted sequentially, was then born, which totally
overcame this shortcoming of the system. Always made sense to me. Always
made sense also that any replacement technology had to obey the same 'rule'
of putting the still image up very quickly, and not leaving it there long,
to achieve the same result.


It's more complicated than that. You only see one image, which has been
created in your brain from several sources. The most information comes
from the rods in your eyes; they are light-level (monochromatic) sensors,
as it were, and they are the most prevalent. This means most of what you see
is from the combination of two sets of monochrome images with slightly
to wildly different information.

Then there are the cones, or color sensors. There are far fewer of them,
and they are less sensitive to light, which is why night vision is black
and white.

There is also a blind spot in each eye, where the optic nerve attaches to the retina.

None of these show up on their own; they are all integrated into the one
image you see. You never notice that you have two blind spots, you don't
notice the lack of clarity in colors (due to the smaller number of cones),
and rarely, if ever, do you notice the difference between your eyes.

If you were, for example, to need glasses in one eye and not the other, or
have not quite properly prescribed lenses, your image will appear uniformly
sharp, not blurred on one side and sharp on the other.

Lots of tricks have been used over the years to take advantage of the
limitations of the "equipment" and the process. For example, anything
faster than 24 frames a second is not perceived as being discrete images,
but one smooth image.



The difference in resolution between the brightness and colour receptors in
human eyes is well known and understood, but I don't think that this, or any
other physical aspect of the eye's construction, has any effect on the way
that motion is perceived from a series of still images.



The 50 and 60 fields per second (a field being half an interlaced frame)
were chosen not because they needed to be that fast (48 would have done),
but to eliminate interference effects from electrical lights.

Color is another issue. The NTSC (whose approach was later adopted for PAL)
determined that a 4:1 color system was good enough, i.e. color information
only needed to change (and be recorded) at 1/4 the rate of the light level.

In modern terms, it means that for every 4 pixels, you only have to have
color information once. Your eye can resolve the difference in light
levels, but not in colors.

This persists to this day; MPEG-type encoding is based on it. It's not the
red-green-blue, red-green-blue, red-green-blue, red-green-blue of a still
picture or a computer screen; it's the light-level, light-level,
light-level, light-level, color-for-all-four encoding that was used by NTSC
and PAL.

In the end, IMHO, it's not frame rates or color encoding methods at all, as
they were fixed around 1960 and not changed since, but the display
technology as your brain perceives it.



Yes, I was not sure exactly why you were going into all of the colour
encoding issues in the context of LCD motion blur. This has nothing to do
with it. It is the display technology that is causing this. It is simply not
as good as other technologies in this respect, despite all of the efforts of
the manufacturers to make it otherwise ...



No matter what anyone says here, it's the combination of the exact
implementation of the display technology and your brain that matters. If the
combination looks good, and you are comfortable watching it, a 25 fps CRT, a
100 fps LED screen, or even a 1000 fps display, if there were such a thing,
would look good if everything combined produces good images in YOUR brain,
and bad if some combination produces something "wrong".


But this isn't so. A crap picture may, I agree, look 'OK' to someone who
knows no better, but that doesn't alter the fact that it is still a crap
picture that those who *do* know better will see for what it is. LCD panels
produce crap images in terms of motion blur when compared for this effect
to CRTs, plasma panels, and OLEDs.





If you think about it, the only 'real' difference between an LCD panel and
a plasma panel is the switching time of the individual elements. On the LCD
panel this is relatively long, whereas on the plasma panel it is short.
The LCD panel suffers from motion blur, but the plasma panel doesn't. Ergo,
it's the element switching time which causes this effect ... ??


There is more to it than that. An LCD is like a shutter: it pivots on its
axis and is either open or closed. Well, not quite; there is a finite
time from closed (black) to open (lit), and therefore a build-up of
brightness.



I was talking in terms of the fundamental visual principle, in that they are
both matrixed cell-based displays requiring similar frame buffering and
driving techniques in signal terms. I was not referring to the way that each
technology actually produces coloured light from the individual cells, which
is clearly entirely different in both cases, and different again from the
raster-based CRT principle which, like plasma panels, doesn't suffer from
motion blur.



Plasma displays are gas discharge devices; they glow only when there is
enough voltage to "fire" them, and until it drops below the level needed to
sustain the glow. Their speed depends more upon the control electronics than
on the (other) laws of physics - the viscosity of the medium the liquid
crystals sit in, temperature, etc. - that govern an LCD.




It doesn't really rely on the speed of the drive electronics, since there are
techniques used to bring the plasma cells to a 'pre-fire' condition just
below the point at which the gas actually ionises. This allows the cells to
be fired with a small drive voltage, and without having to wait for the cell
to build up to the point where it actually fires. This is how they can get
the switching speed of the cells down to as little as 1 µs.



That's the aim of LED-backlit TV screens (besides less power consumption,
heat, etc.). They are lit only when the crystals are "open", so there is no
time where you see partially lit "pixels".

Geoff.


Hmmm. That's not the way I've seen it described. Most of the hype about this
development seems to concentrate on producing dynamic contrast enhancement
by modulating the LEDs' brightness in an area-specific way, depending on the
picture content in front of them.

Arfa


#60 Posted to sci.electronics.repair

But this isn't so. A crap picture may, I agree, look 'OK' to someone
who knows no better, but that doesn't alter the fact that it is still a
crap picture that those who *do* know better will see for what it is.
LCD panels produce crap images in terms of motion blur when compared
for this effect to CRTs, plasma panels, and OLEDs.


I've seen plasma and OLED. I can see differences between those and
standard LCD panels, but not in my wildest dreams would I call them
crap. Most of my viewing is done in standard 480p 4:3 aspect cropped
to fill the screen. I don't need a high-dollar plasma set for that; it
would be overkill. I own a Sony 720p/1080i HDMI upscaling DVDR that
produces sharp, clear video. Once in a while I do notice a scan wave
because of the upscaling, but in the grand scheme of things those
things are very forgettable.


The 32" Vizio LCD in my den has a very wide viewing angle and does not show
significant smearing or blurring with rapid motion. (I paid about $380 for
it.)

With respect to scaling... People here and elsewhere have said they see no
point to Blu-ray disks, as they see little or no difference with upscaled
DVDs. Ergo, Blu-rays are a ripoff. I watched the Blu-ray of "The Sixth
Sense" yesterday, which threw this issue into sharp perspective.

The transfer is typical Disney -- extremely sharp and detailed, with rich
colors. It's close to demo quality.

Some of the supplemental material includes scenes from the Blu-ray transfer
that have been letterboxed into a 4:3 image. (Got that?) When I select ZOOM
on my Kuro, that section is blown up to full screen. ("The Sixth Sense" was
shot at 1.85:1.) Viewed in isolation, these images look fine.
They're slightly soft, but one might believe it's the fault of the source
material. They don't look upscaled -- until you compare them with
full-resolution Blu-ray. There is no comparison!




#61 Posted to sci.electronics.repair

In article ,
Sylvia Else wrote:
On 1/03/2010 11:13 PM, William Sommerwerck wrote:
Lots of tricks have been used over the years to take advantage
of the limitations of the "equipment" and the process. For example,
anything faster than 24 frames a second is not perceived as being
discrete images, but one smooth image.


Actually, it's 16 frames a second. However, that rate is not fast
enough to prevent flicker -- which is why silent films were sometimes
called "flickers". This is one of the reasons the frame rate was
increased to 24 with the introduction of sound.


The 50 and 60 fields per second (a field being half an interlaced
frame) were chosen not because they needed to be that fast (48 would
have done), but to eliminate interference effects from electrical
lights.


That's new to me.


Well, the story I heard way back when is that it was to synchronise the
picture's vertical frequency with the mains frequency, so that
inadequacies in power smoothing produced static distortions in the
picture rather than much more noticeable rolling distortions.


UK TV went off mains lock many, many years ago - something like the early
'60s, before colour arrived here, when sets were still valve.

--
*Always drink upstream from the herd *

Dave Plowman London SW
To e-mail, change noise into sound.
#62 Posted to sci.electronics.repair


William Sommerwerck wrote:

The 50 and 60 fields per second (a field being half an interlaced frame)
were chosen not because they needed to be that fast (48 would have
done), but to eliminate interference effects from electrical lights.


That's new to me.


Well, the story I heard way back when is that it was to synchronise
the picture's vertical frequency with the mains frequency, so that
inadequacies in power smoothing produced static distortions in the
picture rather than much more noticeable rolling distortions.


That's what I heard, too. But that's not "interference effects from
electrical lights".



60 Hz was used in the US to prevent hum bars from rolling up or down
the screen due to the difference between the power-line and scan
frequencies. A faint bar would be hard to spot if it was not moving, but
very annoying if it did. People have to remember that the standards were
set when electronics was fairly new, and rather crude designs were used.
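A minimal sketch of the arithmetic: a hum bar drifts at the beat between
the field rate and the mains frequency, so a field rate at or very near
the mains keeps the bar almost stationary (the 59.94 Hz NTSC colour field
rate is used as an example):

# Sketch: a hum bar rolls at the beat between field rate and mains.

def bar_roll_period_s(field_rate_hz, mains_hz=60.0):
    """Seconds for the hum bar to drift one full picture height."""
    beat = abs(field_rate_hz - mains_hz)
    return float("inf") if beat == 0 else 1.0 / beat

print(bar_roll_period_s(60.0))       # locked to mains: bar stands still
print(bar_roll_period_s(60 / 1.001)) # NTSC colour ~59.94 Hz: ~16.7 s/pass
print(bar_roll_period_s(50.0))       # 50 Hz fields on 60 Hz mains: 0.1 s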


--
Greed is the root of all eBay.
#63 Posted to sci.electronics.repair


"Arfa Daily" wrote in message
...

"Geoffrey S. Mendelson" wrote in message
...
Arfa Daily wrote:
Well, I dunno how else to put it. I'm just telling it as I was taught it
many years ago at college. I was taught that short persistence phosphors
were used on TV display CRTs, to prevent motion blur, and that the
individual images were integrated into a moving image, by persistence of
vision, not phosphor. I was also taught that the combination of POV and
the
short decay time of the phosphor, led to a perceived flicker in the low
frame-rate images, so the technique of splitting each frame into two
interlaced fields, transmitted sequentially, was then born, which
totally
overcame this shortcoming of the system. Always made sense to me. Always
made sense also that any replacement technology had to obey the same
'rule'
of putting the still image up very quickly, and not leaving it there
long,
to achieve the same result.


It's more complicated than that. You only see one image, which has been
created in your brain from several sources. The most information comes
from the rods in your eyes, they are light level (monochromatic) sensors,
as it were and they are the most prevalent. This means most of what you
see
is from the combination of two sets of monochrome images with slightly
to wildly different information.

Then there are the cones, or color sensors. There are far less of them
and they are less sensitive to light, which is why night vision is black
and white.

There are also blind spots where the optic nerves attach to the retina.

None of these show up on their own, they are all integrated into the one
image you see. You never notice that you have two blind spots, you don't
notice the lack of clarity in colors (due to the fewer number of spots)
and rarely, if ever do you notice the difference between your eyes.

If you were for example to need glasses in one eye and not the other, or
have
not quite properly prescibed lenses, your image will appear sharp, not
blurred
on one side and sharp on the other.

Lots of tricks have been used over the years to take advantage of the
limitations of the "equipment" and the process. For example, anything
faster
than 24 frames a second is not perceived as being discrete images, but
one
smooth image.



The difference in resolution between the brightness and colour receptors
in human eyes, is well known and understood, but I don't think that this,
or any other physical aspect of the eye's construction, has any effect on
the way that motion is perceived from a series of still images.



The 50 and 60 fields per second (a field being half an interlaced frame)
were
chosen not because they needed to be that fast (48 would have done), but
to
eliminate interefence effects from electrical lights.

Color is another issue. The NTSC (and later adopted by the BBC for PAL)
determined that a 4:1 color system was good enough, i.e. color
information
only needed to be changed (and recorded) at 1/4 the speed of the light
level.

In modern terms, it means that for every 4 pixels, you only have to have
color information once. Your eye can resolve the difference in light
levels,
but not in colors.

This persists to this day, MPEG type encoding is based on that, it's not
redgreenblue, redgreenblue, redgreenblue, redgreenblue of a still
picture or a computer screen, it's the lightlevel, lightlevel,
lightlevel, lightlevel colorforallfour encoding that was used by NTSC
and PAL.

In the end, IMHO, it's not frame rates, color encoding methods, at all,
as
they were fixed around 1960 and not changed, but it is display technology
as
your brain perceives it.



Yes, I was not sure exactly why you were going into all of the colour
encoding issues in the context of LCD motion blur. This has nothing to do
with it. It is the display technology that is causing this. It is simply
not as good as other technologies in this respect, despite all of the
efforts of the manufacturers to make it otherwise ...



No matter what anyone says here, it's the combination of exact
implementation
of display technology, and your brain that matter. If the combination
looks good, and you are comfortable watching it, a 25 fps CRT, or a
100FPS
LED screen, or even a 1000 FPS display, if there was such a thing, would
look
good if everything combined produce good images in YOUR brain, and
bad if some combination produces something "wrong".



But this isn't so. A crap picture may, I agree, look 'ok' to someone who
knows no better, but that doesn't alter the fact that it is still a crap
picture that those who *do* know better, will see for what it is. LCD
panels produce crap images in terms of motion blur, and when compared for
this effect to CRTs, plasma panels, and OLEDs.





If you think about it, the only 'real' difference between an LCD panel and
a plasma panel is the switching time of the individual elements. On the
LCD panel, this is relatively long, whereas on the plasma panel, it is
short. The LCD panel suffers from motion blur, but the plasma panel
doesn't. Ergo, it's the element switching time which causes this
effect ... ??


There is more to it than that, too. An LCD cell is like a shutter: it
pivots on its axis and is either open or closed. Well, not really - there
is a finite time from closed (black) to open (lit), and therefore a
build-up of brightness.
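
To put a number on that build-up (a minimal sketch - the 8 ms time
constant is an illustrative guess, not a measured panel figure):

# Model an LC cell's black-to-white transition as an exponential rise
# and see how much of a 60 Hz frame (16.7 ms) it eats.
import math

TAU_MS = 8.0           # assumed LC response time constant, milliseconds
FRAME_MS = 1000 / 60   # one frame period at 60 Hz

def brightness(t_ms):
    """Fraction of full brightness t_ms after the drive voltage changes."""
    return 1 - math.exp(-t_ms / TAU_MS)

t90 = -TAU_MS * math.log(0.1)   # time to reach 90% of full brightness
print(f"90% brightness after {t90:.1f} ms "
      f"= {100 * t90 / FRAME_MS:.0f}% of a 60 Hz frame")

With a time constant like that, the cell hasn't even reached 90% by the
time the next frame arrives - and that lag is exactly what shows up as
smear on moving edges.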



I was talking in terms of the fundamental visual principle, in that they
are both matrixed, cell-based displays requiring similar frame buffering
and driving techniques in signal terms. I was not referring to the way
that each technology actually produces coloured light from the individual
cells, which is clearly entirely different in both cases from the
raster-based CRT principle which, like plasma panels, doesn't suffer from
motion blur.



Plasma displays are gas discharge devices; they only glow when there is
enough voltage to "fire" them, and keep glowing until it drops below the
level needed to sustain the glow. Their speed depends more upon the speed
of the control electronics than on any (other) laws of physics -
viscosity of the medium the crystals are in, temperature, etc.




It doesn't really rely on the speed of the drive electronics, since there
are techniques used to bring the plasma cells to a 'pre-fire' condition
just below the point at which the gas actually ionises. This allows the
cells to be fired with a small drive voltage, and without having to wait
for the cell to build up to the point where it actually fires. This is how
they can get the switching speed of the cells down to as little as 1 us.
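
A toy numeric model of why the priming helps (the voltages and slew rate
are invented for illustration, nothing like real panel values):

# Plasma cell 'priming': hold the cell just below its firing voltage so
# only a small, fast data pulse is needed to ignite it.

V_FIRE = 200.0        # voltage at which the gas ionises (assumed)
V_PRIME = 180.0       # pre-fire bias the cell is held at (assumed)
SLEW_V_PER_US = 50.0  # how fast the drive can ramp, V/us (assumed)

def time_to_fire(v_start):
    """Microseconds to ramp from v_start up to the firing voltage."""
    return max(0.0, (V_FIRE - v_start) / SLEW_V_PER_US)

print(f"from 0 V (cold):     {time_to_fire(0.0):.1f} us")
print(f"from 180 V (primed): {time_to_fire(V_PRIME):.1f} us")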



That's the aim of LED-backlit TV screens (besides lower power consumption,
heat, etc.). The LEDs are only lit when the crystals are "open", so there
is no time where you see partially lit "pixels".

Geoff.


Hmmm. That's not the way I've seen it described. Most of the hype about
this development seems to concentrate on producing dynamic contrast
enhancement by modulating the LEDs' brightness in an area-specific way,
depending on the picture content in front of them.

Arfa


The way it was described to me is that there are several hundred LEDs,
each assigned a specific "area" of the screen. So it would seem that if
you have a bright AND a dark area within an individual LED's jurisdiction,
there would be some sort of conflict - unless, of course, such
jurisdictions are actually blended into their neighbours. But they would
still have to average their brilliance. Either way, I can see how there
would be a contrast improvement across the screen as a whole, since more
lights is always better than ONE.
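
In code terms, the per-frame logic would be something like this (a
minimal sketch of zone-based dimming; the max-per-zone rule and the zone
sizes are assumptions, not any manufacturer's actual algorithm):

# LED local dimming: one backlight level per zone, set from the
# brightest pixel the zone covers, then the LCD values are rescaled to
# compensate. Real sets blend neighbouring zones and filter over time.

def local_dim(frame, zone_w, zone_h):
    """frame: 2-D list of pixel luminances 0..1 -> (led_levels, lcd)."""
    rows, cols = len(frame), len(frame[0])
    zrows, zcols = rows // zone_h, cols // zone_w
    leds = [[0.0] * zcols for _ in range(zrows)]
    for r in range(rows):
        for c in range(cols):
            zr = min(r // zone_h, zrows - 1)
            zc = min(c // zone_w, zcols - 1)
            leds[zr][zc] = max(leds[zr][zc], frame[r][c])
    lcd = [[0.0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            led = leds[min(r // zone_h, zrows - 1)][min(c // zone_w, zcols - 1)]
            lcd[r][c] = frame[r][c] / led if led > 0 else 0.0
    return leds, lcd

# A dark frame with one bright spot: only that spot's zone stays lit.
frame = [[0.05] * 8 for _ in range(4)]
frame[1][6] = 1.0
leds, lcd = local_dim(frame, zone_w=4, zone_h=2)
print(leds)   # [[0.05, 1.0], [0.05, 0.05]]

Note the 'conflict' case: inside the one bright zone the dark pixels are
left to the LCD cells alone, so the contrast gain only applies between
zones, not within one.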



  #64   Report Post  
Posted to sci.electronics.repair
external usenet poster
 
Posts: 6,772
Default 120hz versus 240hz


"Mark Zenier" wrote in message
...
In article ,
Arfa Daily wrote:
You should also be aware that there are several 'resolutions' of screen
and drive to take into consideration. Almost all TV showrooms, both here
and in the US, tend to have the sets running on at least an HD picture,
and often a BluRay picture. This makes them look very good at first
glance. Problem is that in normal day-to-day use when you get it back
home, you are going to be watching standard-resolution terrestrial
broadcasts on it, and on many sets these look pretty dreadful, and it is
the reason that so many people are disappointed with their purchase when
they get it home, and think that it is not what they saw in the store.


At least in my area (Seattle), all the mainstream over the air stations
are HD, now. (Each station has its own transmitter, so they don't have
shared "multiplexes" like in the UK). The typical situation is an
HD subchannel with the main program and one SD subchannel with some
secondary service (sports, weather, old movies, old TV show, or an
SD copy of the main signal to feed analog cable).

There's a smaller number of stations that trade selection for resolution,
running four or five SD subchannels. Christian broadcasting and
speciality stuff like the secondary channels on the other stations.

The only way you'd get stuck with standard definition on the national
networks is to still be using analog cable. (I'm not familiar with
what you get with the two (subscription only) national direct broadcast
satellite providers).

Mark Zenier
Googleproofaddress(account:mzenier provider:eskimo domain:com)



So what band are we talking here ? Are these UHF digital transmissions ? How
many OTA HD channels would you typically have available in any given area ?
Do you know what compression scheme they are using ?

The digital terrestrial TV being provided here in the UK currently carries
no HD content, despite ongoing promises. This is due to some extent to the
government reneging on a promise to make more of the UHF band available to
the broadcasters. Having now been told that they can't have any more, and
having already filled up what they have got available with multiplexes
carrying 'proper' channels and crap channels in a ratio of about 1 to 5,
the only option the broadcasters are now left with is to use another,
different and non-standard, variant of MPEG-4 compression.

The situation via direct broadcast satellite is much clearer. Here, they
have so much bandwidth available that they are able to carry many HD
channels, so this is where people here get their HD content from.
Unfortunately, the satellite operator charges us another tenner ($15) a
month for the privilege of receiving these transmissions ... :-(

Arfa


  #65   Report Post  
Posted to sci.electronics.repair
external usenet poster
 
Posts: 1,716
Default 120hz versus 240hz


"Geoffrey S. Mendelson"

The 50 and 60 fields per second (a field being half an interlaced frame)
were chosen not because they needed to be that fast (48 would have done),
but to eliminate interference effects from electric lighting.



** But the lights concerned were those being used to illuminate the TV
studio.

When frame rates are not locked to the AC supply frequency, faint shadows
can be seen moving up or down studio images on a monitor or home TV set -
due to the twice per cycle dip in brightness of incandescent lamps.

Other fixes include using lamps with sufficient thermal inertia or groups of
lamps on different phases to eliminate the light modulation.
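
The crawl rate of those shadows is simple beat-frequency arithmetic - a
back-of-envelope sketch, assuming a crystal-locked 50 fields/s display
under nominally 50 Hz incandescent lighting:

# Hum-bar drift when the field rate is not locked to the mains. Filament
# lamps dip in brightness twice per mains cycle (100 Hz ripple on 50 Hz
# mains); against a fixed 50 Hz field rate the pattern stands still, and
# any mains error shows up as a slow crawl at the beat frequency.

FIELD_RATE = 50.0   # fields per second, crystal locked

def bar_drift_hz(mains_hz):
    """Cycles per second the shadow pattern appears to move."""
    ripple = 2.0 * mains_hz                       # brightness dips per second
    nearest_lock = round(ripple / FIELD_RATE) * FIELD_RATE
    return ripple - nearest_lock                  # beat against the field rate

for mains in (50.0, 49.9, 50.2):
    print(f"mains at {mains:4.1f} Hz -> "
          f"bars drift at {bar_drift_hz(mains):+.1f} Hz")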



...... Phil




  #66   Report Post  
Posted to sci.electronics.repair
external usenet poster
 
Posts: 43,017
Default 120hz versus 240hz

In article ,
Arfa Daily wrote:
The digital terrestrial TV being provided here in the UK now, currently
carries no HD content, despite ongoing promises.


Not so. BBC HD is transmitted on FreeView as is ITV HD. CH4 and 5 will be
added shortly. This is from the London transmitter. Not sure about
everywhere.

--
*Who are these kids and why are they calling me Mom?

Dave Plowman London SW
To e-mail, change noise into sound.
  #67   Report Post  
Posted to sci.electronics.repair
external usenet poster
 
Posts: 43,017
Default 120hz versus 240hz

In article ,
Phil Allison wrote:

"Geoffrey S. Mendelson"

The 50 and 60 fields per second (a field being half an interlaced
frame) were chosen not because they needed to be that fast (48 would
have done), but to eliminate interference effects from electric
lighting.



** But the lights concerned were those being used to illuminate the TV
studio.


Studio luminaires are commonly filament lamps, to allow easy control of
level, and because of their continuous-spectrum light output.

When frame rates are not locked to the AC supply frequency, faint
shadows can be seen moving up or down studio images on a monitor or
home TV set - due to the twice per cycle dip in brightness of
incandescent lamps.


In the UK, TV hasn't been mains-locked for about 40 years. I'd guess other
countries are the same. The mains frequency varies too much for modern
requirements.

Other fixes include using lamps with sufficient thermal inertia or
groups of lamps on different phases to eliminate the light modulation.


Fluorescent types are used on location these days, but use high frequency
ballasts. HID types don't run at mains frequency either.

Only time I've seen a phased array used was for a boxing ring - before
high frequency ballasts became common.

--
*I pretend to work. - they pretend to pay me.

Dave Plowman London SW
To e-mail, change noise into sound.
  #68   Report Post  
Posted to sci.electronics.repair
external usenet poster
 
Posts: 6,772
Default 120hz versus 240hz


"Dave Plowman (News)" wrote in message
...
In article ,
Arfa Daily wrote:
The digital terrestrial TV being provided here in the UK now, currently
carries no HD content, despite ongoing promises.


Not so. BBC HD is transmitted on FreeView as is ITV HD. CH4 and 5 will be
added shortly. This is from the London transmitter. Not sure about
everywhere.

--
*Who are these kids and why are they calling me Mom?

Dave Plowman London SW
To e-mail, change noise into sound.


None available on FV here in my east midlands location. Just looked at my
"TV Times" (national) listings mag, and it claims that BBC HD is available
on Freesat CH 108, Sky CH 143 and Virgin Cable CH 108. Likewise, it says
that ITV HD is only available on Freesat via the 'red button' service. In
any case, BBC HD is hardly a useful service, as they just stick a mixture of
their total network output on there at random times. I was recording
"Survivors" on BBC HD via sat on series link. Suddenly, the series finale
has disappeared from the recording list. I check the schedules, and it's
just not on there. Some random olympics programme or something. So I hastily
set it to record on SD BBC. Then, a couple of days later, it randomly
appears again on BBC HD at some obscure time when they had a slot to fill.
ITV HD, from what I've seen of it on the Freesat service, seems to be just
for football matches, once in a while. Either service is hardly inspiring
for people with HD TV sets and a built-in DTTV tuner, as most have.

So I would have to conclude that at the moment, the London area is possibly
unique in carrying these services. Just as a matter of interest, what
equipment is required to receive these FreeView HD transmissions, and has
the compression scheme now been finalised then, to allow manufacturers to
produce necessary equipment in bulk ?

Interesting that you say that CH5 is shortly going to be placing HD content
onto FreeView. At the moment, they have no HD output at all, and I would
have thought that if they were about to start, then the first places would
have been on the Sky satellite service, and Virgin cable, where there is an
existing customer base, with fully operational equipment to allow them to
access and view the service.

Channel Four I can understand wanting to provide a FreeView service as they
already produce an HD mirror of their SD service on Sky and Virgin.

Just as a matter of interest, do you know what cameras they use for
producing their HD content (or their programme makers / suppliers) ? Just
that their HD output is stunningly good compared to some other efforts by
other stations. And I'm talking original 'native' HD here, not just content
that was shot in standard res, and then placed on the station's HD channel.
Taking, for instance, Phil and Kirsty's "Relocation, Relocation" (Wednesday
8pm) programme on CH4. The image quality is absolutely cracking, and
everything you would expect HD to be. Likewise, "Extreme Engineering" on
NatGeo I think it is, and "American Chopper" on Discovery. OTOH, "Lost" and
"24" from Sky 1 both claim to be 'originals' in HD format, but although they
look better in HD than they do in SD, they still seem to lack that
'pin-sharp' quality that the other programmes I've cited, have. As you are
'in the business' so to speak, just wondered if you had any insights into
this ?

Arfa


  #69   Report Post  
Posted to sci.electronics.repair
external usenet poster
 
Posts: 57
Default 120hz versus 240hz

Arfa Daily wrote:

The digital terrestrial TV being provided here in the UK currently carries
no HD content, despite ongoing promises. This is due to some extent to the
government reneging on a promise to make more of the UHF band available to
the broadcasters. Having now been told that they can't have any more, and
having already filled up what they have got available with multiplexes
carrying 'proper' channels and crap channels in a ratio of about 1 to 5,
the only option the broadcasters are now left with is to use another,
different and non-standard, variant of MPEG-4 compression.


It's not nonstandard. MPEG4 is one of those "evolving standards", so that
they can sell you a decoder box or TV that supports the current variants
and next week turn around and sell you a new one.

Or if you have a computer, provide a firmware update.

It gets rid of the problem that CRT TVs had: they did not change fast
enough to get people buying new ones in a fast enough cycle to keep the
companies in business.

I have a spare TV that I bought in 1986 and AFAIK, it still works. We have
not yet switched to digital over the air here (Israel).

Speaking of MPEG4, Israel chose H.264 with AAC audio, a combination no one
had ever used before. The idea was to squeeze as many regular (576i 4:3)
channels as possible into one 8 MHz DVB-T channel.


The situation via direct broadcast satellite is much clearer. Here, they
have so much bandwidth available that they are able to carry many HD
channels, so this is where people here get their HD content from.
Unfortunately, the satellite operator charges us another tenner ($15) a
month for the privilege of receiving these transmissions ... :-(


Same here, but it's 40 NIS ($25).

BTW, where do those HDTV BBC programs come from? They are not over the air?

Geoff.
--
Geoffrey S. Mendelson, Jerusalem, Israel N3OWJ/4X1GM
New word I coined 12/13/09, "Sub-Wikipedia" adj, describing knowledge or
understanding, as in he has a sub-wikipedia understanding of the situation.
i.e possessing less facts or information than can be found in the Wikipedia.
  #70   Report Post  
Posted to sci.electronics.repair
external usenet poster
 
Posts: 1,716
Default WRONG CONTEXT you STUPID MORON


"Dave Plowman (****ing Nut Case Pommy **** )

Studio luminaires are commonly filament lamps.


** DUUUUUHHHHHHHHHH !!!!!!!!!!

WRONG CONTEXT - you ****ing STUPID MORON !



In the UK TV hasn't been mains locked for about 40 years.


** WRONG CONTEXT - you ****ing STUPID MORON !


Fluorescent types are used on location these days,



** WRONG CONTEXT - you ****ing STUPID MORON !


Only time I've seen a phased array used was for a boxing ring



** WRONG CONTEXT - you ****ing STUPID MORON !

Someone PLEEEEASE go and SHOOT this imbecile through the head !!!



..... Phil






  #71   Report Post  
Posted to sci.electronics.repair
external usenet poster
 
Posts: 6,772
Default 120hz versus 240hz


"Geoffrey S. Mendelson" wrote in message
...
Arfa Daily wrote:

[snip]


It's not nonstandard. MPEG4 is one of those "evolving standards", so that
they can sell you a decoder box or TV that supports the current variants
and next week turn around and sell you a new one.

Or if you have a computer, provide a firmware update.

It gets rid of the problem that CRT TVs had: they did not change fast
enough to get people buying new ones in a fast enough cycle to keep the
companies in business.

I have a spare TV that I bought in 1986 and AFAIK, it still works. We have
not yet switched to digital over the air here (Israel).

Speaking of MPEG4, Israel chose H.264 with AAC audio, a combination no one
had ever used before. The idea was to squeeze as many regular (576i 4:3)
channels as possible into one 8 MHz DVB-T channel.


The situation via direct broadcast satellite is much clearer. Here, they
have so much bandwidth available that they are able to carry many HD
channels, so this is where people here get their HD content from.
Unfortunately, the satellite operator charges us another tenner ($15) a
month for the privilege of receiving these transmissions ... :-(


Same here, but it's 40 NIS ($25).

BTW, where do those HDTV BBC programs come from? They are not over the
air?

Geoff.
--


BBC HD is currently available by direct broadcast satellite, and from the
Virgin cable service. I receive it via the former. It would seem that in a
few areas, it is now available via the FreeView DTTV service which is
replacing our current analogue service over the next couple of years.
However, although I receive FreeView from one of the 'main' national
transmitter sites, the FreeView HD service will not be available to me for
some long time yet, according to

http://www.radioandtelly.co.uk/freeviewhd.html

A different DTTV receiver is required, and it looks as though the only one
currently available is 180 quid ($270 ish). I can't see many people wanting
to hang yet another receiver on the end of their 'HD Ready' TV sets, for
that sort of money, and to receive just a few HD services. There is never
going to be the bandwidth available to put more on there, alongside the
other services.

I'm not sure what exactly you mean by "an evolving standard". That seems an
oxymoron if ever I heard one. Either it's a standard, or it's an evolving
system. It can't be both. The sat broadcasters have been using the same
transmission standards for years, and don't seem to suffer problems with
compatibility of receiving equipment. The DTTV service, OTOH, seems to be a
mish-mash compromise system, which has changed 'standards' and names several
times, in an effort to make it do what was, in truth, never going to be
practically possible ...

Arfa


  #72   Report Post  
Posted to sci.electronics.repair
external usenet poster
 
Posts: 3,833
Default 120hz versus 240hz

I'm not sure what exactly you mean by "an evolving standard".
That seems an oxymoron if ever I heard one. Either it's a standard,
or it's an evolving system. It can't be both.


To the best of my understanding, all audio and video codecs carry with
them the information needed to correctly decode the transmission. This
allows (for example) DVDs and Blu-rays to use varying bitrates and
different codecs. (If this isn't right, please correct me.)


  #73   Report Post  
Posted to sci.electronics.repair
external usenet poster
 
Posts: 43,017
Default 120hz versus 240hz

In article ,
Arfa Daily wrote:

"Dave Plowman (News)" wrote in message
...
In article , Arfa Daily
wrote:
The digital terrestrial TV being provided here in the UK now,
currently carries no HD content, despite ongoing promises.


Not so. BBC HD is transmitted on FreeView as is ITV HD. CH4 and 5 will
be added shortly. This is from the London transmitter. Not sure about
everywhere.


None available on FV here in my east midlands location.


They're really just tests at the moment. The full HD service should be
available by the Olympics next year.

Just looked at
my "TV Times" (national) listings mag, and it claims that BBC HD is
available on Freesat CH 108, Sky CH 143 and Virgin Cable CH 108.
Likewise, it says that ITV HD is only available on Freesat via the 'red
button' service.


Well, I don't have a FreeSat receiver but get ITV HD off satellite.

In any case, BBC HD is hardly a useful service, as
they just stick a mixture of their total network output on there at
random times. I was recording "Survivors" on BBC HD via sat on series
link. Suddenly, the series finale has disappeared from the recording
list. I check the schedules, and it's just not on there. Some random
olympics programme or something. So I hastily set it to record on SD
BBC. Then, a couple of days later, it randomly appears again on BBC HD
at some obscure time when they had a slot to fill.


With just the one HD service, choices will be made.

ITV HD, from what
I've seen of it on the Freesat service, seems to be just for football
matches, once in a while.


Some dramas too. The most regular being The Bill.

Either service is hardly inspiring for people
with HD TV sets and a built-in DTTV tuner, as most have.


I'd be surprised if many have an HD set with a built in HD tuner - they've
only just been announced. And an ordinary Freeview tuner won't get HD.

So I would have to conclude that at the moment, the London area is
possibly unique in carrying these services. Just as a matter of
interest, what equipment is required to receive these FreeView HD
transmissions, and has the compression scheme now been finalised then,
to allow manufacturers to produce necessary equipment in bulk ?


FreeView HD tuners are on the market. But I'll not get one until there's a
HD PVR at a reasonable price.

Interesting that you say that CH5 is shortly going to be placing HD
content onto FreeView. At the moment, they have no HD output at all,
and I would have thought that if they were about to start, then the
first places would have been on the Sky satellite service, and Virgin
cable, where there is an existing customer base, with fully operational
equipment to allow them to access and view the service.


The relationship between Sky and its audience is based on making Sky
money. Nothing to do with providing a service.

Channel Four I can understand wanting to provide a FreeView service as
they already produce an HD mirror of their SD service on Sky and Virgin.


Just as a matter of interest, do you know what cameras they use for
producing their HD content (or their programme makers / suppliers) ?
Just that their HD output is stunningly good compared to some other
efforts by other stations. And I'm talking original 'native' HD here,
not just content that was shot in standard res, and then placed on the
station's HD channel. Taking, for instance, Phil and Kirsty's
"Relocation, Relocation" (Wednesday 8pm) programme on CH4. The image
quality is absolutely cracking, and everything you would expect HD to
be. Likewise, "Extreme Engineering" on NatGeo I think it is, and
"American Chopper" on Discovery. OTOH, "Lost" and "24" from Sky 1 both
claim to be 'originals' in HD format, but although they look better in
HD than they do in SD, they still seem to lack that 'pin-sharp' quality
that the other programmes I've cited, have. As you are 'in the
business' so to speak, just wondered if you had any insights into this ?


I'm not sure what gear is used for the progs you mention. But the snag is
some still use filters to soften the image - especially with 'talent' of a
certain age who don't want every wrinkle to show. Near always on drama.
However, anything shot outdoors is likely to look particularly sharp
compared with drama, etc., due to the higher light levels.

The HD drama I work on uses Thompson cameras and is recorded on Panasonic
P2 - memory card based system. The pictures are quite superb - until the
fog filters go in. ;-(

--
*Corduroy pillows are making headlines.

Dave Plowman London SW
To e-mail, change noise into sound.
  #74   Report Post  
Posted to sci.electronics.repair
external usenet poster
 
Posts: 57
Default 120hz versus 240hz

William Sommerwerck wrote:

To the best of my understanding, all audio and video codecs carry with them
the information need to correctly decode the transmission. This allows (for
example) DVDs and Blu-rays to use varying bitrates and different codecs. (If
this isn't right, please correct me.)


No. It's much more complicated than that. AVI files carry information
about the file, such as a codec number each for audio and video, the bit
rate, the frame rate, the number of audio channels, and so on.

Satellite (and DBS) data feeds contain some information, some feeds contain
none at all.

DVDs, Blu-Rays, VCDs, etc., all have a very specific format. DVDs are also
limited to MPEG-2 video encoding (with a limited range of resolutions,
frame rates, etc.). They also allow only a very limited range of audio
encodings.

Sometimes it amazes me that a program such as mplayer or VLC can play a
random file and it works.

The reason the Chinese DVD players can play so many files now is that they
either use the freeware Linux based player, Mplayer, or the proprietary
clone of it written in a language for embedded systems.

Just as an example, someone gave me a sample of the files created by their
DVB-T TV decoder. They are raw MPEG-TS files, encoded with H.264 and AAC.
Nothing I have can open them. :-(
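
For comparison, an AVI announces itself in its first twelve bytes - here's
a minimal Python sniff of just the outer RIFF header (the codec numbers
live in deeper chunks, and a raw MPEG-TS has no such header at all, which
is part of why those captures are so unfriendly):

# Read the 12-byte RIFF header that starts every AVI (or WAV) file.
import struct
import sys

def sniff_riff(path):
    with open(path, "rb") as f:
        header = f.read(12)
    if len(header) < 12:
        return "file too short"
    magic, size, form = struct.unpack("<4sI4s", header)
    if magic != b"RIFF":
        return "not a RIFF container"
    # form is b'AVI ' for AVI files, b'WAVE' for WAV files
    return f"RIFF container, form {form!r}, {size} bytes of chunk data"

if __name__ == "__main__":
    print(sniff_riff(sys.argv[1]))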

Geoff.

--
Geoffrey S. Mendelson, Jerusalem, Israel N3OWJ/4X1GM
New word I coined 12/13/09, "Sub-Wikipedia" adj, describing knowledge or
understanding, as in he has a sub-wikipedia understanding of the situation.
i.e possessing less facts or information than can be found in the Wikipedia.
  #75   Report Post  
Posted to sci.electronics.repair
external usenet poster
 
Posts: 3,103
Default 120hz versus 240hz

"Arfa Daily" wrote in
:


"Geoffrey S. Mendelson" wrote in message
...
Arfa Daily wrote:

The digital terrestrial TV being provided here in the UK now,
currently carries no HD content, despite ongoing promises. This is
due to some extent
on the government reneging on a promise to make more of the UHF band
available to the broadcasters. Having now told them that they can't
have any
more, and the broadcasters having already filled up what they have
got available with multiplexes carrying 'proper' channels and crap
channels in a
ratio of about 1 to 5, the only option that they are now left with
is to use
another different and non standard variant of mpeg 4 compression.


It's not nonstandard. MPEG4 is one of those "evolving standards", so
that they can sell you a decoder box or TV that supports the current
variants and next week turn around and sell you a new one.

Or if you have a computer, provide a firmware update.

It gets rid of the problem that CRT TVs had, they did not change fast
enough
to get people buying new ones in a fast enough cycle to keep the
companies in business.

I have a spare TV that I bought in 1986 and AFAIK, it still works. We
have not yet switched to digital over the air here (Israel).

Speaking of MPEG4, Israel chose H.264 with AAC audio, a combination
no one had ever used before. The idea was to squeeze as many regular
(520p 4:3) channels in one 8mHz DVD-T channel.


The situation via direct broadcast satellite is much clearer. Here,
they have so much bandwidth available that they are able to carry
many HD channels, so this is where people here get their HD content
from. Unfortunately, the satellite operator charges us another
tenner ($15) a month for the privilege of receiving these
transmissions ... :-(


Same here, but it's 40 NIS ($25).

BTW, where do those HDTV BBC programs come from? They are not over
the air?

Geoff.
--


BBC HD is currently available by direct broadcast satellite, and from
the Virgin cable service. I receive it via the former. It would seem
that in a few areas, it is now available via the FreeView DTTV service
which is replacing our current analogue service over the next couple
of years. However, although I receive FreeView from one of the 'main'
national transmitter sites, the FreeView HD service will not be
available to me for some long time yet, according to

http://www.radioandtelly.co.uk/freeviewhd.html

A different DTTV receiver is required, and it looks as though the only
one currently available is 180 quid ($270 ish). I can't see many
people wanting to hang yet another receiver on the end of their 'HD
Ready' TV sets, for that sort of money, and to receive just a few HD
services. There is never going to be the bandwidth available to put
more on there, alongside the other services.

I'm not sure what exactly you mean by "an evolving standard". That
seems an oxymoron if ever I heard one. Either it's a standard, or it's
an evolving system. It can't be both. The sat broadcasters have been
using the same transmission standards for years, and don't seem to
suffer problems with compatibility of receiving equipment. The DTTV
service, OTOH, seems to be a mish-mash compromise system, which has
changed 'standards' and names several times, in an effort to make it
do what was, in truth, never going to be practically possible ...

Arfa




Some PC card interfaces were "standards", yet evolved: ISA evolved into
EISA, and PCI into PCI-E.

And VGA evolved into Super VGA.

Then there's Linux...... ;-)

--
Jim Yanik
jyanik
at
localnet
dot com


  #76   Report Post  
Posted to sci.electronics.repair
external usenet poster
 
Posts: 98
Default WRONG CONTEXT you STUPID MORON

On Wed, 3 Mar 2010 00:02:50 +1100, "Phil Allison"
wrote:


"Dave Plowman (****ing Nut Case Pommy **** )

Studio luminaries are commonly filament lamps.


** DUUUUUHHHHHHHHHH !!!!!!!!!!

WRONG CONTEXT - you ****ing STUPID MORON !



In the UK TV hasn't been mains locked for about 40 years.


** WRONG CONTEXT - you ****ing STUPID MORON !


Fluorescent types are used on location these days,



** WRONG CONTEXT - you ****ing STUPID MORON !


Only time I've seen a phased array used was for a boxing ring



** WRONG CONTEXT - you ****ing STUPID MORON !

Someone PLEEEEASE go and SHOOT this imbecile through the head !!!



.... Phil



You can do that yourself. The concept is simple - insert the muzzle
of the pistol in your ear and pull the trigger. If you are too stupid
to understand the concept, let me know, I'll gladly break it down into
a series of steps even you can follow.

PlainBill
  #77   Report Post  
Posted to sci.electronics.repair
external usenet poster
 
Posts: 81
Default 120hz versus 240hz

In article ,
Arfa Daily wrote:

"Mark Zenier" wrote in message
...
In article ,
Arfa Daily wrote:
You should also be aware that there are several 'resolutions' of screen
and
drive to take into consideration. Almost all TV showrooms both here and in
the US, tend to have the sets running on at least an HD picture, and often
a
BluRay picture. This makes them look very good at first glance. Problem is
that in normal day to day use when you get it back home, you are going to
be
watching standard resolution terrestrial broadcasts on it, and on many
sets,
these look pretty dreadful, and it is the reason that so many people are
disappointed with their purchase when they get it home, and think that it
is
not what they saw in the store.


At least in my area (Seattle), all the mainstream over the air stations
are HD, now. (Each station has its own transmitter, so they don't have
shared "multiplexes" like in the UK). The typical situation is an
HD subchannel with the main program and one SD subchannel with some
secondary service (sports, weather, old movies, old TV show, or an
SD copy of the main signal to feed analog cable).

There's a smaller number of stations that trade selection for resolution,
running four or five SD subchannels. Christian broadcasting and
speciality stuff like the secondary channels on the other stations.

The only way you'd get stuck with standard definition on the national
networks is to still be using analog cable. (I'm not familiar with
what you get with the two (subscription only) national direct broadcast
satellite providers).


So what band are we talking here ? Are these UHF digital transmissions ? How
many OTA HD channels would you typically have available in any given area ?
Do you know what compression scheme they are using ?


Both VHF-High (channels 7-13, 170something to 220? MHz) (3 stations, here)
and UHF (channels 14-51(?), around 500-700 MHz) (another 10, here).
There are some VHF-low band stations in other parts of the country
but I gather that 54-88 MHz has real problems with thunderstorms and
interference.

The US channels are all 6 MHz wide, both UHF and VHF. ATSC uses 8VSB
carrying something around 19 Mbps of MPEG-2. As I understand it, HD will
use about 12 Mbps of that. The over-the-cable version uses a different
modulation, [mumble]-QAM, and has twice the number of bits per second.
HD in the case of ATSC may be only 720p, or 1080i.
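
For anyone curious where the 19 Mbps figure comes from, it's roughly this
sum (small sync overheads are glossed over, so treat it as
back-of-envelope):

# Approximate payload of an ATSC 8VSB channel.
SYMBOL_RATE = 10.762e6     # 8VSB symbols per second in a 6 MHz channel
BITS_PER_SYMBOL = 3        # 8 levels = 3 bits per symbol
TRELLIS = 2 / 3            # trellis coding: 2 data bits out of every 3
REED_SOLOMON = 187 / 207   # RS(207,187): 20 parity bytes per packet
FIELD_SYNC = 312 / 313     # one sync segment per 313-segment field

payload = (SYMBOL_RATE * BITS_PER_SYMBOL * TRELLIS
           * REED_SOLOMON * FIELD_SYNC)
print(f"usable transport stream: about {payload / 1e6:.1f} Mbps")
# -> about 19.4 Mbps for the HD programme plus any SD subchannels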

I can't get them all, (They're clustered in 5 different locations),
but at least 8 (maybe 10) are in HD.

The bucks from auctioning off channels 52(?) to 69 to the cell phone
and wireless companies are what got the government to push this through.

The digital terrestrial TV being provided here in the UK now, currently
carries no HD content, despite ongoing promises. This is due to some extent
on the government reneging on a promise to make more of the UHF band
available to the broadcasters. Having now told them that they can't have any
more, and the broadcasters having already filled up what they have got
available with multiplexes carrying 'proper' channels and crap channels in a
ratio of about 1 to 5, the only option that they are now left with is to use
another different and non standard variant of mpeg 4 compression.


Around here, since nobody had to be nice and share, they just toss off
a few crap channels when they shift to HD. (The PBS non-commercial
stations were about the only ones to do this, as they were early adopters,
and had their transmitters going long before they rebuilt their inside
equipment).

One of the things you have there, judging from the web pages I surfed a
while back, is the audio-only transmissions from the various national
stations. I wish they had done that here, but most of the stations that
used to be combined radio and TV split up into separate corporations
15-20 years ago, so there's no organizational connection anymore.

The situation via direct broadcast satellite is much clearer. Here, they
have so much bandwidth available that they are able to carry many HD
channels, so this is where people here get their HD content from.
Unfortunately, the satellite operator charges us another tenner ($15) a
month for the privilege of receiving these transmissions ... :-(


Sounds cheap to me, I gather you can spend $90 a month (not including
pay per view) to get the full load. Minimum, $30-$40 a month.

Mark Zenier
Googleproofaddress(account:mzenier provider:eskimo domain:com)

  #78   Report Post  
Posted to sci.electronics.repair
external usenet poster
 
Posts: 57
Default 120hz versus 240hz

Mark Zenier wrote:

Sounds cheap to me, I gather you can spend $90 a month (not including
pay per view) to get the full load. Minimum, $30-$40 a month.


It is cheap. The 10 quid (UKP) is an EXTRA fee for HD. You buy whatever
package you want, and if you want HD, you buy the HD package, which is a
few channels in HD.

Most channels are not available in HD.

Here, for example, the DBS system I use has four movie channels; in HD
they have one. If you want all four channels, or a different movie than
the one the HD channel is showing, you have to watch it in regular
definition, which means you had to pay for that channel.

Regular def here is a mixed bag: about 10% of the programs are 16:9, most
are 4:3. The decoder box gives you a choice of always 16:9 (which means
the TV set has to detect the difference and switch), always 4:3, or
letterboxed 16:9 on a 4:3 set.
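
The letterbox case is just aspect-ratio arithmetic - a trivial sketch
(the 720x576 raster is an illustrative PAL-ish example, and square pixels
are assumed for simplicity, which real PAL doesn't quite have):

# Fit a 16:9 picture on a 4:3 raster: scale to full width, pad the rest
# with black bars top and bottom.

def letterbox(screen_w, screen_h, src_aspect=16 / 9):
    """Return (picture_height, bar_height) for the letterboxed image."""
    pic_h = round(screen_w / src_aspect)   # lines the picture occupies
    bar = (screen_h - pic_h) // 2          # black bar above and below
    return pic_h, bar

pic_h, bar = letterbox(720, 576)
print(f"picture {pic_h} lines high, {bar}-line bars top and bottom")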

I have no idea of what really is on HD; I don't have a TV capable of it.

Note that if I were to upgrade to an HDTV, I would not upgrade the
service; I get so much of my program material from other sources that
it's not worth it.

Geoff.

--
Geoffrey S. Mendelson, Jerusalem, Israel N3OWJ/4X1GM
New word I coined 12/13/09, "Sub-Wikipedia" adj, describing knowledge or
understanding, as in he has a sub-wikipedia understanding of the situation.
i.e possessing less facts or information than can be found in the Wikipedia.
  #79   Report Post  
Posted to sci.electronics.repair
external usenet poster
 
Posts: 6,772
Default 120hz versus 240hz


"Mark Zenier" wrote in message
...
In article ,
Arfa Daily wrote:

"Mark Zenier" wrote in message
...
[snip]

The situation via direct broadcast satellite is much clearer. Here, they
have so much bandwidth available that they are able to carry many HD
channels, so this is where people here get their HD content from.
Unfortunately, the satellite operator charges us another tenner ($15) a
month for the privilege of receiving these transmissions ... :-(


Sounds cheap to me, I gather you can spend $90 a month (not including
pay per view) to get the full load. Minimum, $30-$40 a month.

Mark Zenier
Googleproofaddress(account:mzenier provider:eskimo domain:com)


It's not too bad, price-wise. The (basic) digital terrestrial service here
is all free - if you ignore the cost of the required broadcast receiving
licence. That's all of the 'national' channels that were available as part
of the analogue service, plus other offerings from the same broadcasters,
such as BBC3 and BBC4 and ITV2 and so on. On top of all of these channels,
there is a cartload of other channels that are a real mixed bag of
'specialist' and 'kinda watchable' right through to utter crap that is a
total waste of bandwidth and transmission resources. There are also, as you
say, many radio stations. And finally, some quite clever 'interactive'
services. It would seem, in the not too distant future, that there will be
some HD content on there, shoehorned in amongst what's already there -
MPEG-4 over DVB-T2, I seem to recall reading somewhere. All of this is
broadcast in the UHF band, CH21 to 68 - about 470 MHz to 860 MHz -
alongside the existing
analogue transmissions, which are being totally phased out over the next two
years. I believe that after the last analogue has gone, the UHF TV band is
being shrunk, to allow the government to sell more of it off, and some of
the high-band multiplexes will shift down-band.
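
(The channel-to-frequency sums on that raster are simple, for anyone who
wants them - a quick sketch of the standard UK 8 MHz UHF plan:)

# UK UHF TV channels are 8 MHz wide; the centre frequency in MHz is
# 306 + 8 * channel for channels 21 to 68.

def uk_uhf_centre_mhz(channel):
    if not 21 <= channel <= 68:
        raise ValueError("UK UHF TV channels run from 21 to 68")
    return 306 + 8 * channel

for ch in (21, 40, 68):
    print(f"channel {ch}: {uk_uhf_centre_mhz(ch)} MHz")  # 474, 626, 850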

Direct broadcast satellite TV is very well established here, and is mostly
operated by a private company called Sky. It's one of Murdoch's News
Corporation companies. A lot of basic content can be viewed on a Sky
receiver for free, although there is a one-off small charge for a
free-to-air viewing card. Outside of this content, you move into the realms
of pay TV. Sky have cleverly designed a very complex system of putting
together your own viewing package. They have grouped channels by genre, and
then you pick those genres to assemble your package. I take all of the
channels except premium movies, and premium sports. It costs me 22UKP (about
$33) a month, and provides me with a very wide range of content.

When I recently bought my HD TV set, I then added an additional bundle to my
package. This was all of the available HD channels, ex the premium movie and
sports channels. For this, I pay an additional 10 UKP, so my total monthly
cost is now 32 UKP (about $48). Whilst this is not exactly 'cheap', I also
don't think that it represents too bad value for the entertainment value it
provides, although there are many who disagree with that point of view, and
think that Sky as a company, belongs to the devil himself ...

Of course, there are hundreds of free radio stations on the satellite, as
well.

Signals from the same suite of satellites can also be received via a system
called FreeSat. This allows you to receive pretty much the same free
channels that you would get on terrestrial FreeView or free-to-air Sky, as
well as a very limited amount of HD content, all for free.

Then, of course, there's always cable ...

It's all getting a bit complicated here now, and I don't see much scope for
it all settling down, until the analogue services have finally all stopped,
which will then force people to make service provider choices, more than
they are doing at the moment.

Arfa

