Posted to sci.electronics.repair
From: mike
Subject: OTA tuner sensitivity

klem kedidelhopper wrote:
I made an interesting observation today and thought that I would relate it to the group. I use a set top converter in the shop to receive OTA signals from my antenna. I just put up the antenna this past Fall, so I don't have much history with this particular antenna. However, I can say without hesitation that for the past 33 years at this same location, as soon as the leaves start to come out in the Spring, our UHF reception would begin to degrade. We are in a deep fringe area, so UHF reception has always been borderline on many channels. Obviously digital has made that situation even more difficult for us. With the new installation, though, an old Channel Master parabolic that I resurrected with a Winegard GA8780 preamp, things have improved quite a bit over the old Yagi we had been using.

So today I just finished up a three year old Samsung and auto programmed it off my antenna. I noted many more channels than my bench converter had previously picked up and stored. So then I went to the converter and manually entered the new channels. To my surprise I did get most, but not all, of the new ones. So perhaps propagation has changed somewhat; I don't know. In any case there is definitely a difference in the front ends of these two receivers.

There was one channel in particular that looked fine on the Samsung but would intermittently, very lightly, "pixelate" when processed through my set top converter. I removed the 2-way splitter feeding the bench and the set and fed the bench directly, thereby increasing the level to the bench by 3 dB. The weak channel improved but would still break up on occasion, though less pronounced.
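
For anyone who wants to check my math, the level change works out roughly like this (Python; the preamp gain and coax loss figures are placeholders, not measured values, and a real 2-way splitter costs closer to 3.5 dB per port than 3):

# Rough bench-feed level budget. SPLITTER_LOSS_DB is typical for a
# 2-way splitter (3 dB split plus ~0.5 dB insertion loss); the preamp
# gain and coax loss numbers below are assumptions, not specs.
PREAMP_GAIN_DB = 28.0     # assumed figure for the GA8780
COAX_LOSS_DB = -4.0       # guess for the downlead at UHF
SPLITTER_LOSS_DB = -3.5   # per output port

def bench_level(antenna_dbmv, use_splitter):
    """Level at the bench converter, in dBmV."""
    level = antenna_dbmv + PREAMP_GAIN_DB + COAX_LOSS_DB
    if use_splitter:
        level += SPLITTER_LOSS_DB
    return level

signal = -10.0  # made-up fringe-area level at the antenna, dBmV
print(bench_level(signal, True))   # through the splitter
print(bench_level(signal, False))  # direct feed: ~3.5 dB hotter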

What frustrates me is that except for the OSD "bar graph" signal indicators on most digital sets today, I really can't tell what the actual signal strength of a particular station is anymore. It used to be so easy: I would connect up my old Sadelco analog meter to a cable or antenna coax and simply read the carrier level. Now that we have two different types of TV broadcasting systems to deal with, all my equipment is obsolete.
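
For the record, the units those old meters read still convert exactly; nothing about that part of the math is obsolete. In Python, for a 75-ohm system (these are exact relationships, not guesses):

import math

# Conversions for levels read in dBmV, as on a Sadelco-style meter:
# dBuV = dBmV + 60, and 0 dBmV = -48.75 dBm into 75 ohms.
R_OHMS = 75.0

def dbmv_to_dbuv(dbmv):
    return dbmv + 60.0

def dbmv_to_dbm(dbmv):
    v_rms = 1e-3 * 10 ** (dbmv / 20.0)        # volts
    p_watts = v_rms ** 2 / R_OHMS
    return 10.0 * math.log10(p_watts / 1e-3)  # referenced to 1 mW

print(dbmv_to_dbm(0.0))     # -48.75 dBm
print(dbmv_to_dbuv(-10.0))  # 50 dBuV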
So I have two questions to pose to the group:
1. Has anyone noted consistent instances where one tuner is noticeably "hotter" than another? That is to say, one set that will perform with a weak signal where the next, fed the same signal, might break up? This slight difference in front ends, which might never be detected on cable, could affect antenna users in fringe areas like myself. (In fact, I don't know if I ever would have noticed this if I hadn't been running two different sets in the shop at the same time.)
2. If it's economically feasible, I would like to be able to read signal level directly on a test instrument. This would be useful on antenna jobs as well as cable distribution systems. Is there such an affordable instrument available for this purpose? Thanks, Lenny


I didn't figger out from your post where you're located. Makes a
difference.
Here in the good ole USA, OTA HDTV sucks. Encoding was picked by
lobbyists, not engineers.

I'm only 10 miles from the transmitters, but there's a hill in the middle
and a two-story metal pole building in my neighbor's yard. It's
multipath city here. Ability to decode a channel also depends
on the season and how many leaves the intermediate trees have.

I have a combo VHF/UHF antenna on a rotor, but if you wanna record more
than one thing at a time, you can't be turning the antenna.

The Zenith DTT901s work pretty well, unless it's raining hard, but
the output isn't HD.

I have several older cable boxes that work on some channels.
It's particularly annoying on some boxes that there's no way to
add channels. If you need a different antenna direction to detect
a channel, you can't find all the channels with the auto scan.

The ATSC tuner card in the PC works the worst.
And it's not signal strength.
I put a variable attenuator in the antenna line and tweak it per channel
for fewest dropouts.
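
The per-channel tweak is really just a one-dimensional search. Sketch of the idea in Python; count_dropouts() is a hypothetical stand-in for however you score a setting (watch for pixelation, or count transport-stream errors over a fixed capture window):

def count_dropouts(atten_db):
    # HYPOTHETICAL: set the attenuator to atten_db, watch or capture
    # the channel for a fixed window, return the number of breakups.
    # Replace with your own measurement; nothing here touches hardware.
    raise NotImplementedError

def best_attenuation(settings_db):
    """Return the setting with the fewest observed dropouts."""
    scores = {a: count_dropouts(a) for a in settings_db}
    return min(scores, key=scores.get)

# e.g. best_attenuation(range(0, 21, 2))  # try 0-20 dB in 2 dB steps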

Before the big switch, when most of the ATSC channels were on UHF, signals
were much better. After the switch, when some of them went back to VHF,
it got much worse.

I put the signal on a spectrum analyzer. There's no correlation between
signal strength and stability of the picture. The "Bart's head" display
is supposed to be flat on top. There is a correlation between the flatness
of the top and the stability of the picture.
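
If your analyzer can export amplitude-vs-frequency points, you can turn that flatness eyeball test into a number. A crude sketch in Python; the data below is synthetic:

import numpy as np

# Peak-to-peak ripple across the flat top of one 6 MHz channel.
# Skipping the band edges also keeps the pilot (0.31 MHz above the
# lower edge on 8VSB) out of the measurement. Bigger ripple means
# more multipath and, in my experience, a less stable picture.
def flatness_db(freqs_mhz, amps_db, f_lo, f_hi, edge_skip_mhz=0.5):
    mask = (freqs_mhz > f_lo + edge_skip_mhz) & (freqs_mhz < f_hi - edge_skip_mhz)
    top = amps_db[mask]
    return float(top.max() - top.min())

# Synthetic data: RF channel 14 (470-476 MHz) with a 2 dB dip mid-band
f = np.linspace(470.0, 476.0, 600)
a = -30.0 - 2.0 * np.exp(-((f - 473.0) ** 2))
print(flatness_db(f, a, 470.0, 476.0))  # about 2 dB of ripple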

I have a friend in the industry. He won't disclose details, but he claims
that the math to de-multipath the signal exists. But it takes more
computing horsepower than you can get into a cheapo set top box with current
technology.
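
My guess is the math he means is adaptive channel equalization. A toy LMS equalizer in Python shows the flavor, and why it's compute-hungry; real ATSC receivers use much longer, fancier structures than this:

import numpy as np

# Toy LMS equalizer. A multipath channel smears symbols together; the
# equalizer learns an inverse FIR filter from known training symbols.
rng = np.random.default_rng(0)
syms = rng.choice([-7.0, -5, -3, -1, 1, 3, 5, 7], size=5000)  # 8-level, like 8VSB

channel = np.array([1.0, 0.0, 0.4, 0.0, -0.2])  # direct path plus two echoes
rx = np.convolve(syms, channel)[: len(syms)]

n_taps, mu, delay = 21, 1e-4, 10
w = np.zeros(n_taps)
for i in range(n_taps, len(rx)):
    x = rx[i - n_taps + 1 : i + 1][::-1]  # newest sample first
    y = w @ x                             # equalizer output
    e = syms[i - delay] - y               # error vs. training symbol
    w += mu * e * x                       # LMS tap update

print(abs(e))  # residual error, small once converged

# Now do that tap update ~10.76 million times a second, with hundreds
# of taps to cover long echoes, and the arithmetic adds up fast.
# That's the horsepower problem.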

I'm unwilling to spend much money on this. The whole idea of free TV
is that it be FREE. I'd much rather have NTSC quality than "no signal"
scrolling across the screen of what I thought I'd taped.