Posted to alt.home.repair
From: hallerb@aol.com
Subject: Anyone know if the HD tv signal will be stronger?

On Oct 24, 11:18 am, wrote:
On Oct 23, 1:34 pm, HerHusband wrote:

The converter box is a Digital Stream bought at Radio Shack. The
antenna is about 30' up, no tower, just on the eave of a 2-story house.
I estimate my line of sight at 50 miles, with nearby hills to
interfere. My amp is a Winegard, not sure of the model; it mounts on the
pole with a power supply in the house. I just bought the biggest dB-gain
amp I could find.


Digital is much more sensitive to background noise than analog was. We
tolerated noise and ghosts in an analog picture, but you'll get
dropouts or no picture at all if there's noise on a digital signal.


That's not true. One of the biggest advantages of any digital
transmission is that it can tolerate much higher levels of noise and
still deliver a perfect signal. On the receiving end the system only
needs to determine whether a "1" or a "0" has been received; the absolute
value of the signal level doesn't affect the data. Hence, you can
tolerate a lot of noise, still be able to determine which of the two
it is, and recover the signal.

It is true that if the signal is poor enough, instead of seeing the
crappy picture that you might have had with analog, you'll get no
picture. But for the vast majority of people receiving via OTA,
that is an acceptable tradeoff.
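You can see this "cliff effect" in a quick Python sketch. This is just an illustration, not how ATSC actually modulates anything (real broadcasts use 8VSB with error correction, not bare +1/-1 symbols): send bits as +1/-1, add random noise, and have the receiver decide each bit by thresholding at zero. Below a certain noise level nearly every bit comes back perfect; past it, errors pile up fast and the picture is gone.

```python
import random

def bit_error_rate(noise_sigma, n_bits=20000, seed=42):
    """Send n_bits as +1/-1 symbols through a noisy channel and
    recover each bit by thresholding at zero. Returns the fraction
    of bits decoded incorrectly. (Toy model, not real 8VSB.)"""
    rng = random.Random(seed)
    errors = 0
    for _ in range(n_bits):
        bit = rng.choice((1, -1))                    # transmitted symbol
        received = bit + rng.gauss(0, noise_sigma)   # noise added in transit
        decoded = 1 if received > 0 else -1          # receiver only asks: 1 or 0?
        if decoded != bit:
            errors += 1
    return errors / n_bits

# Moderate noise: error rate is a tiny fraction of a percent --
# the noise is there, but every bit is still recovered. Perfect picture.
print(bit_error_rate(0.3))
# Heavy noise: error rate jumps past 10% -- no amount of "snowy but
# watchable" here, the signal is simply unrecoverable. That's the cliff.
print(bit_error_rate(1.0))
```

The point of the sketch is the shape of the curve: the error rate stays near zero over a wide range of noise, then climbs steeply, which is exactly why digital gives you either a perfect picture or none at all.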





Unless you are distributing your signal to multiple TVs or VCRs, an
amplifier is likely to introduce more noise and distortion than the weak
signal it's trying to improve (noise is boosted along with the signal).


Also, most digital stations are currently in the UHF band, so you'll need
to make sure your amp is rated for that. Many older amps were only rated
for the VHF band.


In my case, I found my signal-to-noise ratio (SNR) was actually
LOWER with a new 15 dB UHF-rated amplifier than just running the cable
directly from the antenna.
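That result is exactly what the standard cascade noise-figure formula (Friis) predicts when the amp itself is noisy. Here's a sketch; the specific numbers (3 dB cable loss, 6 dB tuner noise figure, 10 dB noise figure for a cheap amp) are made-up illustrative values, not measurements of any particular product:

```python
import math

def db_to_ratio(db):
    """Convert a dB value to a linear power ratio."""
    return 10 ** (db / 10)

def cascade_nf_db(stages):
    """Friis formula for cascaded stages. `stages` is a list of
    (gain_dB, noise_figure_dB) tuples ordered from the antenna toward
    the TV. Returns the overall system noise figure in dB; lower is
    better (less SNR lost between antenna and tuner)."""
    excess = 0.0     # sum of (F_i - 1) / (gain of everything before stage i)
    cum_gain = 1.0
    for gain_db, nf_db in stages:
        f = db_to_ratio(nf_db)
        excess += (f - 1) / cum_gain
        cum_gain *= db_to_ratio(gain_db)
    return 10 * math.log10(1 + excess)

cable = (-3, 3)   # long RG-6 run: ~3 dB loss, and for a passive line loss = noise figure
tuner = (0, 6)    # assumed noise figure of the converter box's tuner

print(cascade_nf_db([cable, tuner]))              # no amp: about 9 dB
print(cascade_nf_db([(15, 10), cable, tuner]))    # noisy 15 dB amp: about 10 dB -- WORSE
print(cascade_nf_db([(15, 3), cable, tuner]))     # quiet preamp: about 3.5 dB -- much better
```

The takeaway matches Anthony's experience: a mast-mounted amp only helps if its own noise figure is lower than what the cable loss plus tuner would have cost you. A cheap high-gain, high-noise amp can push the system noise figure above the no-amp case, so the measured SNR drops even though the signal got "bigger."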


Our closest is about 60 mi. The antenna is on a tower but it also sits
at eave height of the two-story farmhouse.


Sometimes moving left or right just a bit can make a big difference in
signal strength too. It all depends on topography and obstructions
between you and the source.


My SNR varied about 5-10 points depending on where I walked with my
antenna, and mounting the antenna outside raised my SNR about 15 points
compared to mounting in the attic. All pointed in the same direction, of
course.


Somebody else mentioned the twin-lead--it's what Dad pulled when he
redid the house in the mid-70s, so that may be a weak point, but
previously a coax run directly didn't make much difference, although I've
not tried the experiment with the digital box.


Twin-lead is out of the question; you should be using at least RG-6
coax cable for the best shielding. I also found the cheap "crimp-on"
style of connectors let in a lot more noise than professional
compression-type connectors.


I spent about 3-4 months trying to get the best signal levels, and I'm
only 20 miles from the broadcast antennas (all in the same basic
direction). I tried a variety of antennas and locations before settling
on a DB2-style antenna mounted outside on the eave of our single-story
house. For now everything is working well since my digital stations are
all on the UHF band. But come February, a few of my stations are moving
back to the VHF band, so I may need to get a different antenna to pick up
the lower frequencies.


Anthony

Around North Carolina all the analog signals were turned off some
months ago as a test. There was about a 20% overall decrease in viewers who
couldn't get signals anymore.

The turn-off will be delayed by years; right now we wouldn't be told, so
manufacturers can use up their inventory of digital decoders...