From: Dan_Musicant
Newsgroups: sci.electronics.repair
Subject: Can a receiver's sensitivity diminish over time?

I have a mini-stereo (5-year-old Sony MHCMG110), and stations it used
to receive OK are now noisy (static, the typical noise of a weak
station). I almost always listen to low-power stations (college radio),
so their signals are weak to begin with. However, stations that used to
come in OK now sound crummy. It varies from day to day, but overall it
seems worse, and I wonder if I can attribute this to some kind of
deterioration in the system. The last week or so it has seemed worse
than ever. Of course, it could just be that the 2-3 stations I listen
to have changed their transmission patterns, or their transmitters are
having problems.

The antenna I use with this system is a dipole mounted on a swiveling
rabbit-ears base, and I've had that antenna since the mid-1970s. I have
several dipole antennas, but I always got the best reception with this
particular one, so I use it on the rabbit ears, which is just a
homemade affair made from wood. The two elements lie in one straight
horizontal line, and I can rotate that line like a compass needle. I
tested the leads yesterday with an ohmmeter and there's continuity
between them. I cleaned the leads and reinserted them in the system's
antenna input, but there's no evident change.

The dipole antenna is basically like this, and can rotate around that
center point for best reception:
______________o______________
              |
              |
           (stand)
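For what it's worth, a half-wave dipole cut for the middle of the FM
band works out to around 1.5 m tip to tip, so an element length far off
that could by itself explain marginal reception. A quick back-of-envelope
check (Python; the 98 MHz mid-band frequency and the 95% end-effect
factor are rough rules of thumb, not anything measured on my antenna):

  # Half-wave dipole sizing for the FM broadcast band: free-space
  # wavelength with a ~95% end-effect shortening factor (rule of thumb).
  C = 299_792_458                # speed of light, m/s
  f_hz = 98e6                    # assumed mid-band FM frequency
  total_m = C / f_hz / 2 * 0.95  # tip-to-tip dipole length
  print(f"Half-wave dipole at 98 MHz: {total_m:.2f} m "
        f"({total_m / 2:.2f} m per element)")

That prints about 1.45 m total, or roughly 0.73 m per element.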


I've been thinking of adding another split on one or both of my two
rooftop TV antennas and running feeds to the mini-stereo. If I run two
coaxial feeds, I'd use a switch. The two antennas point in different
directions, roughly 90 degrees apart, so I could pick up more stations
that way. I know I'll need a balun if I feed the antenna input from
coax (I have some baluns).
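One thing I'd want to estimate before doing that: every 2-way split
costs signal. A rough budget (Python; the 3.5 dB per splitter and
0.5 dB for the balun are typical catalog figures, not measured values):

  # Rough signal budget for feeding the stereo off a split TV antenna.
  # Assumes ~3.5 dB insertion loss per 2-way splitter and ~0.5 dB in
  # the balun -- typical spec-sheet numbers; actual parts vary.
  splits = 2                    # existing TV split plus the new one
  loss_db = splits * 3.5 + 0.5  # total splitter loss + balun loss
  print(f"Signal arrives roughly {loss_db:.1f} dB down at the input")

So with two splits in line, the stereo would see the rooftop signal
about 7.5 dB down, which matters for already-weak college stations.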

Any ideas? Thanks!
Email: d plus musicant at pacbell dot net