Posted to sci.electronics.repair
From: Dave Platt
Subject: size voltage divider I need for transceiver to spectrum analyzer?

In article , Jason wrote:
> I just picked up a small spectrum analyzer to check out some
> transceivers I have. They are 10 meter ham units and put out anywhere
> from 4-12 watts @ 50 ohms. The new spectrum analyzer can handle a
> maximum of 10 dBm. What size voltage divider would I need to reduce the
> transceiver wattage to less than the 10 dBm for the analyzer? Thanks.


1 watt is 30 dBm. 4 watts would be 36 dBm. 12 watts would be about
41 dBm.
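The watts-to-dBm conversion is just 10*log10(P / 1 mW). A quick sanity
check of the figures above, sketched in Python:

```python
import math

def watts_to_dbm(p_watts):
    """Convert a power in watts to dBm (dB relative to 1 milliwatt)."""
    return 10.0 * math.log10(p_watts / 0.001)

print(round(watts_to_dbm(1), 1))    # 1 W  -> 30.0 dBm
print(round(watts_to_dbm(4), 1))    # 4 W  -> 36.0 dBm
print(round(watts_to_dbm(12), 1))   # 12 W -> 40.8 dBm (about 41)
```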

In order to avoid driving the spectrum analyzer into overload, I'd
suggest keeping the input signal at about 1 milliwatt (0 dBm). This
will also give you some safety margin to avoid exceeding the 10 dBm
limit.

40 dB of attenuation would take you down from 12 watts to 1.2
milliwatts, so that's a good figure to shoot for.
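An attenuator's power ratio is 10^(dB/10), so the arithmetic above is a
single division. A minimal check, using the worst-case 12-watt figure
from this thread:

```python
# Power out of an N dB attenuator: P_out = P_in / 10^(N/10)
p_in_w = 12.0        # worst-case transmitter output, watts
atten_db = 40.0      # attenuation, dB
p_out_mw = (p_in_w * 1000.0) / 10 ** (atten_db / 10.0)
print(p_out_mw)      # 1.2 (milliwatts)
```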

So, you'd want either one 40 dB attenuator, or two 20 dB attenuators
in series. Make sure that the attenuator is rated to handle the full
output of the transmitter, or you'll cook it.

A single 30 dB attenuator would keep you under your 10 dBm worst-case
limit if you feed it with 4 watts, but you'd be a hair over the limit
at 12 watts. So, I'd stick with 40 dB of attenuation.
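The same comparison works entirely in dB: the output level in dBm is
just the input level in dBm minus the attenuation. A sketch of the
30 dB vs. 40 dB cases discussed above (the 10 dBm limit and the 4 W /
12 W figures come from the original question):

```python
import math

def output_dbm(p_in_watts, atten_db):
    """Level at the attenuator output, in dBm."""
    return 10.0 * math.log10(p_in_watts / 0.001) - atten_db

limit_dbm = 10.0
print(round(output_dbm(4, 30), 1))   # 30 dB pad, 4 W in:  6.0 dBm, under the limit
print(round(output_dbm(12, 30), 1))  # 30 dB pad, 12 W in: 10.8 dBm, a hair over
print(round(output_dbm(12, 40), 1))  # 40 dB pad, 12 W in: 0.8 dBm, comfortable margin
```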

Make sure these are 50-ohm attenuators, of course, since that's what
the transmitters will expect (and want) to see.