Posted to sci.electronics.repair
klem kedidelhopper
Power supply question

I'm in the process of building a dual-purpose power supply. The supply
will have two outputs. One will be 24 VAC rated at 4.0 A; the load on
this output will probably never exceed 2.0 A. The second output will
consist of a bridge rectifier off the 24 V tap feeding a suitable
filter cap of, say, 1000 µF. This filtered DC will then be connected
to a small surplus 12 V regulator board and heat-sink assembly which,
by the looks of it, can handle 5 A or better. I plan on mounting this
regulator in the cabinet and providing a terminal strip for the 12 V
output. The DC load will probably never exceed 0.5 to 0.75 A.
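As a quick sanity check on the 1000 µF cap, here is a back-of-the-envelope
ripple estimate (assuming 60 Hz mains and the 0.75 A worst-case DC load
mentioned above; the standard full-wave approximation ΔV ≈ I / (2·f·C)):

```python
F_LINE = 60.0       # Hz, assumed North American mains
C_FILTER = 1000e-6  # F, the proposed filter cap
I_LOAD = 0.75       # A, upper end of the expected DC load

# A full bridge recharges the cap at twice the line frequency;
# between peaks the cap discharges roughly linearly into the load.
v_ripple = I_LOAD / (2 * F_LINE * C_FILTER)
print(f"Peak-to-peak ripple: {v_ripple:.2f} V")  # about 6.25 V
```

Roughly 6 V peak-to-peak of ripple into the regulator at 0.75 A, which is
fine as long as the regulator's dropout headroom covers the ripple troughs.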

Now here is the dilemma. I haven't tried to put all this together yet,
but I know that once I rectify and filter the 24 VAC I'll probably end
up with something like 30 VDC out. The regulator uses a uA723, a TO-3
pass transistor, and a smaller TO-case transistor, as well as a whole
bunch of discrete components. I have no specs on this regulator, but it
is well built and appears to be commercial grade, and although it might
handle it fine, I'd feel a lot better hitting it with something like
18 VDC instead of 30. I would hate to blow it up trying to see if it
would work on the higher input.
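The ~30 VDC estimate checks out. The filter cap charges to the peak of the
secondary minus two diode drops (two bridge diodes conduct at a time), so a
quick sketch, assuming ~0.7 V per diode at low current, gives:

```python
import math

V_RMS = 24.0    # transformer secondary, VAC
V_DIODE = 0.7   # assumed forward drop per bridge diode

# Peak of the sine minus two conducting diode drops.
v_peak = V_RMS * math.sqrt(2) - 2 * V_DIODE
print(f"Filtered DC (light load): {v_peak:.1f} V")  # about 32.5 V
```

Under load it sags from there with ripple, which is why "something like
30 VDC" is about right.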

I could add some series resistance either before or after the bridge,
but the voltage drop across that resistance would vary depending on the
total load, and I'm not sure how well such a scheme would work. So I
was thinking about employing a voltage divider at the output of the
filter. The resistance ratio would be easy to figure out; however, I'm
just not sure how to determine the optimum resistor values. Does this
seem like a viable plan, or perhaps someone may have other thoughts on
how to address this? If someone could please help me with this I would
be very grateful. Thanks, Lenny
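For what it's worth, the divider has the same load-dependence problem as the
series resistor: its Thevenin (source) resistance is R1 in parallel with R2,
and the load current drops voltage across that. A quick sketch with
hypothetical values chosen for 18 V unloaded from an assumed 32 V input:

```python
V_IN = 32.0          # filtered DC, V (assumed)
R1, R2 = 14.0, 18.0  # hypothetical divider chosen for ~18 V unloaded

v_unloaded = V_IN * R2 / (R1 + R2)  # 18.0 V with no load attached
r_thevenin = R1 * R2 / (R1 + R2)    # source resistance seen by the load

# Output sags by I * R_thevenin as the load current rises.
for i_load in (0.0, 0.5, 0.75):
    v_out = v_unloaded - i_load * r_thevenin
    print(f"{i_load:.2f} A -> {v_out:.2f} V")

# The divider also burns power continuously, even with no load:
p_idle = V_IN**2 / (R1 + R2)  # watts dissipated in R1 + R2 at idle
```

With these values the output falls from 18 V to about 12 V at 0.75 A, and
the divider idles at tens of watts, so any workable divider ends up being a
compromise between sag and wasted heat.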