Posted to rec.crafts.metalworking
Jon Elson
50 Hz vs 60 Hz and a 120 Hz question



Randy wrote:
OK, we rule out 400 Hz as too high due to the 1/4-wavelength problem.
Would it be worthwhile to go to 90 or 120 Hz?
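
For a rough sense of the scale of that quarter-wave problem, here is a
back-of-the-envelope sketch in Python (it assumes the wave travels at about
the speed of light; propagation on real lines is somewhat slower):

C = 3.0e8  # rough propagation speed, m/s

for f_hz in (60.0, 120.0, 400.0):
    quarter_wave_km = C / f_hz / 4.0 / 1000.0
    print(f"{f_hz:5.0f} Hz: quarter wavelength is roughly {quarter_wave_km:6.0f} km")

At 60 Hz the quarter wavelength comes out around 1250 km, but at 400 Hz it is
under 200 km, comparable to a long transmission run, so the line starts to
behave like a transmission-line resonator instead of a simple conductor.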

One could argue this forever, as the losses are almost all engineering
tradeoffs made when the various components are designed.

But it is all moot. I have no way to calculate the cost, but if the
entire generating, transmission, and distribution system had to be
replaced, plus most appliances more complicated than a toaster or a light
bulb, the cost would certainly be in the trillions of dollars; it couldn't
possibly be any less than that. Maybe some powerhouse alternators
could be rewound for four poles to get the required frequency, and the
transmission lines could probably stay, but that's about it. Now, for
the transformers, Bmax would go down at the higher frequency, assuming the
same voltage, and that would help the hysteresis loss, but I think some
of the other losses would go up.
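
To make that concrete, here is a rough sketch of how the standard transformer
EMF equation and Steinmetz-style loss terms scale with frequency at a fixed
voltage; the turns, core area, and loss coefficients below are made-up
placeholders chosen only to show the scaling, not values for any real
transformer.

def b_max(v_rms, f_hz, turns, core_area_m2):
    # Peak flux density from the EMF equation: B = V / (4.44 * f * N * A)
    return v_rms / (4.44 * f_hz * turns * core_area_m2)

def core_losses(f_hz, b_peak, k_h=1.0, k_e=1.0, steinmetz_n=2.0):
    # Hysteresis ~ k_h * f * B^n, eddy ~ k_e * f^2 * B^2 (arbitrary units)
    return k_h * f_hz * b_peak ** steinmetz_n, k_e * f_hz ** 2 * b_peak ** 2

V, N, A = 7200.0, 1000, 0.02   # placeholder winding voltage, turns, core area

for f in (60.0, 120.0):
    b = b_max(V, f, N, A)
    hyst, eddy = core_losses(f, b)
    print(f"{f:5.0f} Hz: Bmax = {b:.2f} T, hysteresis ~ {hyst:.0f}, eddy ~ {eddy:.0f}")

Doubling the frequency at the same voltage halves Bmax, which roughly halves
the hysteresis term and leaves the eddy term about the same, while copper,
stray, and skin-effect losses tend to move the other way.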

But this stuff gets REALLY complicated. For instance, the leakage
inductance of the transformers is carefully engineered to control fault
currents. At a higher frequency, the fault currents would go down, but that
means the voltage regulation would get worse, too.
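
As an illustration of that tradeoff (with a placeholder voltage, leakage
inductance, and load current, not data for any real transformer): keeping the
same leakage inductance while doubling the frequency doubles the leakage
reactance, which halves the bolted-fault current but also doubles the reactive
voltage drop at rated load.

import math

V_SEC  = 480.0     # secondary voltage, placeholder
L_LEAK = 2.0e-4    # leakage inductance in henries, placeholder
I_LOAD = 400.0     # rated load current in amps, placeholder

for f in (60.0, 120.0):
    x = 2.0 * math.pi * f * L_LEAK          # leakage reactance, ohms
    print(f"{f:5.0f} Hz: X = {x:.3f} ohm, fault ~ {V_SEC / x:,.0f} A, "
          f"drop at rated load ~ {I_LOAD * x:.0f} V")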

Standard around here, anyway, is uninsulated high-tension lines; would
insulation help?


Absolutely not. You could try shielding them, but that would be
insanely expensive. Insulation does not prevent the electric and
magnetic fields from radiating, so it would have no effect.



I read a few articles in my welding mags where a shop bought all new
inverter-type power sources and paid them off in less than a year in
power savings, so there has to be less power drawn by the supply,
correct?

It could just be that they were getting blitzed by the power company for
having a terrible power factor with the old welders. The way old welding
power supplies were made, they really were quite efficient, but the
transformers were intentionally built with huge leakage inductance;
that's how they limited the welding current. Thus, the native power
factor was insanely bad. So they had to add power-factor-correction
caps, which drew a huge leading current when not welding and were a
compromise that only corrected the PF at one welding current setting.
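
Here is a toy model of that, just to show the shape of the problem; the line
voltage, leakage reactance, magnetizing current, and arc resistances are all
made-up placeholders, not the specs of any real welder:

import math

V_LINE = 230.0    # line voltage, placeholder
X_LEAK = 1.5      # leakage reactance referred to the primary, ohms, placeholder
I_MAG  = 3.0      # magnetizing current drawn even at idle, amps, placeholder

def operating_point(r_arc, c_var=0.0):
    # Arc modeled as a resistance in series with the leakage reactance;
    # an optional fixed capacitor supplies c_var of leading vars.
    i = V_LINE / math.hypot(r_arc, X_LEAK)
    p = i ** 2 * r_arc                            # real power into the arc
    q = i ** 2 * X_LEAK + V_LINE * I_MAG - c_var  # net vars (lagging positive)
    return p, q, p / math.hypot(p, q)

p0, q0, pf0 = operating_point(r_arc=1.0)   # pick one welding setting...
C_VAR = q0                                 # ...and size the cap to cancel its vars
print(f"uncorrected PF at full current ~ {pf0:.2f}")

for label, r in (("full current", 1.0), ("reduced current", 3.0), ("idle", 1e9)):
    p, q, pf = operating_point(r, C_VAR)
    kind = "leading" if q < 0 else "lagging"
    print(f"{label:15s}: P ~ {p/1000:5.1f} kW, Q ~ {q/1000:6.1f} kvar {kind}, PF ~ {pf:.2f}")

With the cap sized for one setting, the PF is decent only there: turn the
current down and the bank overcorrects, and at idle it just dumps a big block
of leading kvar onto the line, which is exactly the compromise described above.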

Depending on the type of meter they had, they could get dinged REALLY
hard for the bad PF.
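
How hard the ding is depends entirely on the tariff, but as a made-up
illustration of a meter that bills demand on kVA rather than kW (the rate and
load below are placeholders, not any real utility's numbers):

DEMAND_RATE = 12.0    # dollars per kVA of peak demand per month, placeholder
REAL_KW     = 16.0    # peak real power actually used, placeholder

for pf in (0.54, 0.95):
    kva = REAL_KW / pf
    print(f"PF {pf:.2f}: billed demand ~ {kva:4.1f} kVA -> ~${kva * DEMAND_RATE:.0f}/month")

Same work done, but roughly 75% more demand charge at the bad power factor.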

Jon