DeepDiver
 
Router Speed Control - How does this work?

"MP Toolman" wrote in message
...

Shunt wound or permanent magnet DC motors are inherently constant torque
devices because the magnetic field is constant at or below base speed.
Any way to get variable armature voltage will produce variable speed at
constant torque. snip


Thanks for all the good info. But it is my understanding that there is a
difference in the output torque of DC motors depending on the means by which
the speed is controlled. In fact, isn't that why expensive PWM drives are
preferred over other supply circuits?
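
(For what it's worth, my mental model of a permanent-magnet DC motor is
T = kt * I and back-EMF = ke * w, which seems to be where the "constant
torque" claim comes from. A quick Python sketch with made-up constants --
not any real router motor:

    # Idealized PM DC motor. All constants are hypothetical, for illustration.
    kt = 0.05    # torque constant, N*m per amp (assumed)
    ke = 0.05    # back-EMF constant, V per rad/s (numerically = kt in SI units)
    R  = 1.0     # armature resistance, ohms (assumed)

    def steady_state(v_armature, load_torque):
        """V = I*R + ke*w and T = kt*I, solved for steady state."""
        i = load_torque / kt              # current needed to carry the load
        w = (v_armature - i * R) / ke     # whatever speed the voltage allows
        return i, w

    # Same load torque at full and at half armature voltage:
    print(steady_state(24.0, 0.1))   # -> (2.0 A, 440 rad/s)
    print(steady_state(12.0, 0.1))   # -> (2.0 A, 200 rad/s)

Same current, hence same torque capability, at roughly half the speed --
at least with an ideal supply, which is exactly what I'm asking about.)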

Below are some quotes that I am basing some of my assumptions upon.
Additional comments & clarifications would be appreciated!

Posted by Harvey White (msg #53845)

The SCR is capable of being turned on with a gate signal, but unlike a
FET, the SCR latches on until (basically) power is removed. The supply to
the motor and SCR is raw pulsating DC, coming from a bridge rectifier. A
gate signal is generated from a ramp synchronized to the incoming DC
waveform (this is one way of doing it). With a ramp rising from, say, 0 to
10 volts, repeating every 1/120 of a second (for 60 Hz power), you compare
the ramp voltage against a pot voltage (speed control). When the ramp
voltage exceeds the pot voltage, the comparator flips on, and this signal
is used to turn on the SCR. The circuit resets when the rectified DC
voltage falls below the SCR's holding current threshold.
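
[To make the ramp-and-comparator idea concrete, here's a little Python
sketch of my own -- the 0-10 V ramp and the pot values are just
illustrative, and a real control might invert the pot so that more pot
voltage means more speed:]

    F_LINE   = 60.0                  # line frequency, Hz
    HALF_CYC = 1.0 / (2 * F_LINE)    # 1/120 s, one rectified hump

    def trigger_time(pot_volts, ramp_max=10.0):
        """Ramp runs 0..ramp_max over each half cycle; SCR fires when ramp > pot."""
        frac = min(pot_volts / ramp_max, 1.0)  # fraction of the hump before firing
        return frac * HALF_CYC

    for pot in (0.0, 5.0, 9.0):
        t = trigger_time(pot)
        angle = 180.0 * t / HALF_CYC           # firing angle in electrical degrees
        print(f"pot={pot:4.1f} V -> fire at {t*1e3:5.2f} ms ({angle:5.1f} deg)")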

For full speed, you trigger at the beginning of the half cycle, so you get
the benefit of the full AC voltage half cycle, 0 to peak to 0. At the half
setting, you trigger at the middle of the cycle (and get your worst spike
for this circuit). You've "ignored" the first half of the half cycle. At low
RPM, you trigger very late in the cycle, and you're getting the part of the
waveform that's at, say, 25 volts descending to zero going to the motor.
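
[Putting rough numbers on that: for a rectified half-sine that conducts
only from firing angle alpha to the end of the hump, the average voltage
works out to Vpk*(1 + cos(alpha))/pi. Assuming a nominal 160 V peak
(rectified 115 VAC), a quick check:]

    import math

    V_PK = 160.0   # approx. peak of rectified 115 VAC (assumed nominal)

    def avg_volts(alpha_deg):
        """Average of Vpk*sin(theta) over the half cycle, conducting alpha..180."""
        a = math.radians(alpha_deg)
        return V_PK * (1.0 + math.cos(a)) / math.pi

    for alpha in (0, 90, 150):
        print(f"fire at {alpha:3d} deg -> {avg_volts(alpha):6.1f} V average")
    # -> about 102 V, 51 V, and 7 V respectively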

Since motor torque is related to the supply voltage, the torque falls off
as the setting goes from half speed down to zero. It's the same effect as if
you had a variable DC supply. That's why running a DC motor off a variable
DC supply gives you "bad" low-end torque: low motor voltage means low motor
current.

Where a PWM controller is superior is that the motor voltage is always the
full value. Since the supply is switched on and off at a rapid rate, the
current flowing through the motor (when it stabilizes) is always the design
normal value, and the torque in the motor is always (or mostly) the same,
regardless of speed setting. Since the motor is an inductor, and since you
can't change the current through an inductor instantaneously, there will be
a practical limit to the minimum width of the pulse through the motor. Below
this limit, the current through the motor does not have the opportunity to
reach full value, and the motor torque drops. However, the theory says that
with a good PWM supply, the motor torque is higher at low speeds than with
an SCR supply.
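
[The inductance point is the one that bites. Treating the motor as a
series R-L and starting each pulse from zero current (which ignores
freewheeling between pulses, so it understates the real current), a crude
sketch with invented values:]

    import math

    R   = 1.0      # winding resistance, ohms (assumed)
    L   = 2e-3     # winding inductance, henries (assumed)
    V   = 24.0     # supply voltage (assumed)
    TAU = L / R    # electrical time constant: 2 ms here

    def current_at(t_on):
        """i(t) = (V/R) * (1 - exp(-t/tau)) from a standing start."""
        return (V / R) * (1.0 - math.exp(-t_on / TAU))

    for t_on_us in (100, 500, 2000, 10000):
        i = current_at(t_on_us * 1e-6)
        print(f"pulse {t_on_us:6d} us -> {i:5.2f} A of a possible {V/R:.1f} A")

Pulses much shorter than the L/R time constant never get anywhere near
full current, which is the torque droop he's describing.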

However, SCRs are more robust than FETs, so they go poof less....

There used to be devices called GTO SCRs, for Gate Turn-Off SCR (SCR is
Silicon Controlled Rectifier). You could turn them on, and you could turn
them off.

Ironically, you could make one with an SCR and a MOSFET....


Posted by Rick Dickinson (msg #53864)

For those who aren't familiar with PWM, here's the Cliff Notes version:
Pulse Width Modulation uses some sort of fast, low-resistance electronic
switch (like a MOSFET) to turn the DC voltage driving the motor on and off
thousands of times per second. By changing the percentage of the time that
the switch is "on" rather than "off", the average power being applied to
the motor varies proportionally, and the speed varies as well.
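
[In other words, with toy numbers -- say a 24 V supply and ideal
switching:]

    V_SUPPLY = 24.0   # volts (assumed)

    def avg_voltage(duty):
        """duty = fraction of each PWM period the switch is on."""
        return duty * V_SUPPLY

    for duty in (0.25, 0.50, 0.90):
        print(f"{duty:.0%} duty -> {avg_voltage(duty):4.1f} V average to the motor")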

By contrast, an SCR-based motor controller uses devices called "Silicon
Controlled Rectifiers" to turn on and off the power to the motor. On the
"pro" side, SCRs are very robust devices, and hold up well to abuse without
releasing their magic smoke. However, on the "con" side, they are peculiar
devices: they can only be turned on by a triggering signal, not off. How do
you use a switch that you can only turn on, not off, to control the power
going to a motor? By taking advantage of one other feature of SCRs: they
actually do turn off when the voltage across them drops to zero. So, if you
use them with an AC signal (a 60 Hz 110V RMS sine wave from the power
outlet, for instance), you can turn them on at whatever point in the sine
wave you want, and they shut off all by themselves every time the voltage
passes zero (120 times a second, for 60 Hz AC). Rectify the output, so that
all of the "humps" of the sine wave are positive, and you've got a robust
source of pulsing voltage whose delivered power you can control by choosing
the point in each half cycle at which you turn the SCRs on.
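
[The "percentage of total power" has a closed form if you idealize the
load as resistive (a motor isn't, so treat this as a rough guide):
conducting from angle alpha to 180 degrees delivers
(pi - alpha)/pi + sin(2*alpha)/(2*pi) of full power. In Python:]

    import math

    def power_fraction(alpha_deg):
        """Fraction of full half-cycle power into a resistive load (idealized)."""
        a = math.radians(alpha_deg)
        return (math.pi - a) / math.pi + math.sin(2 * a) / (2 * math.pi)

    for alpha in (0, 45, 90, 135):
        print(f"fire at {alpha:3d} deg -> {power_fraction(alpha)*100:5.1f}% of full power")
    # -> 100.0%, 90.9%, 50.0%, 9.1%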

Now, since MOSFET-based PWM motor controllers turn on and off thousands of
times per second, you have very fine granularity across the whole range from
full on to full off. However, SCR-based controllers are dependent on the 60
Hz AC sine wave from your power outlet. If you've ever looked at a sine
wave, you can see that the curve slopes a lot more right near where it
crosses zero than it does near the peaks. This means that, at the low end of
the power curve (near full off), a small adjustment makes a big jump in
speed. Also, no matter what speed is selected for a PWM-controlled motor,
the motor always sees pulses of full voltage, which means that the motor
always gets a full-strength "kick" to get it started moving as soon as the
pulse hits it. An SCR-based controller, by contrast, sends rounded pulses
(shaped like part of a sine wave). At low speed settings, the motor never
sees the full peak voltage, which makes it much more likely to stall.
So, to summarize: PWM with MOSFETs gives smoother control over the whole
range of speeds, while SCR-based controllers are more robust, and give their
best control at mid-to-high-range speeds.
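
[The slope argument is easy to check numerically: a one-degree change in
firing point moves the turn-on voltage by a few volts near the zero
crossings, but by almost nothing near the peak (same assumed 160 V peak as
above):]

    import math

    V_PK = 160.0   # assumed peak, as before

    def v_at(deg):
        return V_PK * math.sin(math.radians(deg))

    for deg in (5, 90, 175):
        dv = v_at(deg + 1) - v_at(deg)
        print(f"near {deg:3d} deg: 1 deg of adjustment moves turn-on voltage {dv:+6.2f} V")
    # -> roughly +2.8 V, -0.02 V, and -2.8 V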