#1
I've read every electricity intro I can find, but I still can't find the answer to one question about amperage: does a device (e.g. a computer peripheral rated 12 volts DC, one amp) draw only the amperage it needs, or does it accept whatever amperage it is given? To put it another way, if I powered the device above from a 12 volt, *3* amp supply, would it operate safely and normally, or would the extra amperage overload it?

My instinct, knowing how amperage and wattage are related, is that a device only draws what it needs. If that's the case, a power supply rated for more amperage than the device requires should let it operate normally and safely, probably for a long time, since it's delivering less power than it's built to provide. Still, I don't want to melt a perfectly good piece of equipment just because I refused to admit my ignorance.

Your help is appreciated.

Brennan
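P.S. Here's the arithmetic I'm picturing, as a minimal sketch (the numbers match the example above; the assumption that the device draws a fixed power at its rated voltage is mine, not something I've confirmed):

```python
# Minimal sketch: a supply's amp rating is a ceiling on what it CAN
# deliver, not an amount it pushes out. Assumes the device draws a
# fixed power at its rated voltage (a simplification; real loads vary).

def current_drawn(voltage, power):
    """Current the device actually draws: I = P / V."""
    return power / voltage

device_voltage = 12.0                # volts, the device's rated input
device_power = device_voltage * 1.0  # 12 W at the rated 1 A draw

supply_capacity = 3.0                # amps the supply is able to deliver

draw = current_drawn(device_voltage, device_power)
print(f"Device draws {draw:.1f} A; supply can deliver {supply_capacity:.1f} A")
# Device draws 1.0 A; supply can deliver 3.0 A -- the other 2 A is
# unused headroom, so the oversized supply should be safe.
```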