Electronics (alt.electronics)

#1  B
Once and for all... Amperage?

I've read every electricity intro that I can find, but I still cannot find
the answer to one question about amperage:

Does a device (e.g. a computer peripheral, 12 volts DC, one amp) use only
the amperage that it needs, or does it accept whatever amperage it is
given? To put it another way, if I fed the above example device from a
power source of 12 volts, *3* amps, would the device operate safely and
normally, or would the extra amperage overload it?

My instinct tells me that, knowing how amperage and wattage are related, a
device will only draw what it needs. If that is the case, then a power
source with amperage to spare will let the device operate normally, safely,
and for a very long time, since it is using less power than the supply is
built to give. Still, I can't risk melting a perfectly good piece of
equipment just because I refused to admit ignorance.

Your help is appreciated.

Brennan


#2  Alexander


"B" schreef in bericht
news:sy%Ie.28780$HV1.4937@fed1read07...
I've read every electricity intro that I can find, but I still cannot find
the answer to one question about amperage:

Does a device (e.g. computer peripheral, 12 volts DC, one amp) use only
the amperage that it needs, or does it accept the amperage that it is
given. To put it another way, if I provided the above example with a
power source of 12 volts, *3* amps, would the device operate safely and
normally, or would the extra amperage overload it?

My instinct tells me that, knowing how amperage and wattage are related, a
device will only use what it needs. If that is the case, then a power
source with too much amperage will allow the device to operate normally,
safely, and for a very long period of time since it's using less power
than the power source is made to give. Nevertheless, I can't melt a
perfectly good piece of equipment because I refused to admit ignorance.

Your help is appreciated.

Brennan

Most PSUs (power supply units) supply a voltage, and that voltage is held
stable at the value the supplier states.
If you have a 12 V / 1 A power supply, the voltage is stable up to a 1 A
load. If you go above 1 A, some supplies will let the voltage sag, others
may overheat or shut down.
If you put a 24 ohm resistor across this PSU, the current drawn from it
will be 0.5 A, because U = I * R, so I = U / R = 12 V / 24 ohms = 0.5 A.
So you can safely use a 3 A PSU where you only need 1 A.

One warning: some PSUs are really designed for one particular load, and
the voltage will change if the load changes too much.

As for the question you asked: the device must have a 12 V power supply,
and it draws only the current it needs. The 1 A is the maximum under
normal operation/startup. So if you add more devices to one power source,
you should add up the amps given and make sure the PSU can supply at least
that amount (if you use all the devices at once).
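To make that arithmetic concrete, here is a small Python sketch of the
same sums. The device currents in it are made-up example numbers, not
from any real datasheet:

# Minimal sketch of the arithmetic above; all numbers are only examples.
SUPPLY_VOLTAGE = 12.0   # volts
SUPPLY_MAX_A = 3.0      # amps the PSU is rated to deliver

def current_through_resistor(volts, ohms):
    # Ohm's law: I = U / R
    return volts / ohms

# A 24 ohm resistor across a 12 V supply draws 0.5 A.
print(current_through_resistor(SUPPLY_VOLTAGE, 24))   # 0.5

# Several devices on one supply: add up their rated currents and check
# that the PSU can deliver at least that total.
device_ratings_a = [1.0, 0.5, 0.8]   # example nameplate currents, in amps
total_a = sum(device_ratings_a)
print(total_a, "A needed;", "OK" if total_a <= SUPPLY_MAX_A else "PSU too small")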

Hope this gives you more insight and answers your question,

Alexander


#3  B

Yes, that was very helpful. Thank you.

For my current projects, I foresee needing to create special power supplies
quite often. This will probably mostly involve putting together battery
packs, adding resistors, or otherwise tweaking existing power supplies. Do
you (or any reader) know of a good source of education on this subject that
focuses on practice instead of theory? Basically, right now I need to know
how long a battery will last a lot more than I need to know what a sine
wave looks like.


"Alexander" wrote in message
...

"B" schreef in bericht
news:sy%Ie.28780$HV1.4937@fed1read07...
I've read every electricity intro that I can find, but I still cannot
find the answer to one question about amperage:

Does a device (e.g. computer peripheral, 12 volts DC, one amp) use only
the amperage that it needs, or does it accept the amperage that it is
given. To put it another way, if I provided the above example with a
power source of 12 volts, *3* amps, would the device operate safely and
normally, or would the extra amperage overload it?

My instinct tells me that, knowing how amperage and wattage are related,
a device will only use what it needs. If that is the case, then a power
source with too much amperage will allow the device to operate normally,
safely, and for a very long period of time since it's using less power
than the power source is made to give. Nevertheless, I can't melt a
perfectly good piece of equipment because I refused to admit ignorance.

Your help is appreciated.

Brennan

Most PSU Power Supply Units Supplies a voltage, this voltage is stable on
the voltage given by the supplier.
If you have a 12V/1A Power supply, the voltage is stable to 1A load, if
you come above the 1A some might pull the voltage down others my overload
or shut down.
If you pot a 24 Ohms resistor to this PSU the cuurent drown form it will
be 0.5A because U=I*R.
So you can savely use a 3A PSU where you only need 1A.

One warning with this some PSU are really dedicated to one load and the
voltage will change if the load changes to much.

With the question you asked, the device must have a 12V Power Supply and
draws the current it needs. The 1A is the maximum under normal
operation/startup. So if you add more devices to one power source you
should add the Amps given and make sure the PSU can supply at least this
amound (if you use all the devices at once).

Hope this gives you more insight and answers your question,

Alexander



#4  DBLEXPOSURE


"B" wrote in message
news:sy%Ie.28780$HV1.4937@fed1read07...
I've read every electricity intro that I can find, but I still cannot find
the answer to one question about amperage:

Does a device (e.g. computer peripheral, 12 volts DC, one amp) use only
the amperage that it needs, or does it accept the amperage that it is
given. To put it another way, if I provided the above example with a
power source of 12 volts, *3* amps, would the device operate safely and
normally, or would the extra amperage overload it?

My instinct tells me that, knowing how amperage and wattage are related, a
device will only use what it needs. If that is the case, then a power
source with too much amperage will allow the device to operate normally,
safely, and for a very long period of time since it's using less power
than the power source is made to give. Nevertheless, I can't melt a
perfectly good piece of equipment because I refused to admit ignorance.

Your help is appreciated.

Brennan


A computer peripheral's power adapter amp or current rating is a measure of
its capacity to deliver current. Using an adapter with a higher current
rating will not damage the peripheral.



If you are replacing a power adapter, be mindful of polarity, as from time
to time manufacturers will use center-negative power connections.
Connecting an adapter with the wrong polarity could result in damage to
the peripheral. Usually, the polarity will be marked at the power jack
and/or on the adapter.


#5  Chris Head



Hi,
The basic idea with batteries is as follows:

Batteries in series add their voltages. Batteries in parallel add their
current capability or, more usefully, their capacity ("lifetime"). Putting
two identical batteries in parallel will give the same voltage and last
twice as long.
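As a small Python sketch of those two rules (treating the cells as ideal;
the 1.5 V / 2850 mAh figures are just the AA numbers I use further down):

# Combining identical cells, assuming they are ideal (no internal resistance).
CELL_VOLTS = 1.5     # one AA-style cell
CELL_MAH = 2850      # its capacity in milliampere-hours

def pack(series, parallel):
    # Series multiplies voltage; parallel multiplies capacity.
    return CELL_VOLTS * series, CELL_MAH * parallel

print(pack(series=8, parallel=1))   # (12.0, 2850)  -> a 12 V pack
print(pack(series=1, parallel=2))   # (1.5, 5700)   -> same voltage, twice the life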

To get a basic estimate of how long a battery will last, you must first
find out the amount of current (amperage) that your circuit will be
drawing, in milliamperes (mA). 1 A = 1000 mA, so multiply amperes by 1000.
Next, discover (somehow) the capacity rating of the battery, which for
most batteries will be given in milliampere-hours (mAh); it is usually
listed in the manufacturer's datasheet. Divide this by your circuit's
milliamperes, and you're left with hours of lifetime.

Unfortunately, batteries don't provide exactly their rated voltage for
exactly their rated lifetime and then instantly die; their voltage usually
drops slowly over time. Still, this should give a reasonable estimate.

For example, looking on the Energizer website, I found the datasheet for
the AA battery, number E91. The "average capacity" is listed as 2850 mAh,
and the voltage as 1.5 V. If this battery is connected across a 1 kilohm
(1000 ohm) resistor, the current will be 1.5 V / 1000 ohms = 0.0015 A =
1.5 mA, and 2850 mAh / 1.5 mA = 1900 hours (approximately). The internal
resistance of a battery should really be added to the resistance of the
circuit it's powering when calculating current draw, but this battery's
datasheet shows 146 milliohms, or 0.146 ohms, which is pretty
insignificant compared to 1 kilohm.
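Here is the same estimate as a tiny Python sketch, plugging in the
datasheet numbers above. Treat it as a rough model, not a promise from
Energizer:

# Rough lifetime estimate for the E91 example above.
CELL_VOLTS = 1.5        # rated voltage
CAPACITY_MAH = 2850     # "average capacity" from the datasheet
INTERNAL_OHMS = 0.146   # internal resistance from the datasheet

def battery_hours(load_ohms):
    # Load current in mA, including the cell's internal resistance.
    current_ma = 1000.0 * CELL_VOLTS / (load_ohms + INTERNAL_OHMS)
    # Lifetime in hours = capacity (mAh) / current (mA).
    return CAPACITY_MAH / current_ma

print(round(battery_hours(1000)))   # ~1900 hours into a 1 kilohm load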

If you get into electronics as a hobby in any big way, you'll probably
want to find a bench power supply. Mine is just a really basic one, able
to supply 0-30 V DC at 0-5 A with current limiting. Depending on what kind
of circuit you're building, a bench supply with a current-limiting mode
might even save you from blowing up an expensive component after a wiring
error, and of course you can dial in any voltage you want, whereas it's
pretty hard to get 5 V (e.g. for ICs) using only household batteries.

Anyway, you sound like a newcomer to electronics... have fun!

Chris


#6  mhyk

Amperage: current flow, or electron flow, is measured in amperes. While
we normally consider one ampere to be a rather small current of
electricity (approximately what a 100-watt light bulb would draw), it is
actually a tremendous flow of electrons: more than 6 x 10^18 (six
quintillion) electrons a second are required to make up one ampere.
You can find more on this at this link:
http://www.tpub.com/content/construc...4250/index.htm
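You can check that figure yourself with a couple of lines of Python,
using the elementary charge (about 1.602e-19 coulombs per electron):

# 1 A = 1 coulomb per second; divide by the charge of one electron.
ELEMENTARY_CHARGE_C = 1.602e-19          # coulombs per electron
electrons_per_second = 1.0 / ELEMENTARY_CHARGE_C
print(f"{electrons_per_second:.3e}")     # ~6.242e+18 electrons per second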

#7  Alexander

"mhyk" schreef in bericht
oups.com...
(amperage:Current flow, or electron flow, is measured in
amperes. While we normally consider that one ampere is a rather small
current of electricity (approximately what a 100-watt light
bulb would draw), it is actually a tremendous flow of
electrons. More than 6 billion electrons a second are required to
make up one ampere)
you can find more of this in this link
http://www.tpub.com/content/construc...4250/index.htm


Actually, a 100 watt light bulb would draw less than half an ampere here:
100 W / 230 V is about 0.43 A.

You see, not all people live in the US, where they are still using an
outdated 110 V mains.

