Posted to sci.electronics.repair
From: Jeff Liebermann
Subject: Unused Li-ion battery pack

On Mon, 05 Oct 2009 16:45:09 +0800, who where wrote:

> It was a jumper on pin headers, extending it to the front panel would
> be a snap.


I was thinking more of something with a built-in Ethernet or USB port.
All the charge parameters can be set up on a web page. Once one has a
suitable processor, adding features such as battery history, battery
test, counterfeit detection, run time calculation, and fire detection
is mostly software. I'm halfway inspired to design and build one
for myself but suspect that I can't make much money on it at consumer
price levels.

> The controller I used was the MAX1737.
> http://www.maxim-ic.com/quick_view2.cfm/qv_pk/2217
> You get a certain amount of flexibility designing around it, and we
> (the client and I) preferred their regime to the others we
> considered.


Ok, but that's pure analog. Analog is not a problem, but it does limit
what weird things can be done with a Li-Ion charger. I was thinking
more in the way of a digital (i.e. PIC controller) design, such as:
http://www.microchip.com/stellent/idcplg?IdcService=SS_GET_PAGE&nodeId=1406&dDocName=en024090
http://ww1.microchip.com/downloads/en/DeviceDoc/51515a.pdf
and increasing the output current capabilities to run larger battery
packs. For example, when the battery pack is in the charger and
allegedly fully charged, it would be fairly easy to apply a load and
discharge it for perhaps a minute or more. The asymptote of the
terminal voltage curve can then be extrapolated to produce an estimated
run time. Some of the remote battery management systems already do
this quite accurately. One could also include some RAM and add a data
logger and coulomb counter (amp-seconds). The area under the current
curve is the charge moved in and out; multiplied by the terminal
voltage, it gives the charging and discharging energy. That would give
a good clue as to the battery pack's comparative quality (something
that I suspect the manufacturers would not be interested in supplying).
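
Roughly, both calculations might look like the Python sketch below.
The one-minute sample data, the 3.0V cutoff, and the function names
are invented for illustration, not taken from any real charger:

def estimate_runtime_s(t_s, v_term, v_cutoff=3.0):
    """Least-squares line through terminal-voltage samples taken under a
    fixed load, extrapolated to the cutoff voltage for a run time estimate."""
    n = len(t_s)
    mean_t = sum(t_s) / n
    mean_v = sum(v_term) / n
    num = sum((t - mean_t) * (v - mean_v) for t, v in zip(t_s, v_term))
    den = sum((t - mean_t) ** 2 for t in t_s)
    slope = num / den                 # volts per second, negative under load
    if slope >= 0:                    # no sag yet; can't extrapolate
        return float("inf")
    return (v_cutoff - mean_v) / slope + mean_t

def coulombs(t_s, i_a):
    """Trapezoidal integration of current samples: amp-seconds (coulombs).
    Multiply each slice by terminal voltage to accumulate watt-seconds."""
    q = 0.0
    for k in range(1, len(t_s)):
        q += 0.5 * (i_a[k] + i_a[k - 1]) * (t_s[k] - t_s[k - 1])
    return q

# One-minute load test: a voltage sample every 10 seconds at a constant 1A.
t = [0, 10, 20, 30, 40, 50, 60]
v = [4.08, 4.05, 4.03, 4.01, 3.99, 3.98, 3.96]
i = [1.0] * len(t)
print("estimated run time: %.0f s" % estimate_runtime_s(t, v))  # ~556 s
print("charge removed: %.1f C" % coulombs(t, i))                # 60.0 C

A straight line is the crudest possible fit; the BMS parts that do
this well presumably fit the sag against stored discharge profiles for
the cell chemistry.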

> The selection was vendor-made based on the end user's stated role, and
> was selected in _conjunction_ with pack sizing. Any time the role
> looked like prolonged high SOC and temperatures above 30C, he'd go
> with 4v10 (a 4.10V/cell charge limit) and the pack size would then be
> determined. He didn't want premature failures, unlike laptop makers
> who don't give a rat's.


Good plan. However, there's always going to be the customer who
plugs in a new battery pack, runs it as long as possible, and then
proclaims that they're not getting the specified run time.

> Maybe I missed your objective. I understood it to be determining when
> to stop discharge (in the laptop) to achieve a chosen SOC. Any time
> you are playing with discharge the laptop activity is fundamental, but
> for a known target SOC it isn't hard to invoke a known task (e.g.
> screen saver) and do the linear maths.
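
The linear maths being roughly this (Python; the pack capacity, drain
rate, and SOC figures are made-up illustration values):

def discharge_time_h(soc_now, soc_target, pack_mah, task_drain_ma):
    """Hours to run a fixed task (screen saver, etc.) so the pack lands
    at the target state of charge, assuming a constant drain rate."""
    if soc_target >= soc_now:
        return 0.0
    mah_to_remove = (soc_now - soc_target) * pack_mah
    return mah_to_remove / task_drain_ma

# Drop a 4400mAh pack from 95% to a 40% storage SOC at a 900mA drain:
print("%.1f hours" % discharge_time_h(0.95, 0.40, 4400, 900))  # 2.7 hours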


It was to see how close to a full charge was being used and where the
battery droop detection was set. That's measured in energy
(watt-seconds) or run time (hours). I have a Kill-a-Watt meter that
measures power consumption from the 117VAC line. I run the laptop
only from the charger, with the battery removed, for about 30 minutes
(the limit of my attention span), doing what I consider to be a typical
applications mix. The Kill-a-Watt meter records the watt-seconds
(actually watt-hrs) used. It also compensates for power factor. I
throw in the switcher efficiency of about 85-90%:
http://www.energystar.gov/index.cfm?c=ext_power_supplies.power_supplies_consumers
and calculate the energy consumption per hour of use. I use that as
the average load for run time calculations.
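
The arithmetic works out something like this (Python again; the meter
reading, efficiency, and pack capacity below are illustrative numbers,
not my measurements):

wall_wh    = 12.0    # Kill-a-Watt reading over the test session (watt-hrs)
session_h  = 0.5     # 30 minute test, battery removed
efficiency = 0.875   # middle of the 85-90% switcher estimate

avg_load_w = (wall_wh / session_h) * efficiency  # average DC load on battery
battery_wh = 48.0                                # rated pack capacity
print("average load: %.1f W" % avg_load_w)                         # 21.0 W
print("estimated run time: %.1f hrs" % (battery_wh / avg_load_w))  # ~2.3 hrs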

--
Jeff Liebermann
150 Felker St #D
http://www.LearnByDestroying.com
Santa Cruz CA 95060 http://802.11junk.com
Skype: JeffLiebermann AE6KS 831-336-2558