Posted to sci.electronics.repair
From: who where
Subject: Testing NiCd/NiMH cell ESR or SLA Gel Cell Battery Internal Resistance

On Sat, 16 Jan 2010 08:06:06 -0500, "Wild_Bill" wrote:


"who where" wrote in message
news
On Fri, 15 Jan 2010 18:48:02 -0500, "Wild_Bill"
wrote:

I believe that just because a battery cell test shows a full charge doesn't
mean that the full charge can be gotten out of the cell (imbalance).

Checking ESR at random intervals could indicate if cells are unbalanced, I
think.


I don't understand why you believe ESR and imbalance have any
relationship whatsoever.


I don't presume to know.. internal resistance is a factor, maybe not the
most critical, but a parameter that could be useful. As I've said, I believe
ESR/IR is more useful than a simple battery test meter.
As the cell's IR increases, its capacity drops, from what I've read.


No. Note TWO relationships: (1) Internal resistance will have an
effect on the usable capacity, in that it will result in earlier
discharge termination because the terminal voltage will droop more
under load. (2) As the cell's usable capacity reduces due to "wear and
tear", the internal resistance will usually rise for similar reasons.
But IR is NOT the cause of reduced capacity.
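
To put rough numbers on (1), here's a quick Python sketch. The EMF,
cutoff, load current and IR figures are purely illustrative
assumptions, not data from any datasheet:

# Relationship (1) in numbers: higher IR drags the terminal voltage
# down under load, so a fixed cutoff is reached earlier.  All values
# here are illustrative assumptions, not measured data.
EMF = 1.25      # open-circuit cell voltage (V), assumed
CUTOFF = 1.00   # discharge-termination threshold (V), assumed
LOAD = 2.0      # discharge current (A), assumed

for r_int in (0.020, 0.050, 0.100):  # ohms: new, aged, worn (assumed)
    v_term = EMF - LOAD * r_int      # V = EMF - I*R
    print(f"IR={r_int*1000:3.0f} mohm  Vterm={v_term:.3f} V  "
          f"margin to cutoff={v_term - CUTOFF:.3f} V")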

Charging at C/10 is said to balance cells, but how often? Having a means
to indicate (actually see) imbalance would give the user an idea of how
often.

Yes, BUT most charging regimes for NiXX chemistries are current-based
and overcharge the cells somewhat. This provides SOC balancing.


The info I've found indicates the opposite.


Well you can't have it both ways. About 8 lines above, you state that
"Charging at C/10 is said to balance cells". Pick one.

Imbalance isn't improved by continuing to recharge in the usual manner (the same charging techniques).


First let's be clear that we are talking about SERIES-connected cells.
When exposed to charging current cells will recharge. If cells in a
series string have a different SOC at the commencement of recharge,
the *normal* (i.e. over-) charging regimes - including the "C/10 for
14 hours" - will ensure that the cells with lower SOC are dragged
closer to the SOC of the others. This may or may not take more than
one charge cycle, but it will happen.
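
A toy model of that catch-up, in Python. This is my own illustration,
assuming only that charge current into a full cell is dissipated
(heat/gas) rather than stored:

# Series string: every cell sees the same charging current, but a cell
# already at 100% SOC wastes it, so the laggards catch up.
# The starting SOCs and the C/10-for-14-hours regime are assumptions.
soc = [0.70, 0.85, 1.00]   # per-cell state of charge, assumed

for hour in range(14):     # C/10 for 14 hours
    soc = [min(1.0, s + 0.10) for s in soc]  # +C/10 per hour, clamped

print(soc)                 # -> [1.0, 1.0, 1.0]: the string is balanced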

What this means in terms of cells with different usable capacities (as
distinct from different internal resistances) isn't a real issue.
Discharge is another story.

I believe that imbalance is related to a battery pack's longevity.


It certainly can/does lead to cell reversal under deep discharge,
which can damage a pack.

And since most battery users can't improve or modify their existing
chargers' behaviors, they're stuck with what they have.


Agreed. We disagree though on what that means.

I prefer to test capacitor ESR and other parameters before I install them,
as there are always inconsistencies in manufacturing.


Do whatever you feel important. I wouldn't bother.


No, I don't routinely retest them at regular intervals after installation,
but when circuit performance is poor, ESR testing is generally the first
step.


Whose first step?


I was referring to capacitors here.


OK, my error.

so ask anyone who regularly performs
service or repair work how important their ESR testers are.


I use my ESR meter frequently, but extremely rarely on anything except
electrolytic caps..

The only reason I would want to zap nicad cells is to actually see if the
internal resistance is improved/lowered. I believe an ESR test would be a
good indication of any change that takes place.


"zapping" NiCd cells will *only* (and then temporarily) remove
conductive dendrites. If the cells have dendrites then they are
already on their last legs, and should be discarded before they cause
you grief. If they don't have dendrites, *zapping" will achieve
nothing, but hey, whatever floats your boat.


Battery manufacturers and serious users disagree.. they've proven that
zapping new, high-quality NiCd cells increases output voltage levels and
lowers the ESR/IR.


If battery manufacturers believed that, they'd all be doing it
routinely before shipping their product.

The reported capacitor values and zapping voltages vary, so I don't think
there is an ultimate combination.


I believe it is just more snake oil, but ....

As I mentioned earlier, I'm not interested in trying to revive old, weak
cells.

Otherwise, an ESR test after cells have been used a while may be a better
indication of their condition, and possibly their reliability.



ESR (or more correctly, internal resistance) tests on a healthy cell
may indicate the extent of irreversible deterioration. To that
extent they may provide some useful information.


I suspect that an unbalanced cell in a battery pack is bad for all the
cells in the pack, as it's likely to generate more heat, and possibly
result in an incomplete or overcharged state.


Yes, but ESR and imbalance are unrelated.


I believe that battery industry professionals consider ESR/IR to be a
significant factor in cell construction and later for end-user performance.


You *believe* that "they consider"? Hmmm?

I've seen a lot of inconsistent statements regarding battery maintenance
and performance.
One source that appears to be reliable is
http://www.buchmann.ca/default.asp


Pffft!

http://www.batteryuniversity.com/
Essentially the same info, written by the founder/inventor of the Cadex
equipment. I'm not just referring to info found in an afternoon's browsing.
I've been examining battery info/data for several years. As I stated
earlier, I mostly disregard much of the info in the hobby-type forums that
appear to be written by inexperienced kids.
I'm able to spot reports that lack scientific perspective (just opinions
based upon voodoo techniques).
I suspect that much of the older info doesn't relate very well to today's
newer cells.

The medium-sized UPS gear we used (3x 300kVA/40 minutes) had an
integrated real-time battery condition monitoring system. This simply
applied a known pulsed load to the battery systems (at multiple tap
points) and monitored the corresponding voltage steps, which gave a
direct indication of cell resistance. If any segment of a battery
string reached a threshold value, that string was taken out of service
and cell-by-cell testing was undertaken, followed by any required
remedial action.
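
The arithmetic behind that monitor is just Ohm's law on the voltage
step. A sketch of roughly what it does; the readings, pulse current
and threshold here are all made up:

# R = delta_V / delta_I per battery-string segment under a known
# pulsed load.  All figures below are illustrative assumptions.
I_PULSE = 25.0      # known pulsed load current (A), assumed
THRESHOLD = 0.015   # alarm level per segment (ohms), assumed

# (voltage before pulse, voltage during pulse) at each tap point
segments = [(13.05, 12.83), (13.02, 12.81), (13.04, 12.60)]

for n, (v_rest, v_pulse) in enumerate(segments, start=1):
    r = (v_rest - v_pulse) / I_PULSE   # Ohm's law on the step
    note = "  <-- pull string for cell-by-cell testing" if r > THRESHOLD else ""
    print(f"segment {n}: {r*1000:.1f} mohm{note}")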


I wonder what action. A series of recharge/discharge cycles and retesting?
Reconditioning in-house, or sent out?


Replacement.

For UPS duty, problem batteries should probably just be replaced. The
potential cost of a failure could be enormous.


Exactly. Without being specific, the fact that we had 3x 300kVA units
would suggest that our operation and its continuity were considered
more important than the cost of replacement cells. And before you ask,
suspect cells were replaced individually based on these tests, and
when the number of replacements reached a certain percentage of the
string, the whole string was replaced.

You could set up a similar tester for manual operation (similar to
mike's description) and gain useful insight into internal resistance.

A light bulb and DVM tester could be a handy gauge for initial checks.
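
That's the same arithmetic as the UPS monitor, done by hand: one DVM
reading unloaded, one with the bulb connected, and the bulb current
either measured or estimated from its rating. With illustrative
numbers:

v_open = 1.32   # DVM reading, no load (V), assumed
v_bulb = 1.24   # DVM reading with bulb connected (V), assumed
i_bulb = 0.50   # bulb current (A), measured or from its rating

r_int = (v_open - v_bulb) / i_bulb   # R = delta_V / delta_I
print(f"internal resistance ~ {r_int*1000:.0f} mohm")   # ~160 mohm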

My reason for asking about utilizing an ESR tester was that ESR testers
display an ESR reading, and readings are easily compared. The ESR tester
could likely be used on battery packs regardless of the pack's state of
charge (SOC).

As Phil A pointed out, the Bob Parker ESR meter could be useful for battery
packs


yup

but I may need to find my milliohm meter to try on individual cells,
if it's capable.


A "normal" (milli-)ohm meter wont give you any useful indication from
a charged cell except maybe smoke signals. The Bob Parker ESR meter
(I use one) uses an AC excitation and measures the voltage across the
target device that results, effectively removing the DC component
which will generally smoke your ohm-meter
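
Conceptually the DC removal amounts to this (an illustrative model
only, not the Parker meter's actual circuitry; EMF, ESR and test
current are assumed values):

import math

# A charged cell's EMF is a constant offset in the measured waveform;
# subtract the mean and only the AC drop across the internal
# resistance remains.
EMF = 1.25    # cell DC voltage (V), assumed
ESR = 0.05    # internal resistance to recover (ohms), assumed
I_AC = 0.10   # amplitude of injected AC test current (A), assumed

v = [EMF + ESR * I_AC * math.sin(2 * math.pi * k / 32) for k in range(32)]

dc = sum(v) / len(v)                # the DC component (= EMF)
v_ac = max(abs(s - dc) for s in v)  # AC amplitude across the cell
print(f"ESR ~ {v_ac / I_AC * 1000:.0f} mohm")   # recovers ~50 mohm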

A poor IR reading on a new cell (compared to the other new cells from the
same purchase) would be something that I'd want to be aware of.


Indeed.