OK. HERE IT IS. SAG. VOLTAGE. MORE

I’ve heard a lot the last couple of weeks regarding our top balance/bottom balance situation. Most of the online forum engineers did NOT skip a beat. They went off current shunt balancers pretty much cold turkey, but immediately lunged to monitoring individual cell voltages for a “low voltage limit” as the holy grail of the NEW BMSpeak. It was awe-inspiring to watch. The entire conversion from shunt balance to LV cell limits took what looked like 8 minutes. Several new forum topics were started on the new design, and ONE guy is already taking orders for his NEW design. I had a bit of a private e-mail conversation with him to avoid embarrassing him, and it became immediately clear he had no clue how a charger worked to charge a battery.

So today, after a week’s work on this opus, I’ve posted a 102-minute epic adventure on the website at http://evtv.me on how to charge a LiFePO4 battery, and what a constant current/constant voltage charge curve is. If you already know, FEEL FREE TO SKIP THIS ONE. It was boring even for me.

But what is lost in all this is the question I keep asking them, and they seem TOTALLY UNFAZED by it: you are monitoring for WHAT, and WHAT will you do with the information if you find it?

The problem of course is that voltage is a function of current, based on the internal resistance of the cell. And that internal resistance varies both ACROSS the entire discharge curve and by temperature. So whatever voltage you set will cause a different result at a dramatically different temperature. A design tested in December will not give the same results in June. And the REAL problem is the sag itself. The voltage sags at the beginning of the discharge curve, but it sags MUCH more at the end of it. And at some discharge currents it can drop to very low values with NOTHING wrong with the cell, in fact with everything very very RIGHT with the cell, since it can deliver that kind of current.
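
To put rough numbers on that, here is a small sketch of the Ohm’s-law sag picture. The open-circuit voltages and internal resistances below are made-up round figures, not measurements of any particular cell; the point is only that one fixed alarm threshold behaves completely differently as resistance moves with temperature and state of charge.

```cpp
// Illustrative only: the open-circuit voltages and internal resistances below
// are assumed round numbers, not measured values for any particular cell.
#include <cstdio>

int main() {
    const double alarm_v   = 2.5;    // a hypothetical "low voltage" alarm setpoint
    const double current_a = 300.0;  // a brisk acceleration current

    struct Case { const char* label; double voc; double r_ohm; };
    const Case cases[] = {
        {"warm cell, near full charge", 3.30, 0.0020},
        {"cold cell, near full charge", 3.30, 0.0040},
        {"warm cell, ~90% discharged",  3.05, 0.0045},
    };

    for (const Case& c : cases) {
        double v_loaded = c.voc - current_a * c.r_ohm;  // simple Ohm's-law sag
        std::printf("%-30s %.2f V under %.0f A -> alarm %s\n",
                    c.label, v_loaded, current_a,
                    v_loaded < alarm_v ? "TRIPS" : "stays quiet");
    }
    return 0;
}
```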

So if I WANTED to set an alarm, and could magically do it at NO effort or expense, such that if ANY cell fell below some voltage it simply set off an alarm, I would have NO idea what voltage to set it at anyway. Or what to do with the information if it went off.

But I have gotten some new equipment and can do some pretty good current rates out of single cells. I did some discharging of a Thundersky 160 Ah cell at 300 amps, with a duty cycle of 20 seconds of discharge followed by 20 seconds of rest, repeated until full discharge. It appears this CAN cause a modest Peukert effect.

Today, I had a bit more productive fun with a Thundersky 90 Ah and a Sky Energy 100 Ah cell. Topped them both off quite nicely, and then did a discharge curve in an odd combination of the West Mountain automated load and writing down the values by hand. I set it to discharge for 180 seconds and rest for 60. For both cells, that worked out to about 5% increments of total capacity for each 180-second pulse: 4.5 AH for the Thundersky and 5.0 AH for the Sky Energy.

What I found was encouraging and discouraging. First, these little batteries are REMARKABLE devices in how much power and current they can spew for hours on end. Discouraging? Well, I had been working on a little Arduino board to drive the ancient fuel gage in the Speedster’s original combi gage. Had it working pretty good and just assumed that the voltage decline, while very slight, would be very linear and at least detectable. Wrong.

I’ve been spending way too much time on the ugly end of the discharge curve. Up at the GOOD end, where you all want to be, the voltage differences are very small. Several people have told me you cannot monitor pack state of charge by voltage. I pretty much blew them off. The voltage changes with discharge, I’ve got meters. I can measure it.

Well, I STAND CORRECTED. They were precisely correct. Up in the good part of the curve, the change in voltage is so negligible that it is just not very useful. I will have to do AH counting with the Arduino instead of simple voltage measurements. No fuel gage based on voltage will work.

Why did I think it would? Well down on the ugly end, if you can get unloaded for a minute and check your pack voltage, that’s a pretty GOOD indicator down there. When it goes below 3.00 v static, you are pretty much done. And I’ve always said that there wasn’t much left after that.

Well how TRUE TRUE TRUE. And with good reason. Your static voltage will dip below 3.00 v at about the 95% discharge level. And there really ISN’T any left. But recall that we get 2000 cycles at 80% DOD and 3000 cycles at 70% DOD. Up on THAT part of the curve, voltage is essentially useless.

See the graphs below. They tell the story.

I guess the only other interesting thing is that the Sky Energy cells, which I had said seemed to have a flatter discharge curve, seemed that way because they have a flatter discharge curve. Actually quite a bit flatter.

So a couple of things to learn from this. 1. Current load causes dramatic changes in voltage, but they are almost impossible to relate to useful information. They are just going to sag under heavy currents. These graphs are at 1C. It is much worse at 3C or 4C.

2. A fuel gage for an electric car using LiFePO4 cells, one that is useful in the upper range of the battery and useful for detecting either 70% or 80% depth of discharge, can NOT use voltage as a metric. It MUST count actual ampere-hour usage. I know of no other way.
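
For what it’s worth, here is a minimal sketch of what that ampere-hour counting could look like on an Arduino-class board. The pin number, the current-sensor scaling, and the pack capacity are placeholders, not values from any actual installation.

```cpp
// Minimal coulomb-counting sketch for an Arduino-class board. The pin number,
// the sensor scaling, and the pack capacity are placeholders, not measured
// values for any particular hall-effect sensor or pack.
const int   CURRENT_PIN     = A0;     // analog input from a hall-effect current sensor
const float AMPS_PER_COUNT  = 0.5;    // assumed sensor scale factor (amps per ADC count)
const float ADC_ZERO_OFFSET = 512.0;  // assumed ADC reading at zero current
const float PACK_AH         = 100.0;  // rated capacity of the pack in amp hours
const float DOD_LIMIT       = 0.80;   // treat 80% depth of discharge as "empty"

float ampHoursUsed = 0.0;             // net charge out of the pack so far
unsigned long lastMillis = 0;

void setup() {
  Serial.begin(9600);
  lastMillis = millis();
}

void loop() {
  unsigned long now = millis();
  float hours = (now - lastMillis) / 3600000.0;   // elapsed time in hours
  lastMillis = now;

  // Negative readings (regen or charging) subtract from the running total.
  float amps = (analogRead(CURRENT_PIN) - ADC_ZERO_OFFSET) * AMPS_PER_COUNT;
  ampHoursUsed += amps * hours;

  float dod = ampHoursUsed / PACK_AH;             // depth of discharge, 0..1
  Serial.print("DOD %: ");
  Serial.println(dod * 100.0);
  if (dod >= DOD_LIMIT) {
    Serial.println("80% of the pack used -- time to be home.");
  }
  delay(100);
}
```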

Enjoy.

Jack Rickard

http://EVTV.me

34 thoughts on “OK. HERE IT IS. SAG. VOLTAGE. MORE”

  1. This is great stuff. (I’m Steve from Los Angeles, the guy who asked about sagging below 2.5 per cell last night.)

    So two months ago, perhaps at a 3C or 4C draw on my hill, at around a point comparable to the 30% discharge bar on your Thundersky graph, my static voltage per cell would have looked like 3.25 and my voltage per cell under load would have looked like 2.87 volts.

    But ever since a day of severe battery abuse where I drove the 60AH batteries for 76 amp hours, the same static voltage level of 3.25 would now be associated with a voltage per cell under load of 2.5 volts.

    The temperature 2 months ago might have been 20 or 30 degrees Fahrenheit warmer. Would that account for all the difference? Or could all of the batteries have degraded somewhat from the abuse, so they now have fewer amp hours to offer? Or could one battery be bad and disrupting the others?

    Thanks,

    Steve

  2. PS!

    Half hour into your video: If my 60AH Thunderskies are rated to go down to 2.5 volts when drawing 30 amps, and I need, let’s say, 120 amps to get up my hill at 45 mph at the end of the day, is there a formula that I can use to determine how many volts I can go down to when drawing that 120 amps (I assume it would be some voltage level less than 2.5)?

    Thanks,

    Steve

  3. Keep in mind the manufacturer uses a universal power rating. C, or in this case 0.3C, is the normal or graphed discharge and charge rate. So for your 60 Ah battery, 0.3C is 18 amps draw (or charge). Make sure you factor in your actual C: in this case your 60 Ah batteries = 60 amps = 1C. He was using larger batteries (100 Ah), so his 0.3C is 30 amps. So, assuming TS used 0.3C for their graphs, you should be good at 18 amps draw down to 2.5 V. Or get an amp usage meter and make sure not to pull more than 42 amp hours (70%) from your cells for long life (at less than 3C, where TS cells are rated). 30 amps for your cells is 0.5C, so you would have a bit more left (but not much) if you hit 2.5 V at 0.5C; time to be close to home, however.

  4. On their website Thundersky had a chart showing .5 C standard discharge rate. But the real question is, if you know the rate (whatever it is), and you know how low the voltage should go at that rate, is it possible to extract how low the voltage should go for other rates?

    Also, at the beginning of the video, Jack explains what happens chemically inside a LiFePO4 battery when it is over-discharged. However, I didn’t catch anything from him in terms of what happens to over-discharged batteries in terms of symptoms: for instance, what is the behavior of the batteries that don’t get completely killed by over-discharge…

    Steve

  5. First: Merry Christmas to the whole “Rickard-Team”.

    I just saw your last video about charging (finally got some time to watch it).
    I want to thank you for refreshing that issue and giving us (me) some basic practical knowledge about charging, and why you are so unhappy with the “upper voltage cutoff and burn energy” thing.

    I had been torn about this issue since I started my conversion a few months ago; now I will take my “Goodrum/Fechter” and get it off my conversion.

    Your dissertation gave me the final affirmation of my thoughts.

    Thanks and a great Xmas,
    Michael
    http://e-vw.blogspot.com/

  6. I guess the important thing to understand about low voltage cutoff is that it IS dependent on current load. When TS or SE talks about a cutoff voltage, they are referring to the voltage at the “standard discharge rate” on the sheet.

    There is a formula for other currents, but it is essentially useless. If you measure your internal resistance by comparing voltages at two currents, you can then apply that to other currents. For example, if you read 3.3 volts at no load, and 3.1 volts at 100 amps, you can take the difference in voltage (0.2v) and divide it by the current (100 amps) to determine internal resistance (in this case 2 milliohms). Then at 300 amps you would see a 0.6 v difference, or 2.7 v with the cell fully charged, and at 500 amps you should see 2.3 vdc. (A small code sketch of this arithmetic appears at the end of this comment.)

    This is why I say monitor for WHAT? A 2.3vdc cell voltage would be PERFECTLY NORMAL for a FULLY CHARGED cell in the Speedster at 75 mph.

    Why is the formula not very useful? The internal resistance CHANGES as the battery discharges. It also CHANGES with temperature. And these changes are not noise-level changes; they are as large as the original resistance. You can SEE this in the difference in voltages on the graph: as we get up to about 90% discharged, the SAME current load causes a much LARGER dip in voltage than it does at 10% discharge.

    Which in fact IS my point. Monitor FOR WHAT? What voltage are you going to set this magic alarm for? And what are you going to do with the information? Under what I see discussed, the BMS would cut off my car at 10% DOD if I go over 60 mph. Why would I want that? And if my static voltage goes to 2.7vdc and I am seconds from losing the entire pack, it won’t go off at all?

    The urgency to DO something to protect the admittedly not inconsequential investment in the battery pack is driving people to do stupid and expensive things that not only offer NO protection, but might actually harm their vehicles.

    Get a good AH counter. We use EVision and we’re testing TBS. The batteries will put out all the current you want, but when the AH count indicates you’ve gotten your 80%, be home.

    Steve, you can’t get 16 pounds of potatoes out of a 10 pound bag. It is true that some of these cells come in a little over spec. But if you take 76 AH out of a 60AH cell, yeah, you’re going to hurt it. It will never be the same. I’m amazed you didn’t kill it outright. But you probably will never get the capacity of the original cell again. They will have to be replaced.

    They do not like to be over charged, and in fact prefer to be undercharged. They do not like to be overdischarged, and in fact will die if you do this. Undercharge a little, and then limit them to 70 or 80% by AH count. They’ll last essentially forever.

    Jack Rickard
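
    A rough sketch of the two-point internal-resistance estimate described in this comment, using the same example numbers (3.3 volts unloaded, 3.1 volts at 100 amps). As noted above, the extrapolation is only trustworthy near the state of charge and temperature where the two points were measured.

    ```cpp
    // Reproduces the worked example above: estimate internal resistance from two
    // voltage/current points, then extrapolate the sag at higher currents.
    #include <cstdio>

    int main() {
        const double v_no_load = 3.3;   // volts at essentially zero current
        const double v_loaded  = 3.1;   // volts at the test current
        const double i_test    = 100.0; // test current in amps

        double r_internal = (v_no_load - v_loaded) / i_test;  // ~0.002 ohm here

        for (double amps : {100.0, 300.0, 500.0}) {
            double v_predicted = v_no_load - amps * r_internal;  // 3.1, 2.7, 2.3
            std::printf("at %3.0f A: predicted %.1f V\n", amps, v_predicted);
        }
        return 0;
    }
    ```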

  7. Thanks Jack.

    I have a cycle analyst on the bike and can use that to make sure I never use up more than 80% of the 60 AH (which would be 48 AH) in a single cycle.

    From what you have said it sounds like I might have to replace all the batteries. I already replaced 2 dead ones and am in the process of replacing one more which is bloated and leaking fluid and showing much steeper discharge and charge curves than the other batteries.

    I am kind of hoping against hope that replacement of that leaky battery might still bring everything back to normal (or reasonably close to normal)…

    Steve

  8. Jack,

    I’d like the opportunity to engage you thoughtfully without being labeled a BMS zealot. I do not claim to know the right answer but I am interested in your reaction to my thoughts.

    I am motivated to come up with methods I can use to protect my battery investment. You asked what the low voltage setpoint should be and what you would do with this information. Well, here is one option:

    As with all options, it is important to set the assumptions for how the batteries are used. I plan to use 100 Ahr Skyenergy cells in a 100-cell series string using a high voltage controller. Maximum current out of the battery is limited to ~200 amps by the controller. I would set the cell low voltage (LV) setpoint to about 2.5 volts. The trip would trigger a warning light on the dash (maybe a chime to get me to look at the dash).

    Here is the basis of my strategy:

    During normal operation, including maximum throttle, in the first 70 percent of battery discharge I do not expect to trip the LV. If the LV trip is actuated during this phase I now know I have a problem. The problem could be a weak cell, an unbalanced cell or some other problem. My reaction to this alarm would be somewhat instinctual; take your foot off the throttle as soon as you can! If you’re in the middle of merging on the highway nothing is stopping you from staying on the throttle until you decide when enough is enough. The alarm must clear when voltage goes back above the LV setpoint.

    What about the far end of the discharge curve, you might ask?

    Here is where this strategy may save one or more battery cells! As you discharge further the voltage naturally droops more for a given discharge rate. So, you may get a LV warning at 100 amps. What should your instinct be? Back off on the throttle! So now you know, since you weren’t even close to full throttle, you’re getting closer and closer to the trailing edge of your battery’s capacity. As more energy gets used by driving a little further you get the warning light at less and less throttle. Now you know you really should go easy on it and REALLY SOON get those batteries charged up.

    As you can easily deduce, I don’t propose this as the right strategy to gauge the battery capacity!

    Also, this being really just a strategy, not to be considered a hard and fast rule, the alarm setpoint may be 2.6v or 2.4v. The LV setpoint would be unique for each application and could be reset if the warning light comes on too early or too late.

    By using this strategy and adjusting your in situ driving in response to the warning you should be able to prevent catastrophic cell damage.

    What about charging?

    I expect my charger to be able to deliver about 15 amps at pack voltage. The basic charge strategy is to charge at max current until pack voltage reaches 350 volts (3.5 volts/cell), then shift to constant voltage (350 volts) until current tapers to 5 amps, then stop the charge. I propose to set the high voltage switch to alarm at ~3.65 volts, as this would indicate a cell out of sync with the rest and would require some attention by the owner.

    This strategy is based on the modest charge current that would occur on a high voltage pack.

    What do you think Jack?

    Respectfully,
    Joe

  9. I am lwalth. I love your electric car; I am very much a fan of electric cars and the people that build them. I have built many Hot Rods, but never an electric one. I would have liked to build one, but I am a disabled Vet and I am happy just to be able to walk. I hope you keep building the cars and posting the information. It is great, and it helps the people like me who would like to build one but just don’t know how. Keep up the good work. I like your site too; it looks very good. I would like to share your postings on our blog site, and you could share our postings if you would like to. We are new but have a very nice blog site; I have a friend who does a great job. If you want to share, please contact us at http://electricdemise.blogspot.com/. Please come and visit, or at least let us know what you think.

  10. Joe:

    I think you’re in way too deep with this. But if that’s what you want, on the charging side it will most likely work. If you are using SE cells and charging to a series voltage representing 3.5 v per cell, an alarm at 3.65 would be great. It will never go off, but it would be great if it did.

    On the LV side, it won’t work. At 200 amps your cells WILL sag below 2.5 v, ALL of them, and almost immediately. In colder weather, they will sag even more. This is when they are fully charged. So your alarm will be going off any time you climb a hill or accelerate briskly.

    So if it ALWAYS goes off, what do you do with the information? Please see our January 15 video on this topic. A really cheap little Chinese battery monitor should satisfy your craving to DO something about your batteries. You can log the data and pore over it in the evening at home until all hours of the night.

    Jack Rickard

  11. Jack,

    Since I don’t have the test equipment and a Skyenergy cell to test I chose 2.5v based on your test data which seems to show ~0.15v drop from no-load for a 100 amp load when less than 70% discharged. It seems reasonable to presume that the 200 amp load would yield ~0.30v drop from no-load.

    If the above assumption holds even approximately true then the loaded voltage would be about 2.94v at about 70% discharge.

    Why do you say it would go below 2.5v at 200Amps? Barring the cold weather, am I missing something? Help me out here with some explanation please.

    It seems that 2.95v expected versus a 2.5v alarm is a big enough gap to only alarm when there is a problem or a cell is getting really low.

    Frankly, the strategy is to set the LV alarm such that it would not energize during normal operations; only when there is a problem.

    This strategy is based on the reality that I won’t go poring over data. I just want a little warning that if I continue to drive I risk damaging a cell.

    Thanks for engaging on this,

    Joe Lyon

  12. Jack,

    I watched your video. I do like the inexpensive battery monitors you showed us; the high and low setpoints seem to apply directly to my proposed strategy. I do admire how thorough you are with the testing you have done. It appears to support my strategy when you take into account the limitation on the maximum charge/discharge rates when the cells are in the car.

    If at some point you happen to be load testing one of the 100 Amp-hr Skyenergy cells, would you please grab a voltage data point at 200 amps? Thanks in advance,

    Joe Lyon

  13. First: thanks so much again, Jack, for doing all this work and most importantly for sharing it. No matter what anyone thinks of your conclusions, sharing all your source material makes your conclusions so much more useful than something one cannot analyze for oneself.

    Joe, I think what you are talking about is similar to what I have been putting together in my mind. Using off-the-shelf BMS components, use the alarm to run a light, not a cutoff; set it a little high, so that it starts to come on during high-amperage events even at higher states of charge, and your brain would just have to figure out when half throttle was causing it, calibrating your brain with the ammeter.
    So starting out, it would take 3C to light the light, but when you got to where 1C or 0.5C was lighting the light, you would start worrying about a cell being of low capacity or out of sync.
    The light would be on a lot, but when it started being on constantly, it would become more obvious.
    In my opinion, the crux is this:
    In all of the solar systems I have built, and it seems to me even more important in a car, you NEED an accurate amp-hour meter to keep use below the pack-damaging point.
    Trying to use voltage to determine whole-pack state of charge is a waste of time, no matter what kind of batteries you are using. With LA (lead acid) it was because of Peukert’s equation; with TS or SE batteries, paradoxically, it is because of the otherwise very desirable flat output curve. I just can’t help wanting something that will warn me of something weird happening in one cell.
    Maybe someone can come up with a warning system that calibrates the voltage signal to the amps being used? It might even be as simple as something that turns on the warning system only at zero amp draw, so as to only warn about “resting” voltage. The quotation marks are to indicate that I understand that resting voltage involves some time at rest, and that in a driving situation that time would be very short.
    It could probably be accomplished by simply having a coil around the pack wire that generates voltage when there is current passing through that wire, which would hold open a relay, preferably an SSR. You could easily calibrate it to come on only when no current was being used.
    Hmm, some testing is in order. Unfortunately I have to build the car first 😉
    Anton

  14. I agree with you, Anton. I really don’t have a problem driving the car. The problem would be letting my DAUGHTER drive the car. My brain, calibrated to a mysterious light that comes on sometimes and not others, isn’t going to clue her in to do anything.

    What I’m basically working on is a little Arduino microcontroller with a LEM HASS 200-S hall effect current measurement device. It will count AH in and AH out. It will also drive two relays. One will cut the throttle to 1/4 of its normal output. The other will disable the controller.

    And it will run a normal automotive fuel gage.

    Her normal operation would be to ignore all lights and fuel gages entirely, and just drive the car. If she drove it past 80% DOD, this device would cut in the throttle relay. The car would then go about 25 mph with max pedal. That would be a big clue to her that something is wrong.

    At 90% DOD, it would disable the controller. Clue 2. (A sketch of this relay logic follows at the end of this comment.)

    I’m frankly just not very worried about a sudden “failure” of a cell. Maybe I should be. But I’m not. I’m concerned with not overdischarging the cells.

    It’s ok to do so on the bench and on a video.

    Not ok in my car.

    Jack Rickard
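
    As a sketch only, here is the two-relay decision logic in Arduino-style C++. The pin numbers and the 100 Ah capacity are placeholders, and ampHoursUsed stands in for the running net total that a coulomb-counting loop (like the one sketched in the article above) would maintain from the hall-effect sensor.

    ```cpp
    // Sketch of the two-relay logic described above, in Arduino-style C++.
    // Pin numbers and capacity are placeholders; ampHoursUsed would be kept
    // up to date by a coulomb-counting loop reading the hall-effect sensor.
    const int   THROTTLE_LIMIT_RELAY = 7;    // relay that cuts throttle to ~1/4 travel
    const int   CONTROLLER_RELAY     = 8;    // relay that disables the controller
    const float PACK_AH              = 100.0;

    float ampHoursUsed = 0.0;   // net amp hours out of the pack (updated elsewhere)

    void setup() {
      pinMode(THROTTLE_LIMIT_RELAY, OUTPUT);
      pinMode(CONTROLLER_RELAY, OUTPUT);
    }

    void loop() {
      float dod = ampHoursUsed / PACK_AH;
      // Clue 1: past 80% DOD the car is limited to roughly 25 mph.
      digitalWrite(THROTTLE_LIMIT_RELAY, dod >= 0.80 ? HIGH : LOW);
      // Clue 2: past 90% DOD the controller is disabled entirely.
      digitalWrite(CONTROLLER_RELAY, dod >= 0.90 ? HIGH : LOW);
      delay(250);
    }
    ```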

  15. Anton and Jack,

    Thanks for the feedback. I may be giving the wrong impression of the strategy I propose. I’d like a low cell alarm which ONLY turns on when any cell voltage is abnormally low. At 200 amps load (max throttle for me) in moderate temps I do NOT expect voltage to go below 2.5 volts in normal driving (<75% DOD).

    The alarm should never turn on while driving. No throttle calibrating feedback that Anton references UNLESS a cell is nearly dead. That is more nuance than the average Joe cares about.

    For other drivers of the car, I’d call it the low battery alarm. They know the car is powered by batteries and that getting the alarm is bad (not normal), and they must charge the car as soon as possible if they get the alarm.

    Anyone who has a 100 Ahr Skyenergy cell could help determine if cell voltage drops below 2.5v under a 200 amp load in moderate temps. Help me out please!!

    I completely agree that if it does drop below about 2.5v at 200 amps then my strategy is junked. Then Anton’s strategy may be the closest thing to what I’m proposing. I would not be satisfied with that due to the likelihood of becoming desensitized to the alarm. As Jack implied, I want something to reduce the probability of damage to the batteries by everyone who drives it.

    One problem with having a disabling strategy before a warning light is that the car may become unnecessarily disabled just when full power is needed.

    I see a new episode is available online, I can’t wait to view it! Please, keep up the great work Jack!

    Respectfully,

    Joe Lyon

  16. Thanks Joe. Yes, unfortunately this is the thing I do not appear to be communicating very well.

    Your cell voltage at 200 amps. That will kind of depend on a confusing array of things.

    1. What type of cell.
    2. What temperature is the cell.
    3. Where in the discharge curve NOW is the cell at.

    In answer to your question, yes, it can fall below 2.5 v at 200 amps. And it can do that with everything quite “normal” as far as normal goes.
    It can also do it when a cell is failing.

    Which brings me to the question, monitor for WHAT and WHAT are you going to do with the information.

    If it goes off too often at 2.5v, you will learn to ignore it. But then why have it?

    Bottom line is we have to develop circuits to do coulomb counting of AH in and out at a minimum. Ideally, that should be factored in with about a half dozen other things, but at a minimum AH in/out.

    So we’ve got a fuel gage and a BMS all mashed up into one confusing mess.

    I’ll have to work on this some more….

    Jack Rickard

  17. Hey, they’re YOUR batteries and your investment. I’m not against the concept of a monitoring system. I haven’t found one that does anything useful for ME.

    We’ve tried to show what happens to a battery under load, but it gets a little difficult on the bench to do 500 amps without burning something up. Even at 200 amps I’ve melted cables, had one shunt in a plastic enclosure melt the whole BOX, etc. Yet, in the car it is easy to do that high a current.

    The closest thing I’ve found to a useful battery monitoring device was this CELL LOG 8S from http://www.hobbyking.com. The thing is $29 and it actually logs data you can view later on a laptop. Each unit will monitor 8 cells. If you want an alarm, I would suggest using the DIFFERENCE alarm feature of this unit. It will go off when any cell’s voltage is DIFFERENT from the other 7 by a set amount. Set it at two tenths of a volt. They shouldn’t vary by more than that, ever, unless there is a problem with that cell.

    In this way, you don’t care what the absolute voltage is, but if the relative voltage varies by more than 0.2v across the cells under ANY load, you probably have a problem. (A small sketch of that comparison appears at the end of this comment.)

    Jack Rickard
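
    As a sketch, here is what that relative-voltage check amounts to. The eight readings are invented for illustration; in practice they would come from whatever device is measuring the cells.

    ```cpp
    // Sketch of the "difference alarm": ignore absolute voltage and flag any cell
    // that strays more than 0.2 V from the others. The readings are invented.
    #include <algorithm>
    #include <cstdio>
    #include <vector>

    int main() {
        const double DIFF_LIMIT = 0.2;  // allowed cell-to-cell spread in volts
        std::vector<double> cells = {3.21, 3.19, 3.20, 3.22, 3.18, 3.20, 2.95, 3.21};

        auto [lowest, highest] = std::minmax_element(cells.begin(), cells.end());
        double spread = *highest - *lowest;

        std::printf("cell-to-cell spread: %.2f V\n", spread);
        if (spread > DIFF_LIMIT) {
            std::printf("ALARM: cell %ld is %.2f V below the highest cell\n",
                        (long)(lowest - cells.begin() + 1), spread);
        }
        return 0;
    }
    ```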

  18. Jack,
    Thank you for all the good information here. I have been doing quite a bit of research on BMS systems as I am getting ready to switch my car from lead to LFP. I am curious: with these systems, do people usually monitor all cells in the pack or just selected ones? Thanks

  19. Joe said….

    Jack,

    Strange you’d say:
    “That will kind of depend on a confusing array of things.
    1. What type of cell.
    2. What temperature is the cell.
    3. Where in the discharge curve NOW is the cell at.”

    I spelled all that out. I’m talking about the same 100 Ahr Skyenergy cell you’ve been talking about and testing. Moderate temperatures (50-70F), < 75% discharged and 200 Amps. I’d hope there’d be no confusion about the 3 basic conditions I’ve laid out.

    I thought you might have a 200 amp data point similar to your graphs above to enable a good estimate. I’m not fixated on 2.5v; I ballparked that number based on an extrapolation of your data.

    I didn’t know you’ve been unable to test current that high. I’ll see if I can come up with a test apparatus, buy a battery and try it myself.

    Your Fan,

    Joe Lyon

  20. Jack,
    I’ve been working on my own buck charger for a few months now and have had noise problems on the voltage measurements. I’ve solved this by turning off the PWM for a few milliseconds to take a reading, then returning the PWM to its old value. What this means, in a sense, is that I’m reading the pack voltage at zero charge current. Would you say that that’s a good way to determine the cutoff point? I have found in my own experiments that, as you have demonstrated, pack voltage is determined by charger current. I’ve also noted just today that charging to a slightly lower voltage allows the batteries headroom for tolerances.
    Damien

  21. Hi

    Maybe this is of interest: a RAV driver claims on her blog that “I don’t hesitate to drive the RAV all the way down to the bottom if we need to, because we know precisely how far we can go given we have RAV Info, an after market Palm Pilot program that taps into the RAV’s main CPU where all the car’s info is. We can see to a tenth of a percent how much SOC we have.”

    Does anyone know how this works?

    Cheers

    Nick

  22. Ok Jack,
    Finally, I think the best way for monitoring problems seems to be to compare every single cell with the average voltage of the whole pack. I like this idea! That might be a good starting point for a reliable warning system.
    Nevertheless, this must be calibrated for every cell at the “high” level, hmmmmm……

    Franz

  23. Looking at the graphs you have there Jack, at least between 5% and 70% DOD, the decrease in voltage at rest on the TS cell is quite linear. It follows therefore that you could use something along the lines of a 10-LED bargraph display (as shown here: http://www.evconvert.com/article/led-bargraph-battery-monitor-part-2#c003260 ) calibrated to show empty at 70% DOD and full at less than 5% DOD to give a reasonably safe indication of state of charge at rest. I would have thought that using one of these per 2 cells in an array would give an immediate indication of any cell going south faster than others, as well as state of charge of the whole pack, along with general state of balance at a glance. (A sketch of that mapping follows at the end of this comment.)

    Your thoughts?

    Regards

    Paul
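
    A sketch of the linear mapping Paul describes. The two calibration voltages are placeholders that would have to be read off the actual rest-voltage graph for the cells in question, and, as Jack’s results suggest, the usable span is small, so this only works on genuinely rested cells measured accurately.

    ```cpp
    // Sketch of a 10-segment "fuel" mapping from resting voltage. The two
    // calibration voltages are placeholders to be taken from the real rest curve.
    #include <cstdio>

    int main() {
        const double V_FULL  = 3.33;  // assumed rest voltage near 5% DOD  (all 10 bars)
        const double V_EMPTY = 3.22;  // assumed rest voltage near 70% DOD (0 bars)

        double v_rest = 3.27;         // example resting reading for one cell (or pair)

        double frac = (v_rest - V_EMPTY) / (V_FULL - V_EMPTY);
        if (frac < 0.0) frac = 0.0;   // clamp outside the calibrated range
        if (frac > 1.0) frac = 1.0;
        int bars = static_cast<int>(frac * 10.0 + 0.5);

        std::printf("%.2f V at rest -> %d of 10 bars lit\n", v_rest, bars);
        return 0;
    }
    ```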

  24. Jack,

    I didn’t see anywhere that you noted any temperature change of the cell during these tests. Have you taken a cell, insulated it, and then given it a 1C discharge and monitored temperature change over time? This would be helpful info to get some idea of what type of cooling would be needed. Thanks.

    David

  25. Great video, long & boring though it may be.

    During charging, why even worry about constant current, as long as your charger isn’t capable of supplying more than the maximum charging current specified for the battery, and supplies more than 0.05C? Just set the charger to a fixed voltage, 3.5v per cell in this case, and charge until the current drops to 0.05C, then stop the charging.

    Yes, the current limit of the charger can be looked at as constant current but to charge the batteries as fast as possible the current should be set as high as possible.

    In a pack though, lower charging currents may allow better self-balancing between the cells… I don’t know. -Klaus

  26. Klaus:

    If you change the current, you change the voltage. It is important that it maintains a CONSTANT current so you can watch the voltage rise as the SOC on the pack comes up. If it is NOT constant current, then the voltage is meaningless.

    With a constant current source, you add energy up TO your switch point. At that point, you switch from a constant current source to a constant voltage and maintain that voltage precisely. As the energy level of the cell CONTINUES to rise, the amount of current necessary to hold that voltage will gradually diminish. Once it reaches some nominal low level, you discontinue the charge process – terminate. Typically this low level is 0.05C – 5 amps on a 100 Ah cell.
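
    As a sketch only, here is that CC/CV decision written as the control step a charger might run each cycle. The numbers follow the example in this thread (a 100 Ah cell, 3.5 volt switch point, 0.05C termination); the 0.3C bulk current is an assumption, and a real charger’s sensing and control would be more involved.

    ```cpp
    // Sketch of the CC/CV profile described above: hold a constant current until
    // the cell reaches the switch voltage, then hold that voltage and let the
    // current taper, terminating at 0.05C. Setpoint values are example numbers.
    #include <cstdio>

    enum class Phase { ConstantCurrent, ConstantVoltage, Done };

    Phase step(Phase phase, double measured_volts, double measured_amps,
               double& current_setpoint, double& voltage_setpoint) {
        const double CC_AMPS     = 30.0;   // assumed 0.3C bulk current on a 100 Ah cell
        const double V_SWITCH    = 3.5;    // per-cell CC -> CV switch point
        const double TERMINATE_A = 5.0;    // 0.05C on a 100 Ah cell

        switch (phase) {
        case Phase::ConstantCurrent:
            current_setpoint = CC_AMPS;          // hold current, let voltage rise
            if (measured_volts >= V_SWITCH) {
                voltage_setpoint = V_SWITCH;     // switch to holding the voltage
                return Phase::ConstantVoltage;
            }
            return Phase::ConstantCurrent;
        case Phase::ConstantVoltage:
            voltage_setpoint = V_SWITCH;         // hold voltage, let current taper
            if (measured_amps <= TERMINATE_A) {
                current_setpoint = 0.0;          // terminate the charge
                return Phase::Done;
            }
            return Phase::ConstantVoltage;
        case Phase::Done:
        default:
            current_setpoint = 0.0;
            return Phase::Done;
        }
    }

    int main() {
        // Tiny demonstration with made-up readings.
        double i_set = 0.0, v_set = 0.0;
        Phase p = Phase::ConstantCurrent;
        p = step(p, 3.35, 30.0, i_set, v_set);   // still bulk charging
        p = step(p, 3.50, 30.0, i_set, v_set);   // reached 3.5 V: go CV
        p = step(p, 3.50, 4.8,  i_set, v_set);   // current below 5 A: done
        std::printf("final phase done? %s\n", p == Phase::Done ? "yes" : "no");
        return 0;
    }
    ```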

  27. Jack:

    Thanks for helping me understand this. But I still don’t…

    I understand that if I do not maintain a constant current I will not see the voltage rise as the cell charges. But, as your experiment showed and you explained above, if you maintain a constant voltage, the current will decrease as the battery charges. An equally good indicator of SOC?

    As you said, a fully charged cell is defined as ~0.05C at, in your case, 3.5 volts. I say do a constant 3.5v charge, without exceeding the maximum charging current for the cell, and when you reach 0.05C it’s fully charged. That’s it, you are done and the cell is fully charged. SOC at any point while charging would be inversely proportional to charging current.

    In the end, the product of current and time tells you how many Ah’s were put back into the cell.

    Does that make sense or am I just _not_ getting it? -Klaus

  28. Okay, so given a 100 Ah cell it doesn’t care if I CC charge it at 300A (3C, the max charge current), 200A or 5.01A and switch to CV charging at 3.5v and stop charging when the current drops to 5A. Either way it’s fully charged, right?

    Sorry, I’m trying to get it ;o) -Klaus
