Ideal and practical lead-acid charging parameters

Forking from this thread

Observed values:

  • Boost: 14.2 +/- 0.05 V for 2 hours
  • Float: 13.8 +/- 0.05 V for the remaining 3.5 hours of sufficient illumination

From the charge controller’s manual, the default values are:

At 12 V, I believe this would yield -18 mV/°C.

The battery is a valve-regulated pure-gel 12 V 100 Ah lead-acid battery, for which the manufacturer provides this charging guidance:

[image: manufacturer’s charging guidance]

I’d be happy to hear any thoughts, critiques, or suggestions!

Your voltages are too low by a good bit, and it shows: your battery isn’t fully charged when it drops to float - ideally, the current curve should be flat once you trip down. Yours continues the exponential decay, and while it might be flattening out by 15:00, I’m not at all convinced the battery is fully charged.

Your battery is at 10°C, which means the compensated charging voltage (with the -30mV/°C slope) should be {14.1-14.4} + (15 * 0.03) = {14.55-14.85 V}, and higher when colder. At 0°C, that would be 14.85-15.15 V.
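The arithmetic above generalizes to a one-liner. This is a minimal sketch of the compensation formula as discussed in this thread, assuming the usual 25°C reference point for the manufacturer's setpoints:

```python
def compensated_voltage(base_v, temp_c, slope_v_per_c=-0.030, ref_c=25.0):
    """Shift a 25 degC charging setpoint to the actual battery temperature.

    base_v: the manufacturer's setpoint at the reference temperature
    slope_v_per_c: compensation slope (-30 mV/degC here; use your battery's spec)
    """
    return base_v + slope_v_per_c * (temp_c - ref_c)

# Manufacturer's 25 degC absorb window of 14.1-14.4 V, battery at 10 degC:
low = compensated_voltage(14.1, 10)    # 14.1 + 0.45 = 14.55 V
high = compensated_voltage(14.4, 10)   # 14.4 + 0.45 = 14.85 V
```

Note the sign convention: a negative slope raises the voltage below the reference temperature and lowers it above.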

Unfortunately, it doesn’t look like your controller will let things go that high, which is no surprise.

And it looks like the boost duration is too short too, but look at the curves after you fix the voltage.

https://syonyk.blogspot.com/2018/04/off-grid-rv-lead-acid-maintenance-charging-failure-modes.html is my epic on battery maintenance, which is heavily biased towards flooded lead acid, but applies to other types as well.

In there, you can see a current curve for my system, as I’ve set it up:

In the winter, I’m over 60V in absorb on a cold bank, and I think I’ve seen 62V (so 15.5V on a 12V bank).

I read that post a long time back, but I guess my retention wasn’t so good… :slight_smile:

Just to make sure I’m following: it sounds like I need to troubleshoot why the voltage isn’t higher, given that the controller is supposed to be temperature compensating. Once I’ve done that, I should extend the boost duration as far as I can for the winter. For spring/summer I should re-evaluate and possibly limit boost charging to roughly the time it takes for the charging current to approach zero. (It looks like that was happening around 14:30 - T+5 h - over the summer, but I’ve doubled the number of panels since then.)

Roughly, yes. And maybe use some custom settings if you can.

Does the controller have a separate temperature probe? If not, you might be better off just compensating a few times a year manually. My charge controllers (inside my office) have nothing to do with the battery temperature, and the battery temperature has remarkably little to do with ambient. There’s a lot of thermal mass in my bank.

Yes, and I’ve got it taped to the side of the battery in a totally legitimate battery box.

The charge controller reports four different temperatures, two of which relate to the battery:

The “Battery Temperature” one reports a reasonable value, different from the case and heat-sink temperatures - that’s the one I graphed above. I’ll test tomorrow whether it’s reading from the probe by warming the probe with my hand and confirming the temperature approaches 37°C.

The “Remote Battery Temperature” has always read as 0°C on all 4 of the charge controllers I have in the field regardless of whether the temperature probe is connected. I’ve assumed that was an unused register in the charge controller firmware, but perhaps I have to troubleshoot that somehow.

I should be able to do better than that… I’m using an ESP8266 to capture the charge controller data, and I should be able to program it to also modify the user settings over time. Presumably, I could bully it into charging at whatever voltages I want - within the allowed ranges - for as long as I want, by setting the boost/equalize duration to zero minutes and tweaking the float voltage setting up and down at the appropriate times from the ESP8266.
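That scheme could be sketched as a tiny scheduling function: with boost duration forced to zero, the controller only ever holds the float setpoint, so the ESP8266 emulates boost/float by rewriting that one setting on a timer. This is a hypothetical sketch - the setpoint values, the timing source, and whatever register-write call your controller actually takes are all assumptions:

```python
def choose_float_setpoint(minutes_since_sunup, boost_minutes,
                          boost_v=14.7, float_v=13.8):
    """Return the voltage to write into the controller's float-voltage setting.

    Emulates a boost stage for the first boost_minutes of the solar day,
    then drops to a true float voltage. Both voltages are placeholders;
    substitute your temperature-compensated targets.
    """
    if minutes_since_sunup < boost_minutes:
        return boost_v   # emulate the boost/absorb stage
    return float_v       # hold float for the rest of the day
```

The real loop on the ESP8266 would call this every few minutes and push the result over the controller's serial/Modbus interface, clamped to the controller's allowed range.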

I’ll probably experiment a bit more to see whether I can get the charge controller to do the right thing first before I go that brute force route though since I was pretty happy with the idea of my ESP software not being in the critical path to keeping the batteries healthy… :slight_smile:

I would agree that putting an ESP in the critical path is probably a bad idea. But the custom settings fields should let you configure it to do what you want. Without end amps, it’s hard to really tell when to terminate.
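For reference, "end amps" termination just means holding absorb until the tail current falls below a fixed cutoff. A minimal sketch, assuming a cutoff around C/100 - the exact fraction is an assumption, not the manufacturer's number, so check your battery's datasheet:

```python
def absorb_finished(charge_current_a, capacity_ah=100, end_amps_fraction=0.01):
    """True once tail current at absorb voltage drops below the end-amps cutoff.

    end_amps_fraction=0.01 means C/100, i.e. 1 A on this 100 Ah bank
    (an assumed value; commonly quoted cutoffs range from C/100 to C/50).
    """
    return charge_current_a <= capacity_ah * end_amps_fraction
```

Without a current measurement at the battery, you can’t compute this, which is why a fixed boost timer is the fallback.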

One other thing you might consider - during the summer, when you do have long solar days, experiment with just running float voltage all day. I can mostly charge my batteries during the summer with an absorb cycle every few days, and it keeps them a lot cooler if they’re charging at lower voltages.
