The REC user's manual includes some detail on how the ADC (analog-to-digital converter) is able to read and report current flow from your battery system. There is a calculation there explaining the math required to convert the digital steps into DC current, but it is kinda cryptic.
The BMS needs to know how many amps correspond to each digital step of the ADC, and we enter that value into the IOJA variable.
For a 100A/50mV shunt, IOJA = 0.00390625
For a 200A/50mV shunt, IOJA = 0.0078125
For a 500A/50mV shunt, IOJA = 0.0195313
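If it helps, here is a rough Python sketch of that conversion. The 0.00390625 amps-per-step baseline for a 100A/50mV shunt comes from the list above; the assumption that IOJA simply scales with the shunt's amp rating is mine, so double-check it against the manual for your sensor:

    # Rough sketch: amps represented by one ADC step (IOJA) for a 50 mV shunt.
    # Baseline taken from the list above: 100 A / 50 mV -> 0.00390625 A per step.
    # Assumption (mine): IOJA scales linearly with the shunt's amp rating.

    AMPS_PER_STEP_100A = 0.00390625  # from the 100A/50mV value above

    def ioja_for_shunt(rated_amps):
        """Return the IOJA value (amps per ADC step) for a 50 mV shunt."""
        return AMPS_PER_STEP_100A * rated_amps / 100.0

    for rating in (100, 200, 500):
        print(rating, "A shunt -> IOJA =", ioja_for_shunt(rating))
    # prints 0.00390625, 0.0078125, 0.01953125 -- matching the values above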
Rick, thanks for this post. I purchased the "REC Current Sensor Shunt 1000A" and seem to have a slight misconfiguration of settings, as the system is pessimistic enough that it resets SOC to 100% every few charges; e.g., it thinks SOC is 30% but voltage would indicate closer to 60%.
I tried calibrating using an ammeter, and while the result is closer, it is not quite right. All loads go through the shunt.
I couldn't find specs for this shunt - is it a 50mV shunt or another value? Any information about what IOJA should be set to for this shunt would be helpful. I have a feeling my IOJA is off, or there is some setting that measures charge differently than load.
Hi David,
There are many factors that go into calculating your SoC, so we need to look at more than just the current sensor.
First, grab a DC multi-meter and compare your BMS values vs the meter. If they are close, that is all you need, as the REC will likely be more accurate.
Second, confirm your Battery Capacity CAPA value matches the value of your cell pack.
Third, make sure your pack is configured for the correct cell chemistry using the CHEM parameter
If all of these look good then you can fine tune your system to allow for your particular cell and wiring efficiencies. Each installation is wired differently, and depending on the grade and type of your cells, they will charge and discharge with different losses. For example, a lead acid battery discharges at around 60% efficiency, whereas a good LFP cell will discharge at better than 90% efficiency.
It may be that you have short runs with well sized wiring and good quality cells so just use the CHAC and DCHC coefficients to tweak the calculation to match your specific installation characteristics.
In addition, the system takes into account the number of cycles and pack temperature in its calculation. Here is an excerpt from the manual:
Battery Pack SOC Determination: SOC is determined by integrating the charge in or out of the battery pack. Different Li-ion chemistries may be selected (Table 7: Li-ion chemistry designators):

Number  Type
1  Li-Po Kokam High power
2  Li-Po Kokam High capacity
3  Winston/Thunder-Sky/GWL LiFePO4
4  A123
5  Li-ion NMC / LiMn2O4

Temperature and power correction coefficients are taken into consideration in the SOC calculation. Li-Po chemistry algorithms have an additional voltage-to-SOC regulation loop inside the algorithm. Actual cell capacity is recalculated from the number of charging cycles, as pointed out in the manufacturer's datasheet. When the BMS is connected to the battery pack for the first time, SOC is set to 50%. SOC is reset to 100% at the end of charging. A charging cycle is added if the coulomb counter has reached the battery pack's capacity.
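To make the coulomb-counting idea concrete, here is a bare-bones Python sketch. It is not REC's actual algorithm (it ignores the temperature, current and cycle corrections the manual mentions) and the names are mine; it only illustrates the integrate-and-reset behaviour described above:

    # Bare-bones coulomb counter: integrate current in/out of the pack and
    # reset to 100% when charging finishes. Not the REC algorithm -- just
    # the basic idea from the manual excerpt above.

    class CoulombCounter:
        def __init__(self, capacity_ah, soc_percent=50.0):
            # The manual says SOC starts at 50% on first connection.
            self.capacity_ah = capacity_ah
            self.soc = soc_percent

        def update(self, current_a, dt_hours):
            """current_a > 0 while charging, < 0 under load."""
            self.soc += 100.0 * (current_a * dt_hours) / self.capacity_ah
            self.soc = max(0.0, min(100.0, self.soc))

        def end_of_charge(self):
            # SOC is reset to 100% at the end of charging.
            self.soc = 100.0

    pack = CoulombCounter(capacity_ah=280)    # e.g. a 280 Ah pack
    pack.update(current_a=50, dt_hours=1.0)   # one hour of 50 A charge
    print(round(pack.soc, 1))                 # 67.9 (started at 50%)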
Rick, thanks so much for your reply and suggestions.
First, grab a DC multi-meter and compare your BMS values vs the meter.
OK, the values are close as far as voltage goes. Current moves around a lot, but it does seem off when running high power (e.g., lots of solar charge).
Second, confirm your Battery Capacity CAPA value matches the value of your cell pack.
Yes - matches
Third, make sure your pack is configured for the correct cell chemistry using the CHEM parameter
Yes, set to the LiFePO4 setting. I'm limiting charge voltage to ~90% of the manufacturer-suggested voltage. Using EVE cells, which max at 3.65V, but I'm stopping charge at 3.5V.
If all of these look good then you can fine tune you system to allow for your particular cell and wiring efficiencies.
I do wonder about these settings! I'm using 0.6 as the charging coefficient and 1.0 as the discharge coefficient (the default). What exactly do these values do when I change them, and how do they figure into the SOC calculation or the data sent to the Victron Cerbo? If this means the charging current is discounted compared to loads, that would explain things.
Additionally, is it possible to confirm what the REC-shipped 1000A shunt specs are so I can sanity check the current sensor coefficient I\'m using?
Thank you again for your help.
By running under the manufacturer's rating you are potentially creating some issues. The default top balancing on a REC BMS starts at 3.5V, so you will never start to top balance unless you have changed those parameters as well. A while back there was a lot of chatter about not fully topping off lithium batteries, but as I recall that was for LiPo cells in phones and electronics. I have run LFP cells at the manufacturer's suggested capacities for more than 5 years without issue or significant degradation. It all comes down to buying quality cells, and CALB is at the top of my list. You will likely need to adjust your balancing parameters and experiment, or go back to the manufacturer's suggested settings.
For your charge coefficient, I think we found your problem. I would set them to 0.87 (charge) and 0.91 (discharge) and tweak from there if needed. These values basically say that when 1 amp is going through your shunt while charging, 0.87 amps will actually get stored as usable energy; and under load, after taking line and cell energy-conversion losses into account, when 0.91 amps is flowing through your shunt, 1 amp of energy is taken from your cells.
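As a toy illustration of how I read those two coefficients (my reading of the manual, not something REC has confirmed):

    # Toy illustration of my reading of CHAC / DCHC (not confirmed by REC):
    # CHAC = 0.87 -> of 1 A measured at the shunt while charging, 0.87 A is
    #                counted as stored usable energy.
    # DCHC = 0.91 -> 0.91 A measured at the shunt under load is counted as
    #                1 A drawn from the cells.

    CHAC = 0.87
    DCHC = 0.91

    def stored_amps(shunt_amps_charging):
        return shunt_amps_charging * CHAC

    def drawn_amps(shunt_amps_discharging):
        return shunt_amps_discharging / DCHC

    print(stored_amps(100))          # 87.0 A credited while 100 A flows in
    print(round(drawn_amps(91), 1))  # 100.0 A debited while 91 A flows out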
Hope this helps!
A 1000A/50mV shunt will deliver a voltage drop across its two terminals of 0.05mV per amp of current.
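Put another way, as a quick Ohm's-law check (this is just that calculation spelled out):

    # Quick Ohm's-law check for a 1000 A / 50 mV shunt.
    rated_amps = 1000.0
    rated_drop_mv = 50.0

    resistance_ohms = (rated_drop_mv / 1000.0) / rated_amps  # 0.00005 ohm (50 micro-ohm)
    mv_per_amp = rated_drop_mv / rated_amps                  # 0.05 mV per amp

    print(resistance_ohms, mv_per_amp)  # 5e-05, 0.05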
Thank you Rick, very helpful. I'll adjust that charge coefficient parameter now; 60% charge efficiency is definitely too low for a LiFePO4 system.
I did adjust the top balancing settings to ensure there was room for the cells to balance as they near the top of charge. This seems to work OK -- cells are staying in sync, and I don't really have other issues to report beyond the SOC measurement issue. Fingers crossed, you helped identify the problem and gave me some focus to fine tune settings.
David,
I may be mistaken on how the SoC works with regard to CHAC and DCHC, and am requesting clarification from head office.
When I hear back I will update this post.
Here is the official clarification from head office:
There are some good points here.
CHAC and DCHC are only used to calculate the maximum charge and discharge current - see the charge and discharge algorithm in the manual. There are temperature and current coefficients used to calculate the SOC. Simple explanation: at higher currents the cells do not provide their full capacity - the higher the current, the lower the capacity they can deliver. This is due to internal resistance losses. Also, at lower temperatures the ions are slower and internal resistance is higher, so the cells are not able to provide their whole capacity.
If the end-of-charge voltage and MIN Vcell are set differently/more conservatively than in the cells' datasheet (like 4.2 V end of charge and 3.0 V MIN Vcell for Li-ion), then the capacity entered should be the usable capacity, not the nominal capacity. Say they enter 3.3 V MIN and 4.1 V end of charge; this means about 92-95% of nominal capacity. This should be worked out by looking at the cell's datasheet and the Capacity/SOC vs. voltage graphs.
Also, CLOW should be set above the Min Vcell (CMIN), at about 3-5% of usable capacity, and the balance start voltage (BMIN) at about 90% of the usable capacity, so that the reset of the SOC to 3% or 95% works properly. The SOC is corrected not only at 100% (end of charge) but also at BMIN and CLOW, so it will not track correctly if these parameters are not set properly.
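To put rough numbers on the usable-capacity part (the 280 Ah nominal figure and 92% usable fraction below are only an example; take the real values from your cells' datasheet and the SOC-vs-voltage graphs):

    # Rough illustration of the usable-capacity point above. The 280 Ah
    # nominal figure and 92% usable fraction are examples only -- read the
    # real numbers off your cells' datasheet.

    nominal_ah = 280.0
    usable_fraction = 0.92                     # e.g. charging 3.3 V to 4.1 V instead of the full range
    usable_ah = nominal_ah * usable_fraction   # this is the capacity to enter

    # Per the clarification above, CLOW should sit at roughly 3-5% of usable
    # capacity and BMIN at roughly 90%, so the 3% / 95% SOC resets land in
    # the right place.
    print(usable_ah)            # 257.6 Ah entered as capacity
    print(usable_ah * 0.04)     # ~10.3 Ah remaining around the CLOW point
    print(usable_ah * 0.90)     # ~231.8 Ah stored around the BMIN point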
@Richard the current reading on my REC-BMS is about 1/3 of the value it should be when compared to a high-quality Fluke clamp meter. I am using these coefficients that you provided a while back for my 1000 amp and 2000 amp Victron smart shunts:
"a 1,000/50 shunt would be 0.0195313 x 500 / 1000 = 0.00976565 and 2,000/50 would be 0.0195313 x 500 / 2000 = 004882825"