What is battery power monitoring technology?
Meaning: Battery power monitoring is a technology used to predict battery capacity under all system operating and idle conditions.
Battery capacity can be reported as:
- Percentage (state of charge)
- Remaining run time (time until the battery is depleted or fully charged)
- Milliampere-hours (mAh)
- Watt-hours (Wh)
- Call time, standby time, etc.
Additional data can be obtained to reflect battery health and for safety diagnostics.
1. Health status
2. Full charge capacity
Battery power monitoring technology is primarily used to report battery capacity, and it can also generally provide information on battery health status and full charge capacity.
Overview
- Basic knowledge of battery chemistry
- Traditional battery capacity monitoring methods
  - Voltage-based
  - Coulomb counting
- Impedance tracking technology and its advantages
Part 1: Basic Knowledge of Battery Chemistry
First, I'll introduce some battery chemistry fundamentals that are relevant to battery capacity measurement.
By varying the shutdown voltage according to discharge rate, temperature, and aging, the battery can deliver the longest possible operating time. From these curves we can first see that at room temperature and low current, the voltage stays comparatively flat and only drops rapidly near the end of discharge. Although the system can tolerate some minimum supply voltage, once the battery voltage falls to that level it collapses quickly. To prevent data loss or a sudden interruption of the load circuit from an unexpected shutdown, customer applications therefore typically fix a voltage reference point at which the battery is reported as empty.

If that same fixed reference point is used at low temperature, at high current, or with a heavily aged battery, the usable capacity is greatly reduced: as the curves show, under high current the discharge curve sits near the cutoff voltage almost from the start, and the same is true for aged cells or low temperatures. So if a fixed voltage is used as the zero-capacity reference, the reported capacity shrinks under low-temperature, high-current, or aged conditions. To prevent this, the zero-capacity point must be adjusted according to temperature, discharge rate, and battery aging.
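The adjustment described above can be sketched as a simple termination-voltage compensation. This is only an illustration, not an actual gauge algorithm: the function name, the fixed OCV floor `ocv_floor`, and the crude temperature and aging scaling factors are all my assumptions; real gauges use measured impedance tables instead.

```python
def compensated_shutdown_voltage(ocv_floor, current_a, r_new_ohm,
                                 temp_c, cycles):
    """Terminal-voltage threshold at which to report zero capacity.

    Sketch: the zero-capacity point is defined on the open-circuit
    voltage (ocv_floor), so the terminal-voltage threshold is lowered
    by the I*R drop, with R adjusted for temperature and aging.
    The scaling factors below are rough assumptions for illustration.
    """
    r = r_new_ohm
    if temp_c < 0:
        r *= 2.0                      # assumed: resistance roughly doubles below 0 degC
    r *= 2.0 ** (cycles / 100.0)      # empirical: impedance ~doubles per 100 cycles
    return ocv_floor - current_a * r  # larger IR drop -> lower terminal threshold
```

With no load the threshold is just the OCV floor; under heavy load, low temperature, or heavy cycling the threshold moves down, so the gauge does not report zero while usable charge remains.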
The high-current discharge capability of lithium-ion cells can be tuned over a wide range by using thicker or thinner layers of active material. A thinner active-material layer gives the cell higher high-current discharge capability but lower energy density. The standard 18650 cylindrical cells used in laptops are optimized for capacity rather than discharge rate. However, some cells are rated for 10C discharge (for portable power tools), and some even reach a 60C discharge rate (for backup power and regenerative braking in hybrid electric vehicles).
High-current discharge capability degrades severely at temperatures below 0°C because of the low conductivity of organic electrolytes. Since electrolyte conductivity varies between cell designs, it is important to consult the manufacturer's data for low-temperature discharge behavior.
Battery chemical capacity Qmax
In battery power monitoring technology, one important concept is the battery's chemical capacity, Qmax.
In this graph, the value corresponding to the intersection of the red curve and the horizontal axis is Qmax.
This curve was measured with a load current of [value missing]. To measure Qmax, the load current must be sufficiently small: in theory, Qmax is the capacity that can be released as the current approaches zero, but in engineering practice a very small current is used instead; here we used a [value missing] current. Why is that?
In battery power management, "C" refers to the battery's discharge rate. For example, if a battery has a capacity of 2200 mAh, a discharge current of 2200 mA is 1C. Conceptually, 1C is the current required to completely discharge the battery in one hour. Thus a 2200 mAh battery corresponds to a 1C discharge current of 2200 mA, and a 1100 mAh battery corresponds to a 1C discharge current of 1100 mA.
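The C-rate arithmetic above is just a unit conversion; a minimal sketch (the function names are mine):

```python
def c_rate_current_ma(capacity_mah, c_rate):
    """Discharge current in mA for a given C rate.
    1C is the current that empties the cell in exactly one hour."""
    return capacity_mah * c_rate

def hours_to_empty(c_rate):
    """At a constant C rate, the time to full discharge in hours."""
    return 1.0 / c_rate
```

For example, `c_rate_current_ma(2200, 1)` gives 2200 mA, matching the 2200 mAh cell in the text.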
In portable applications, a crucial question is how long the battery can power the device. This is determined by the amount of active material, its specific capacity, and its voltage characteristics. As a battery discharges, its voltage gradually decreases until it reaches the minimum voltage acceptable to the device, known as the end-of-discharge voltage (EDV); discharging below it can damage the battery. By integrating the charge transferred during discharge, we can measure the capacity Qmax that can be released before reaching the EDV. The voltage curve of a lithium-ion battery discharged at a low rate is illustrated above.
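Measuring capacity by integrating charge down to the EDV can be sketched as below. The one-second sampling interval and the synthetic data in the test are assumptions; a real gauge integrates the sensed current continuously in hardware.

```python
def capacity_to_edv_mah(samples, edv_v, dt_s=1.0):
    """Integrate discharge current until the cell voltage reaches the EDV.

    samples: iterable of (current_ma, voltage_v) pairs taken every dt_s
    seconds during a discharge.  Returns the accumulated charge in mAh
    up to (not including) the first sample at or below the EDV.
    """
    q_mah = 0.0
    for current_ma, voltage_v in samples:
        if voltage_v <= edv_v:
            break                           # EDV reached: stop integrating
        q_mah += current_ma * dt_s / 3600.0  # mA*s -> mAh
    return q_mah
```

For instance, a constant 100 mA drain sampled once per second for two hours before the voltage crosses a 3.0 V EDV accumulates 200 mAh.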
Available capacity Quse
Another related quantity is the usable capacity, Quse. The chemical capacity just discussed is measured at very low current, so it is largely determined by the battery's intrinsic characteristics; in actual use, however, not all of it can be discharged. Because of the discharge current, the discharge curve sits below the open-circuit voltage curve: due to the battery's internal resistance, the actual discharge curve is the blue one, and the capacity at which it reaches the EDV is Quse, the battery's usable capacity. The internal resistance shifts the curve downwards, so the end-of-discharge voltage (EDV) is reached earlier, and Quse is therefore generally less than Qmax.

The curve also shows that the larger the current, the smaller Quse becomes. Here I*Rbat is the drop in battery terminal voltage caused by the internal resistance.
Battery resistance
The internal resistance of a battery has a significant impact on battery voltage monitoring. A basic formula can be used to express the effect of internal resistance on battery charge monitoring: V = Vocv - I * Rbat
In this formula, Vocv refers to the open-circuit voltage of the battery, I refers to the charging and discharging current, Rbat refers to the internal resistance of the battery, and V refers to the terminal voltage of the battery.
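The formula V = Vocv - I * Rbat also lets us estimate Quse numerically: step through depth of discharge and find where the terminal voltage first hits the EDV. A sketch under stated assumptions; the linear OCV curve used in the test is a toy stand-in, since real OCV curves come from cell characterization data.

```python
def terminal_voltage(vocv_v, current_a, rbat_ohm):
    """V = Vocv - I * Rbat (discharge current taken as positive)."""
    return vocv_v - current_a * rbat_ohm

def usable_capacity_mah(ocv_of_dod, qmax_mah, current_a, rbat_ohm, edv_v,
                        steps=1000):
    """Quse: capacity released before the terminal voltage reaches EDV.

    ocv_of_dod maps depth of discharge in [0, 1] to open-circuit voltage.
    """
    for k in range(steps + 1):
        dod = k / steps
        if terminal_voltage(ocv_of_dod(dod), current_a, rbat_ohm) <= edv_v:
            return dod * qmax_mah   # EDV reached here: this much was usable
    return qmax_mah                  # EDV never reached: Quse equals Qmax
```

With Rbat = 0 the sweep returns Qmax; with resistance, the curve shifts down by I*Rbat and the EDV is reached earlier, so Quse < Qmax, exactly as the figure shows.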
Battery impedance is affected by many factors, including ambient temperature, capacity percentage, and battery aging; it is a very complex function of these variables. Obtaining a closed-form expression for this function is very difficult, so impedance is usually obtained by actual measurement and stored in a lookup table. The internal resistance of a battery typically doubles after 100 charge-discharge cycles; this is a useful empirical rule of thumb. The deviation between batteries in the same batch, if well controlled, can be kept to around 10-15%, but the deviation in internal resistance between batteries from different manufacturers is often larger. Internal resistance is therefore a variable that is both very difficult to control in production and very important to account for.
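Because no closed-form expression exists, a gauge typically stores Rbat on a grid over temperature and DOD and interpolates between grid points. A minimal bilinear-interpolation sketch; the table values are entirely made up, chosen only to follow the trends described in the text (resistance rises at low temperature and high DOD):

```python
TEMPS_C = [-20.0, 0.0, 25.0]          # grid temperatures in degC (illustrative)
DODS = [0.0, 0.5, 0.9]                # grid depths of discharge
R_TABLE = [                           # ohms; rows follow TEMPS_C, columns DODS
    [0.60, 0.80, 1.20],               # -20 degC: highest resistance
    [0.25, 0.35, 0.55],               #   0 degC
    [0.10, 0.14, 0.22],               #  25 degC: lowest resistance
]

def _bracket(grid, x):
    """Clamp x to the grid; return (lower index, interpolation fraction)."""
    if x <= grid[0]:
        return 0, 0.0
    if x >= grid[-1]:
        return len(grid) - 2, 1.0
    for i in range(len(grid) - 1):
        if x <= grid[i + 1]:
            return i, (x - grid[i]) / (grid[i + 1] - grid[i])

def rbat_ohm(temp_c, dod):
    """Bilinear interpolation of the Rbat(temperature, DOD) table."""
    i, tf = _bracket(TEMPS_C, temp_c)
    j, df = _bracket(DODS, dod)
    lo = R_TABLE[i][j] + (R_TABLE[i][j + 1] - R_TABLE[i][j]) * df
    hi = R_TABLE[i + 1][j] + (R_TABLE[i + 1][j + 1] - R_TABLE[i + 1][j]) * df
    return lo + (hi - lo) * tf
```

Production gauges refine such tables at runtime from measured voltage and current, but the lookup-plus-interpolation structure is the basic idea.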
State of charge (SOC)
We just mentioned SOC, which refers to the capacity percentage: the percentage displayed in the corner of the screen on a mobile phone or tablet. It indicates how much charge the battery holds in a given state relative to the fully discharged state. SOC stands for State of Charge, so it can also be translated directly as charge state, since Charge here means electrical charge. Obviously, for a fully charged battery the state of charge equals 1, and for a completely discharged battery it equals 0. The formula is SOC = Q / Qmax, where Q is the remaining capacity in the given state and Qmax is the battery's chemical capacity.
A concept complementary to the capacity percentage is DOD, which stands for Depth of Discharge. When the capacity percentage is 1 (fully charged), the depth of discharge is 0; conversely, when the capacity percentage is 0, the depth of discharge is 1.

We encounter DOD in many TI documents. DOD and SOC describe the same state from opposite directions: SOC is how much charge remains in the battery, while DOD is how much charge has been discharged from a fully charged state, so DOD = 1 - SOC.
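The SOC and DOD definitions above reduce to two one-line functions (a sketch; the names are mine):

```python
def soc(q_remaining_mah, qmax_mah):
    """State of charge: remaining capacity as a fraction of Qmax."""
    return q_remaining_mah / qmax_mah

def dod(soc_value):
    """Depth of discharge: the complement of SOC."""
    return 1.0 - soc_value
```

So a 2200 mAh cell with 550 mAh remaining is at SOC 0.25 and DOD 0.75.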
Resistance depends on temperature and DOD
Battery impedance is significantly affected by temperature and by the capacity percentage, equivalently the depth of discharge (DOD). This curve shows the basic trends. The vertical axis is the battery's internal resistance in ohms, the horizontal axis is the depth of discharge, and the different colored curves are data measured at different temperatures. At a given temperature, the deeper the discharge, the greater the internal resistance; and at the same DOD, the lower the temperature, the higher the internal resistance. This is fundamental battery behavior that everyone should grasp.
Impedance and capacity change with aging
Besides temperature and capacity percentage, another significant factor affecting internal resistance is the battery's age, i.e., its cycle count. Generally, after 100 charge-discharge cycles a battery's chemical capacity decreases by only 3-5%. That capacity loss is modest, but the impedance change is pronounced: after 100 cycles the impedance can nearly double. You can see this in these two graphs. The left graph overlays the discharge curves for the 1st and 100th cycles: cycling barely reduces the capacity, but it greatly increases the internal resistance. The right graph shows battery internal resistance versus measurement frequency for different cycle counts: the horizontal axis is the frequency used to measure the impedance, and the vertical axis is the measured internal resistance.

At very low frequencies, the bottom curve is the impedance of the fresh cell and the top curve is the impedance after 100 cycles; their intercepts on the vertical axis differ by roughly a factor of two, showing that after 100 cycles the DC internal resistance doubles. The change with cycle count is large at very low frequency, but as the frequency increases, for example to a 1 kHz test load, the change becomes negligible: many of the curves converge to a single point. So which impedance actually matters most for battery capacity monitoring?
The impedance that matters is measured at relatively low frequency or under DC conditions. We should therefore look at the intersections of the curves with the vertical axis in the right-hand graph; from those intercepts we can see the effect of cycle count on the battery's DC internal resistance.
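The empirical aging rules quoted above (DC impedance roughly doubling, and capacity fading 3-5%, per 100 cycles) can be turned into a crude aging model. This is only an extrapolation of those rules of thumb, not a physical model; the 4% fade default is my assumption within the quoted 3-5% range.

```python
def aged_resistance_ohm(r_new_ohm, cycles):
    """Extrapolate the 'doubles every 100 cycles' rule for DC resistance."""
    return r_new_ohm * 2.0 ** (cycles / 100.0)

def aged_capacity_mah(qmax_new_mah, cycles, fade_per_100_cycles=0.04):
    """Extrapolate 3-5% capacity fade per 100 cycles (4% assumed here)."""
    return qmax_new_mah * (1.0 - fade_per_100_cycles) ** (cycles / 100.0)
```

After 100 cycles a 0.1-ohm cell reaches about 0.2 ohms while a 2000 mAh cell still holds about 1920 mAh, reflecting the asymmetry the text describes: impedance ages much faster than capacity.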
Impedance differences among new batteries
This diagram illustrates the impedance differences among new batteries. A battery is manufactured by stacking or winding layers together, so seen from the positive and negative terminals it exhibits capacitive, resistive, and inductive characteristics, and its measured impedance can be divided into real and imaginary parts. In this diagram an alternating load is used to measure the battery's impedance, with the load-current frequency swept from 1 kHz down to 1 mHz. 1 kHz means 1000 cycles per second; 1 mHz (one millihertz) means one cycle every 1000 seconds. At such a slow rate of change, we are effectively measuring a DC impedance.
In these two graphs, the DC (real) part of the impedance increases monotonically as the frequency decreases, while the AC (imaginary) part changes direction: initially small, then increasing, then decreasing, and finally increasing again. This is due to the combined effects of capacitance and inductance inside the battery. For battery capacity monitoring, we care about the near-DC impedance at 1 mHz. From this graph, even at 1 mHz the impedance deviation between new batteries is still about 15%. Under conditions such as a 1C discharge current, a 40 mV difference between the battery's terminal voltage and its open-circuit voltage, and low temperature, this 15% impedance deviation causes a capacity error of about 26% if the algorithm determines capacity from voltage.
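The mechanism behind that last number is worth spelling out: a voltage-based gauge reads SOC off the OCV curve, so it divides any voltage error by the local OCV slope, and lithium-ion OCV curves are very flat over much of their range. A sketch of that relationship; the 0.154 V-per-unit-SOC slope in the test is my assumption, chosen only so a 40 mV error reproduces the roughly 26% figure from the text.

```python
def voltage_based_soc_error(voltage_error_v, ocv_slope_v_per_soc):
    """SOC error (fraction of full scale) caused by a terminal-voltage
    error when SOC is read off the OCV curve.  The flatter the OCV
    curve (smaller slope), the larger the resulting SOC error."""
    return voltage_error_v / ocv_slope_v_per_soc
```

Doubling the slope halves the error, which is why voltage-based gauging works tolerably on chemistries with steep discharge curves and poorly on flat ones.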