Contributed Commentary by Saurabh Saxena and Michael Pecht
June 16, 2020 | Lithium-ion batteries continue to expand into new applications due to their high energy density, low maintenance, and decreasing cost. The lithium-ion battery market is expected to register a compound annual growth rate (CAGR) of approximately 22% over the forecast period (2019-2024). However, the technological growth of these batteries has lagged behind that of conventional electronic systems, making batteries the Achilles’ heel of electronic system and product reliability. These batteries degrade gradually through various degradation mechanisms and can also fail catastrophically. As more and more battery-powered products enter the market, the challenges multiply for battery manufacturers, suppliers, and product manufacturers. A survey of professionals from a broad spectrum of industry segments, including battery cell producers, battery pack and component developers, academic and national labs, and companies involved in transportation, consumer electronics, and energy storage, revealed time-to-market as respondents’ biggest concern, followed by battery reliability.
Industry conducts life testing of Li-ion batteries by assessing their capacity and power fade over time for the targeted applications. However, testing at normal operating conditions is quite time consuming and can take half a year or more. Hence, there is a need to develop accelerated test designs and to explore stress factors that can be used to reduce test time. We at the Center for Advanced Life Cycle Engineering (CALCE) at the University of Maryland have been working extensively with industry to find solutions to these critical issues. Batteries undergo a variety of life cycle conditions, broadly classified into storage and cycling operations. These operations have associated stress parameters such as temperature, charge current, discharge current, depth of discharge, and rest time. While much past work has examined the effects of these parameters, many discrepancies remain due to the wide variety of battery materials and designs used in industry. Hence, we have developed a machine learning-based methodology to quickly identify the most accelerating stress factors using one-time testing, which can then be used in accelerated test planning for any battery chemistry/design. We are also examining combinations of the temperature, discharge current, and rest-time-after-charge stress factors to develop cycle test profiles that can reduce testing time.
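As one way to organize such life cycle conditions, a cycling condition and its stress parameters can be captured in a small record type; the field names and units below are illustrative assumptions, not CALCE's actual test schema:

```python
from dataclasses import dataclass

@dataclass
class CycleProfile:
    """One charge-discharge cycling condition and its stress parameters.

    Field names and units are illustrative, not a published schema.
    """
    temperature_c: float        # ambient temperature (degrees C)
    charge_c_rate: float        # CC charge current, in multiples of capacity
    discharge_c_rate: float     # CC discharge current (C-rate)
    depth_of_discharge: float   # fraction of capacity cycled, (0, 1]
    rest_after_charge_h: float  # open rest inserted after full charge (hours)

# A hypothetical accelerated condition combining elevated temperature,
# a moderate discharge rate, and a long rest after charge.
accelerated = CycleProfile(temperature_c=45.0, charge_c_rate=0.5,
                           discharge_c_rate=1.0, depth_of_discharge=1.0,
                           rest_after_charge_h=24.0)
print(accelerated.temperature_c)
```

A structure like this makes it straightforward to enumerate candidate test conditions when screening stress factors.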
Stress Factor Ranking
Our methodology involves conducting a one-time screening design of experiments (DOE) comprising multiple stress factors relevant to continuous-cycle (constant current constant voltage charge—constant current discharge) testing of batteries. This includes selecting stress-factor ranges that can be represented by just two levels, reducing the number of tests using half-fractional designs, and applying machine learning techniques such as the least absolute shrinkage and selection operator (LASSO) and random forest (RF) to rank those factors. Figure 1 shows a bar plot of the probabilities that the effects of individual stress factors and their two-way interactions on battery capacity fade are zero. A two-way interaction effect refers to the combined effect of two stress factors when their joint effect differs significantly from the sum of their individual effects. A low probability means that the corresponding stress factor is unlikely to have zero influence on the capacity fade, indicating that it is highly significant for acceleration purposes. We found that temperature and its interaction with the charge cut-off current during the constant voltage charging phase were the two most significant factors for accelerating the capacity fade of the tested Li-ion battery.
Accelerating Stress Factors
While a battery’s rest operation is quite relevant in consumer electronics and electric vehicles, where batteries may spend considerable amounts of time without being used, we find that it has not been given due attention in the literature, especially the rest time introduced after full charge during charge-discharge cycling. When a battery-powered device is plugged into the charger, the battery undergoes some combination of constant current (CC) and constant voltage (CV) charge steps, which can vary from one device manufacturer to another. In a conventional CCCV profile, battery charging is terminated during the CV phase after the charge current falls below a threshold value such as C/20. The charge termination extends battery life and ensures safe operation. If the device remains plugged in for long durations, such as overnight, the battery spends many hours in the open rest condition. During open rest, the battery voltage drops, and once the drop exceeds a predefined threshold such as 200 mV[1], the charger resumes the CV charge operation. We have studied the effects of open rest time after the fully charged condition (constant current constant voltage charge—rest—constant current discharge) on battery capacity fade under different ambient temperatures and investigated whether this factor has the potential to accelerate battery testing. We found that increasing the rest time could reduce the time to reach an end-of-life threshold, such as 80% of the initial capacity, at elevated temperatures such as 45 °C (Figure 2) or beyond. Batteries with a rest time of 24 hr after full charge reached 80% of their initial capacities in just 40 days and had ~10% more capacity fade than batteries with 10 min of rest time at a fixed discharge C-rate of 0.5C (Figure 2).
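The plugged-in rest-and-resume behavior described above can be sketched as a toy simulation; the exponential relaxation model, time constant, and voltage values below are illustrative assumptions, not measured battery characteristics:

```python
import math

# Charger logic sketch: after CV charge terminates, the open-circuit voltage
# relaxes; once the drop from the charge voltage exceeds a threshold
# (200 mV here), the charger re-engages and tops the battery off.
V_FULL = 4.20          # CV charge voltage (V)
RESUME_DROP = 0.200    # charger re-engages after this drop (V)
TAU_HOURS = 6.0        # assumed relaxation time constant (hours)
V_REST_FLOOR = 3.95    # assumed fully relaxed open-circuit voltage (V)

def voltage_after_rest(t_hours: float) -> float:
    """Exponential relaxation from V_FULL toward V_REST_FLOOR."""
    return V_REST_FLOOR + (V_FULL - V_REST_FLOOR) * math.exp(-t_hours / TAU_HOURS)

def charger_events(total_rest_hours: float, dt: float = 0.1) -> int:
    """Count how often an always-plugged-in charger would resume CV charge."""
    events, t, t_since_charge = 0, 0.0, 0.0
    while t < total_rest_hours:
        t += dt
        t_since_charge += dt
        if V_FULL - voltage_after_rest(t_since_charge) >= RESUME_DROP:
            events += 1
            t_since_charge = 0.0  # top-off restores full voltage
    return events

# An overnight/all-day plug-in interval under these assumed parameters:
print(charger_events(24.0))
```

Each resume event holds the battery near full charge for longer, which is why long plugged-in rest times can add stress beyond the cycling itself.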
We also compared the acceleration performance of the rest-time factor to that of the discharge current stress factor and found that, beyond a critical value, the discharge current could achieve higher acceleration than the rest-time factor.
Saurabh Saxena received the B.Tech. degree in electrical engineering from the Indian Institute of Technology (Banaras Hindu University), Varanasi, India, in 2011. He is currently a Ph.D. student at the CALCE Center at the University of Maryland, College Park, USA. Prior to joining the Ph.D. program in 2014, he worked for ten months as a project assistant at the Indian Institute of Science, Bangalore, India, on a project related to the modeling of Li-ion batteries and supercapacitors. He has been involved in various battery research projects for the last seven years. His research interests include accelerated testing and modeling, failure analysis, reliability, safety, and prognostics of lithium-ion batteries.
Prof. Michael Pecht (30,000+ citations, 80+ h-index) has a BS in Physics, an MS in Electrical Engineering, and an MS and PhD in Engineering Mechanics from the University of Wisconsin. He is a Professional Engineer, an IEEE Fellow, an ASM Fellow, and an ASME Fellow. He served as editor-in-chief of IEEE Access for six years, as editor-in-chief of IEEE Transactions on Reliability for nine years, as editor-in-chief of Microelectronics Reliability for sixteen years, and as editor of Circuit World. He has also served on three U.S. National Academy of Science studies, on two US Congressional investigations in automotive safety, and as an expert to the U.S. FDA. He is the Director of CALCE (Center for Advanced Life Cycle Engineering) at the University of Maryland (UMD), which is funded by over 150 of the world’s leading electronics companies at more than US$6M/year. He can be reached at pecht@umd.edu.