There’s theory and then there’s reality. The connection between the two is often tenuous at best. In the world of electric cars, each battery pack has a battery management system (BMS) that uses algorithms to monitor its overall health and operating condition. “The algorithm tells you things like if your battery is doing okay, or how far you can drive before you need to recharge. The problem is that BMS algorithms are designed in ideal laboratory conditions that do not reflect what a battery pack sees in the real world,” says Simona Onori, an assistant professor of energy science and engineering in the Stanford Doerr School of Sustainability.
To demonstrate the gap between controlled laboratory testing and actual road experience, Onori and colleagues at Stanford collaborated with researchers at the Volkswagen Innovation and Engineering Center located near the university campus. “Algorithms based on unrealistic driving data are likely to be inaccurate in the field,” said Onori, lead author of the study. “Our goal is to increase the longevity of the battery pack by designing algorithms trained from real-world data.”
The results of the research were published on August 18, 2023 in the scientific journal Joule (paywall). Here is a summary of that report:
“Deploying battery state of health (SoH) estimation and forecasting algorithms is critical for ensuring the reliable performance of battery electric vehicles. SoH algorithms are designed and trained from data collected in the laboratory upon cycling cells under predefined loads and temperatures.
“Field battery pack data collected over 1 year of vehicle operation are used to define and extract performance/health indicators and correlate them to real driving characteristics (charging habits, acceleration, and braking) and season-dependent ambient temperature. Performance indicators during driving and charging events are defined upon establishing a data pipeline to extract key battery management system signals.
“This work shows the misalignment existing between laboratory testing and actual battery usage, and the opportunity that exists in enhancing battery experimental testing to deconvolute time and temperature to improve SoH estimation strategies.”
What A Battery Management System Does
A battery management system today routinely records data during braking, acceleration, deceleration, and charging. The secret to long life for rechargeable batteries may lie in understanding the differences between individual cells in real-world operation. New modeling of how the lithium-ion cells in a pack degrade shows a way to tailor charging to each cell’s capacity, so EV batteries can withstand more charge cycles and deliver a longer service life.
“Real-world driving is driver specific,” says co-author Gabriele Pozzato, a Stanford research engineer. “You might be an aggressive driver, or someone who only partially charges their car. Different styles of driving and charging will result in different trajectories of battery degradation. However, that kind of field data is not included in conventional battery algorithms.”
For the study, Volkswagen provided the Stanford team with about 3,750 hours of BMS driving data collected from an all-electric Audi e-tron SUV driven in the San Francisco Bay Area for one year, from November 2019 to October 2020. That data allowed the Stanford team to calculate the electrical resistance in the battery pack during that period and to assess two key battery metrics: energy and power.
“Energy gives you the range, or how many miles you can drive with a fully charged battery,” Pozzato says. “Power is the ability to extract energy quickly. When you accelerate, you want to get access to energy and discharge the battery very fast. The less electrical resistance you have in the battery, the more power you have.”
To calculate resistance, the researchers measured abrupt changes in current and voltage in the battery pack using data from 529 acceleration events and 392 braking events during the year. They also calculated impedance — a measure of resistance during battery charging — by analyzing 53 charging sessions. “Impedance and resistance are typically considered metrics of battery health,” Onori said. “The more you drive, the more resistance increases. This usually translates into less available power from the battery pack, but that’s not what we saw.”
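The ΔV/ΔI idea behind that resistance calculation can be sketched in a few lines of code. This is a rough illustration of the concept, not the study’s actual pipeline; the function name, the sample values, and the use of single before/after samples are all assumptions on my part.

```python
# Rough sketch: estimating pack resistance from an abrupt current step,
# such as a hard acceleration or a braking (regen) event.
# R is approximately |delta_V / delta_I|, comparing samples just before
# and just after the step. (All values here are hypothetical.)

def step_resistance(voltage_before, voltage_after, current_before, current_after):
    """Estimate internal resistance (ohms) from a sudden load change.

    Voltages in volts, currents in amps (discharge positive).
    """
    delta_i = current_after - current_before
    if abs(delta_i) < 1e-6:  # no meaningful step -> resistance undefined
        raise ValueError("current step too small to estimate resistance")
    delta_v = voltage_after - voltage_before
    # Pack voltage sags as current rises, so the raw ratio is negative;
    # report the magnitude.
    return abs(delta_v / delta_i)

# Hypothetical acceleration event on a ~396 V pack:
# resting at 396.0 V drawing 10 A, sagging to 380.0 V at 210 A.
r = step_resistance(396.0, 380.0, 10.0, 210.0)
print(round(r, 3))  # 0.08 (ohms)
```

In practice a BMS would average several samples around each step and discard events where the current change is too small to yield a reliable estimate, which is presumably why the researchers screened for abrupt changes.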
A more complex pattern emerged when the researchers added seasonal weather data to the mix. They discovered that electrical resistance increased in cooler months and steadily decreased in spring and summer, an indication that battery performance improves as temperatures rise.
“Higher temperatures raise battery capacity, so you have this feeling that the car has more energy and that you can drive more miles,” Onori said. “But if you keep using the battery at high temperatures, it will degrade faster. Those are very tricky factors that affect performance. Next year we’ll expand our dataset to a fleet of vehicles to determine exactly how temperature and aging affect each other.”
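The pattern Onori describes, better short-term performance when warm, is often approximated in battery models with an Arrhenius-type temperature dependence for internal resistance. The sketch below is purely illustrative; the reference resistance and the activation-energy constant are invented values, not numbers from the study.

```python
import math

# Illustrative Arrhenius-style model: internal resistance falls as the
# cell warms. The constants below are made-up but plausible values,
# used only to show the shape of the relationship.
R_REF = 0.080        # ohms at the reference temperature (hypothetical)
T_REF = 298.15       # 25 degrees C in kelvin
EA_OVER_K = 2000.0   # activation energy over Boltzmann constant, kelvin (assumed)

def resistance_at(temp_c):
    """Resistance scaled by an Arrhenius factor relative to 25 degrees C."""
    t = temp_c + 273.15
    return R_REF * math.exp(EA_OVER_K * (1.0 / t - 1.0 / T_REF))

print(resistance_at(0.0) > resistance_at(25.0))   # True: colder -> more resistance
print(resistance_at(35.0) < resistance_at(25.0))  # True: warmer -> less resistance
```

This captures only the short-term performance side; the study’s point is that the same warmth that lowers resistance also accelerates long-term degradation, which is why the two effects need to be deconvoluted.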
Theory Vs. Reality
Automakers rely on conventional battery management algorithms designed in ideal laboratory conditions. Using machine learning, these algorithms typically monitor performance data from a single 4-volt battery cell that continuously charges and discharges at a constant temperature until it dies. But the Audi field data was collected from a 396-volt battery pack made up of 384 cells.
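As a back-of-the-envelope check, the pack figures quoted here are consistent with a series-parallel cell arrangement. The 96-cells-in-series layout below is an assumption on my part, chosen because it makes the arithmetic work out to a typical ~4-volt lithium-ion cell; the study does not specify the actual topology.

```python
# Back-of-the-envelope check on the pack figures quoted in the article:
# 396 V pack, 384 cells. Assume 96 cells in series (an assumption, not
# from the study) and see what that implies.
cells_total = 384
series = 96                        # assumed series count
parallel = cells_total // series   # implied parallel strings
cell_voltage = 396.0 / series      # implied per-cell voltage

print(parallel)      # 4
print(cell_voltage)  # 4.125 -> consistent with a ~4 V lithium-ion cell
```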
“New algorithms should focus on the entire battery pack and not individual cells,” Onori says. “We want to design algorithms that educate drivers on how to increase the life of the battery pack, which is the most expensive component of the vehicle. For example, you could alert drivers if they are fast-charging too much or accelerating too aggressively. So much can be learned from field data to make BMS algorithms more robust.”
The study was assisted by members of the Volkswagen Innovation and Engineering Center in Belmont, California, just a hop, skip, and a jump away from the Stanford campus. Computational resources were provided by the Stanford Research Computing Center. The authors have filed three patent applications related to this work.
This research is an example of how new information can lead to new insights into how to maximize the performance of battery packs for electric cars. The EV era today is about where the transition to internal combustion engines was a century ago. While in theory the engine in a 1923 Ford operated on the same basic principles as the engine in a 2023 Ford F-150 — often reduced to the ultra-simplistic phrase “Suck, Squeeze, Bang, Blow” — the reality is the modern engine is light years ahead of its predecessor in terms of extracting the maximum amount of energy from a drop of gasoline.
EV technology is following a similar path of constant improvement. Studies such as this one at Stanford will help future electric cars maximize the performance and longevity of their battery packs. A better battery management system may be critical to making it possible for EVs to export their stored energy to power external loads without shortening their useful life.
Theories are great starting points, but there is no substitute for real world experience.