Spatial-Temporal Self-Attention Transformer Networks for Battery State of Charge Estimation


Journal Article

Sustainable Transportation Energy Pathways (STEPS)

Suggested Citation:
Shi, Dapai, Jingyuan Zhao, Zhenghong Wang, Heng Zhao, Junbin Wang, Yubo Lian, and Andrew Burke (2023). Spatial-Temporal Self-Attention Transformer Networks for Battery State of Charge Estimation. Electronics 12.

Over the past ten years, breakthroughs in battery technology have dramatically propelled the evolution of electric vehicle (EV) technologies. For EV applications, accurately estimating the state of charge (SOC) is critical for ensuring safe operation and prolonging battery lifespan, particularly under complex loading scenarios. Despite progress in this area, modeling and forecasting the evolution of multiphysics, multiscale electrochemical systems under realistic conditions using first-principles and atomistic calculations remains challenging. This study proposes a solution by designing a specialized Transformer-based network architecture, called Bidirectional Encoder Representations from Transformers for Batteries (BERTtery), which uses only time-resolved battery data (i.e., current, voltage, and temperature) as input to estimate SOC. To enhance the Transformer model's generalization, it was trained and tested under a wide range of working conditions, including diverse aging states (from 100% down to 80% of nominal capacity) and varying temperature windows (from 35 °C to −5 °C). To confirm the model's effectiveness, its performance was rigorously tested at the pack level, enabling cell-level predictions to be translated to real-world conditions with hundreds of cells in series. The best models achieve a root mean square error (RMSE) of less than 0.5% and an average percentage error (APE) of approximately 0.1%, with a maximum absolute error (MAE) of 2% on the test dataset, accurately estimating SOC under dynamic operating and aging conditions with widely varying operational profiles. These results demonstrate the power of the self-attention Transformer-based model to predict the behavior of complex multiphysics and multiscale battery systems.
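The paper itself does not publish the BERTtery implementation, but the core operation it relies on, scaled dot-product self-attention over a window of time-resolved (current, voltage, temperature) measurements, can be sketched generically. The sequence length, feature dimension, and random projection weights below are illustrative assumptions, not values from the study:

```python
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """Single-head scaled dot-product self-attention over a time series.

    x: (T, d) per-timestep features; w_q/w_k/w_v: (d, d) learned projections.
    Returns (T, d) context vectors, one per time step.
    """
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    scores = q @ k.T / np.sqrt(k.shape[-1])          # (T, T) pairwise affinities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax across time steps
    return weights @ v

rng = np.random.default_rng(0)
T, d = 8, 3                       # 8 time steps of (current, voltage, temperature)
x = rng.standard_normal((T, d))   # stand-in for a normalized measurement window
w_q, w_k, w_v = (rng.standard_normal((d, d)) for _ in range(3))
out = self_attention(x, w_q, w_k, w_v)
print(out.shape)  # (8, 3)
```

In a full Transformer encoder such a head would be combined with positional encodings, multiple heads, and feed-forward layers, with a regression head mapping the final representation to an SOC value.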

Keywords: lithium-ion battery, SOC, deep learning, estimation, transformer, electric vehicle