Publication Detail

Specialized Convolutional Transformer Networks for Estimating Battery Health via Transfer Learning

UCD-ITS-RP-24-59

Journal Article

Suggested Citation:
Zhao, Jingyuan and Zhenghong Wang (2024) Specialized Convolutional Transformer Networks for Estimating Battery Health via Transfer Learning. Energy Storage Materials 71.

Despite continuous advancements, modeling and predicting nonlinear, multiscale, and multiphysics battery systems, which feature inherently inhomogeneous cascades of scales, remains challenging. Deep learning offers a promising alternative by automatically extracting high-dimensional features, enhancing predictions for complex battery systems. However, existing methods fall short of accurate real-time responses due to high training costs and limited generalization. To address this, we developed specialized deep neural networks for health status estimation using transfer learning. Specifically, our method employs a specialized Transformer model for time-series prediction with a multi-head ProbSparse self-attention mechanism to conserve computational resources and improve estimation efficiency. Additionally, a one-dimensional (1-D) convolution captures underlying degradation patterns for state of health (SOH) estimation. Transfer learning enables real-time SOH estimation using only charging features and labeled capacities from previous cycles. The proposed method was validated on two datasets with different battery chemistries and charge-discharge configurations: 77 lithium iron phosphate (LFP) batteries (nominal capacity 1.1 Ah) and 30 nickel cobalt aluminum (NCA) batteries (nominal capacity 3.5 Ah). We transferred knowledge from 57 LFP batteries to 20 same-batch batteries, achieving an RMSE of 0.247%, an R² of 99.8%, a WMAPE of 0.223%, and an MAE of 0.202%. Additionally, we applied the transfer learning model trained on the LFP batteries to a dataset of 30 NCA batteries, achieving an RMSE of 0.687%, an R² of 96.8%, a WMAPE of 0.443%, and an MAE of 0.523%. Overall, the proposed specialized network architectures have demonstrated remarkable power in achieving accurate predictions, fast training, and enhanced generalization.
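The ProbSparse self-attention mentioned in the abstract (popularized by the Informer architecture) saves computation by letting only the most "active" queries attend fully, while the remaining queries fall back to a cheap default. The sketch below, in NumPy, illustrates the core idea for a single head; it is an assumption-laden illustration of the general mechanism, not the authors' implementation, and the sparsity measure `M` (max score minus mean score per query) follows the standard ProbSparse formulation.

```python
import numpy as np

def probsparse_attention(Q, K, V, u):
    """Single-head ProbSparse self-attention sketch.

    Q, K, V: (L, d) arrays; u: number of "active" queries that
    receive full softmax attention. Lazy queries default to the
    mean of V, as in the standard ProbSparse formulation.
    """
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                      # (L_q, L_k)
    # Sparsity measure M(q_i, K) = max_j score_ij - mean_j score_ij;
    # large M means the query's attention is far from uniform.
    M = scores.max(axis=1) - scores.mean(axis=1)
    top = np.argsort(M)[-u:]                           # top-u active queries
    # Default output for lazy queries: mean of the value vectors.
    out = np.tile(V.mean(axis=0), (Q.shape[0], 1))
    # Full softmax attention only for the selected queries.
    s = scores[top]
    w = np.exp(s - s.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)
    out[top] = w @ V
    return out
```

With `u` equal to the sequence length this reduces to ordinary scaled dot-product attention; shrinking `u` trades a small approximation error for roughly O(u·L) instead of O(L²) score-weighting work, which is the efficiency gain the abstract refers to.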


Key words:

battery, health, transformer, convolution, transfer learning