HyBattNet: Hybrid Framework for Predicting the Remaining Useful Life of Lithium-Ion Batteries
Accurate prediction of Remaining Useful Life (RUL) is essential for the timely maintenance of lithium-ion batteries and directly impacts the operational efficiency of the electric applications that rely on them. This paper proposes an RUL prediction approach that leverages data from recent charge-discharge cycles to estimate the number of remaining usable cycles. The approach introduces both a novel signal-preprocessing pipeline and a deep learning prediction model. In the preprocessing pipeline, a derived capacity feature is computed from interpolated current and capacity signals. Alongside the original capacity, voltage, and current, these features are denoised and enriched with statistical metrics and a delta-based method that captures differences between the current and previous cycles. In the prediction model, the processed features are fed into a hybrid deep learning architecture composed of 1D Convolutional Neural Network (CNN), Attentional Long Short-Term Memory (A-LSTM), and Ordinary Differential Equation-based LSTM (ODE-LSTM) blocks. The ODE-LSTM employs ordinary differential equations to integrate continuous dynamics into sequence-to-sequence modeling, thereby combining continuous and discrete temporal representations, while the A-LSTM incorporates an attention mechanism to capture local temporal dependencies. The model is further evaluated using transfer learning across different learning strategies and target-data partitioning scenarios. Results indicate that the model maintains robust performance even when fine-tuned on limited target data. Experiments on two publicly available LFP/graphite lithium-ion battery datasets demonstrate that the proposed method outperforms a baseline deep learning approach and classical machine learning techniques, achieving an RMSE of 101.59, highlighting its potential for real-world RUL prediction applications.
💡 Research Summary
Paper Overview
The manuscript introduces HyBattNet, a hybrid framework that predicts the Remaining Useful Life (RUL) of lithium-ion batteries (LIBs) from a short window of recent charge-discharge cycles. The authors address two major shortcomings of existing approaches: (1) limited exploitation of short-term degradation cues, and (2) insufficient temporal-modeling capacity in current machine-learning or shallow deep-learning methods. HyBattNet consists of a signal-preprocessing pipeline and a novel deep-learning architecture that combines a 1-D convolutional neural network (CNN), an attention-augmented LSTM (A-LSTM), and an Ordinary Differential Equation-based LSTM (ODE-LSTM). The framework is further evaluated under transfer-learning scenarios to test robustness when only a small amount of target-domain data is available.
Signal‑Preprocessing Pipeline
For each cycle i, the raw current (I), voltage (V), and capacity (Q) signals are first linearly interpolated onto a uniform current grid of 2000 points, producing a derived capacity trace Q̇. This step normalizes variable-length cycles and creates a common representation for subsequent analysis. All four sequences (I, V, Q, Q̇) are smoothed with a Savitzky-Golay filter. Six statistical descriptors (mean, standard deviation, min, max, variance, median) are extracted from each smoothed series, compressing the 2000 grid points into six scalars per signal. To capture short-term dynamics, a delta feature ΔF is computed as the difference between the statistical vector of Q̇ at cycle i and that at cycle i−δ (δ = 9 cycles). The four feature vectors (I, V, Q, ΔF) are stacked into a 4 × 6 matrix, and ten such matrices are sampled uniformly from the most recent 30-cycle window, yielding a tensor of shape 10 × 4 × 6. Min-max scaling to [0, 1] is applied using statistics computed on the training set.
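The steps above can be sketched in NumPy/SciPy. This is a minimal illustration, not the authors' code: the current-grid interpolation is approximated on a normalized time axis, Q̇ is assumed to be dQ/dI, and the Savitzky-Golay window length (51) and the uniform sampling of the ten matrices are illustrative choices not fixed by the summary:

```python
import numpy as np
from scipy.signal import savgol_filter

N_GRID, WINDOW, DELTA = 2000, 30, 9   # grid size, cycle window, and delta offset from the text

def descriptors(x):
    """Six statistical descriptors extracted from each smoothed series."""
    return np.array([x.mean(), x.std(), x.min(), x.max(), x.var(), np.median(x)])

def preprocess_cycle(I, V, Q):
    """Resample one cycle's raw signals to a uniform 2000-point grid,
    derive the capacity trace Q_dot (assumed here as dQ/dI), smooth all
    four traces with a Savitzky-Golay filter, and return a 4 x 6 matrix."""
    t = np.linspace(0.0, 1.0, len(I))
    grid = np.linspace(0.0, 1.0, N_GRID)
    I_u, V_u, Q_u = (np.interp(grid, t, s) for s in (I, V, Q))
    Q_dot = np.gradient(Q_u) / (np.gradient(I_u) + 1e-9)   # derived capacity trace
    smooth = [savgol_filter(s, 51, 3) for s in (I_u, V_u, Q_u, Q_dot)]
    return np.stack([descriptors(s) for s in smooth])       # rows: I, V, Q, Q_dot

def build_input_tensor(cycles):
    """cycles: chronological list of (I, V, Q) arrays.  Returns the
    10 x 4 x 6 tensor built from the most recent 30-cycle window; the
    fourth row of each matrix is the delta feature ΔF on the Q_dot stats."""
    feats = [preprocess_cycle(*c) for c in cycles]
    idx = np.linspace(len(cycles) - WINDOW, len(cycles) - 1, 10).astype(int)
    mats = []
    for i in idx:
        delta = feats[i][3] - feats[i - DELTA][3]           # ΔF between cycles i and i-9
        mats.append(np.vstack([feats[i][:3], delta]))
    return np.stack(mats)                                   # shape (10, 4, 6)
```

Min-max scaling with training-set statistics would then be applied to the resulting tensors before they reach the network.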
Hybrid Deep‑Learning Model
The preprocessed tensor is fed into three parallel branches:
- CNN Block – Three 1-D convolutional layers (kernel = 5) with channel sizes 64, 128, 256, each followed by batch normalization and LeakyReLU, with a final dropout of 0.3. The block outputs a 10 × 256 feature map that captures local patterns across the ten time steps.
- A-LSTM Block – An LSTM equipped with an attention mechanism that learns a weight distribution over the ten time steps, allowing the network to focus on the most informative cycles.
- ODE-LSTM Block – Implements continuous-time dynamics by solving an ODE for the hidden state using the torchdiffeq library. This lets the model blend discrete sequence modeling with continuous physical evolution, which is well suited to battery degradation governed by differential equations.
Outputs from the three branches are concatenated and passed through a fully‑connected layer to produce a scalar RUL estimate (the predicted number of remaining charge‑discharge cycles).
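A PyTorch sketch of the three-branch layout described above follows. Several details are assumptions on our part: the hidden sizes, the additive form of the attention, temporal mean-pooling of the 10 × 256 CNN map before concatenation, and a fixed-step Euler integrator standing in for torchdiffeq's solver:

```python
import torch
import torch.nn as nn

class Attention(nn.Module):
    """Additive attention over the time axis (assumed form of the A-LSTM)."""
    def __init__(self, dim):
        super().__init__()
        self.score = nn.Linear(dim, 1)
    def forward(self, h):                       # h: (B, T, dim)
        w = torch.softmax(self.score(h), dim=1) # weight distribution over time steps
        return (w * h).sum(dim=1)               # (B, dim)

class ODELSTMCell(nn.Module):
    """LSTM cell whose hidden state evolves between steps via a learned ODE,
    integrated here with fixed-step Euler as a stand-in for torchdiffeq."""
    def __init__(self, in_dim, hid, steps=4):
        super().__init__()
        self.cell = nn.LSTMCell(in_dim, hid)
        self.f = nn.Sequential(nn.Linear(hid, hid), nn.Tanh(), nn.Linear(hid, hid))
        self.steps = steps
    def forward(self, x):                       # x: (B, T, in_dim)
        B, T, _ = x.shape
        h = x.new_zeros(B, self.cell.hidden_size)
        c = x.new_zeros(B, self.cell.hidden_size)
        for t in range(T):
            h, c = self.cell(x[:, t], (h, c))
            for _ in range(self.steps):         # continuous evolution between observations
                h = h + (1.0 / self.steps) * self.f(h)
        return h

class HyBattNet(nn.Module):
    """Three parallel branches over the 10 x 4 x 6 input, concatenated
    into a fully connected head that outputs a scalar RUL estimate."""
    def __init__(self, in_feats=24, hid=64):    # 24 = flattened 4 x 6 matrix per step
        super().__init__()
        convs, ch = [], in_feats
        for out in (64, 128, 256):              # channel sizes from the summary
            convs += [nn.Conv1d(ch, out, kernel_size=5, padding=2),
                      nn.BatchNorm1d(out), nn.LeakyReLU()]
            ch = out
        convs.append(nn.Dropout(0.3))
        self.cnn = nn.Sequential(*convs)
        self.alstm = nn.LSTM(in_feats, hid, batch_first=True)
        self.attn = Attention(hid)
        self.odelstm = ODELSTMCell(in_feats, hid)
        self.head = nn.Linear(256 + hid + hid, 1)
    def forward(self, x):                       # x: (B, 10, 4, 6)
        B, T = x.shape[:2]
        seq = x.reshape(B, T, -1)               # flatten each 4 x 6 matrix per time step
        cnn = self.cnn(seq.transpose(1, 2)).mean(dim=2)  # pool the (B, 256, 10) map
        h, _ = self.alstm(seq)
        att = self.attn(h)
        ode = self.odelstm(seq)
        return self.head(torch.cat([cnn, att, ode], dim=1)).squeeze(1)
```

The scalar output is the predicted number of remaining charge-discharge cycles for the batch of input windows.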
Training & Transfer Learning
The authors pre‑train the entire architecture on a source domain consisting of batteries cycled under diverse charging profiles. For the target domain (different discharge profiles), they fine‑tune only a subset of the parameters using as little as 10 % of the target data. The same scaling parameters from the source training set are retained, ensuring that the model does not overfit to the limited target samples.
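The freeze-then-fine-tune strategy can be illustrated on a toy stand-in model (the summary does not say which parameters are frozen, so the split below, the learning rate, and the data are all hypothetical):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
# toy stand-in: a "pretrained" feature extractor followed by a regression head
model = nn.Sequential(nn.Linear(24, 32), nn.ReLU(), nn.Linear(32, 1))
pretrained, head = model[0], model[2]

for p in pretrained.parameters():
    p.requires_grad = False                 # keep source-domain features fixed

# optimizer only sees the trainable (unfrozen) parameters
opt = torch.optim.Adam((p for p in model.parameters() if p.requires_grad), lr=1e-3)

x = torch.randn(16, 24)                     # stand-in for the small target subset
y = torch.randn(16)
frozen_before = pretrained.weight.clone()
for _ in range(5):                          # brief fine-tuning loop
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(x).squeeze(1), y)
    loss.backward()
    opt.step()
```

After fine-tuning, the frozen extractor's weights are bit-identical to their pre-trained values while the head has adapted to the target data; the input scaling (here omitted) would reuse the source-domain min-max parameters as the summary describes.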
Experimental Evaluation
Two publicly available LIB datasets are used: (a) a set cycled under varying charging protocols, and (b) a set cycled under varied discharge protocols. The data split follows the protocol of prior work.