State-of-charge estimation for lithium-ion batteries with optimized self-supervised transformer deep learning model

Bibliographic Details
Main Author: Dickson Neoh Tze How, Dr.
Format: text::Thesis
Language: English
Published: 2023
Subjects: State-of-charge estimation for lithium-ion batteries
id my.uniten.dspace-19342
record_format dspace
institution Universiti Tenaga Nasional
building UNITEN Library
collection Institutional Repository
continent Asia
country Malaysia
content_provider Universiti Tenaga Nasional
content_source UNITEN Institutional Repository
url_provider http://dspace.uniten.edu.my/
language English
topic State-of-charge estimation for lithium-ion batteries
description State-of-charge (SOC) is a quantity that reflects the amount of available energy left in lithium-ion (Li-ion) cells. Accurate SOC estimation allows for the optimization of charging-discharging schedules and usage time, and facilitates the computation of other quantities needed to prolong battery lifespan and protect users from hazards. However, the SOC is not an observable quantity and cannot be practically measured outside the laboratory environment. Additionally, the SOC behaves in a highly nonlinear way and is a function of various incalculable factors such as cell condition, ambient temperature, manufacturing inconsistency, and cell chemistry. Fortunately, these issues can be addressed with machine learning methods such as deep learning (DL), owing to their capacity to model highly nonlinear phenomena, powerful generalization capability, and fast run-time. Nevertheless, the performance of DL models varies heavily depending on data type, model architecture, hyperparameter selection, training framework, and so forth. To overcome these limitations, this study proposes a novel DL architecture for SOC estimation using the Transformer model with the self-supervised learning (SSL) framework. The SSL framework trains the Transformer in two stages. In the first stage, the model is pre-trained on unlabeled data with unsupervised learning. In the second stage, the model is fine-tuned, or re-trained, on labeled data with supervised learning. To evaluate its effectiveness, the Transformer model is benchmarked against commonly used DL architectures such as Long Short-Term Memory (LSTM), Gated Recurrent Unit (GRU), Convolutional Neural Networks (CNN), and Deep Neural Networks (DNN). The evaluation is carried out on multiple electric vehicle (EV) drive cycles with different Li-ion cell chemistries at varying ambient temperatures. Experimental results show that the Transformer model achieves a Root Mean Squared Error (RMSE) ≤ 1.7% and a Mean Absolute Error (MAE) ≤ 1% on all datasets while maintaining a relatively low computational cost, with a model size of approximately 2 MB. The SSL framework also reduces the need for labeled data during training and significantly decreases training time. In addition, the SSL framework makes transfer learning (TL) possible, in which the weights of a model trained on one cell chemistry can be transferred to a model running on a different cell chemistry. In this study, the weights of a model trained on an NMC cell are transferred to an NCA cell and vice versa. With TL, the model scores an RMSE ≈ 2.0% or lower after only five training epochs (approximately 30 minutes on an RTX 3090 GPU). The Transformer model with transferred weights outperformed models trained from scratch using supervised learning. To select the optimal hyperparameters for the Transformer model, Tree-structured Parzen Estimator (TPE) optimization combined with the Hyperband pruning algorithm is employed to search for the combination that yields the lowest RMSE and MAE. The outcome is an optimized Transformer model that scores the lowest error of RMSE ≈ 1.12%. The optimized model also outperformed all other state-of-the-art DL models in error metrics, adaptability, and robustness under diverse operating conditions.
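As a concrete illustration of the two-stage SSL scheme described above, the following is a minimal sketch in plain PyTorch: a toy Transformer encoder is pre-trained to reconstruct masked, unlabeled input windows, then fine-tuned on labeled SOC targets. The model class, the 15% masking ratio, and the (voltage, current, temperature) feature layout are illustrative assumptions, not the thesis's exact implementation.

```python
# Minimal sketch of two-stage SSL training (TransformerSOC, the masking
# ratio, and the (V, I, T) input layout are illustrative assumptions).
import torch
import torch.nn as nn

class TransformerSOC(nn.Module):
    """Toy Transformer encoder mapping a (V, I, T) window to SOC."""
    def __init__(self, n_features=3, d_model=64, n_heads=4, n_layers=2):
        super().__init__()
        self.embed = nn.Linear(n_features, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.recon_head = nn.Linear(d_model, n_features)  # stage 1 head
        self.soc_head = nn.Linear(d_model, 1)             # stage 2 head

    def forward(self, x, pretrain=False):
        h = self.encoder(self.embed(x))
        if pretrain:
            return self.recon_head(h)      # reconstruct masked inputs
        return self.soc_head(h[:, -1, :])  # SOC at the window's last step

model = TransformerSOC()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
mse = nn.MSELoss()

# Stage 1: unsupervised pre-training on unlabeled drive-cycle windows --
# hide a random 15% of values and learn to reconstruct them.
x_unlabeled = torch.randn(32, 100, 3)  # placeholder unlabeled batch
mask = (torch.rand(32, 100, 1) < 0.15).expand(-1, -1, 3)
recon = model(x_unlabeled.masked_fill(mask, 0.0), pretrain=True)
loss = mse(recon[mask], x_unlabeled[mask])
loss.backward(); opt.step(); opt.zero_grad()

# Stage 2: supervised fine-tuning on labeled (window, SOC) pairs,
# reusing the pre-trained encoder weights.
x_labeled, y_soc = torch.randn(32, 100, 3), torch.rand(32, 1)
loss = mse(model(x_labeled), y_soc)
loss.backward(); opt.step(); opt.zero_grad()
```

The same pattern supports the transfer-learning experiment reported in the abstract: after stage 1 on one chemistry, the weights from `model.state_dict()` can be loaded into a model that is then fine-tuned on a different chemistry.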
This study concludes that the proposed optimized Transformer model has great potential to be incorporated into Li-ion energy storage systems to estimate SOC with very low estimation error and broad applicability to various cell types. All models in this study were implemented using the open-source PyTorch and tsai deep learning packages on an Ubuntu 20.04 LTS machine with an RTX 3090 graphics processing unit.
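For the hyperparameter search, TPE sampling with Hyperband pruning is available off the shelf in the Optuna library; the sketch below shows how such a study could be wired up. The search space and the simulated per-epoch RMSE are placeholders, not the thesis's actual setup.

```python
# Hedged sketch of a TPE + Hyperband search with Optuna; the search
# space and the fake decaying RMSE are illustrative placeholders.
import optuna

def objective(trial):
    # These would configure the Transformer in a real objective.
    d_model = trial.suggest_categorical("d_model", [32, 64, 128])
    n_layers = trial.suggest_int("n_layers", 1, 4)
    lr = trial.suggest_float("lr", 1e-5, 1e-2, log=True)
    rmse = 5.0
    for epoch in range(10):
        # A real objective would train for one epoch here and compute
        # validation RMSE; we fake a decaying error instead.
        rmse *= 0.8
        trial.report(rmse, epoch)  # intermediate value for the pruner
        if trial.should_prune():   # Hyperband stops weak trials early
            raise optuna.TrialPruned()
    return rmse                    # final validation RMSE (%)

study = optuna.create_study(
    direction="minimize",                     # minimize RMSE
    sampler=optuna.samplers.TPESampler(),     # Tree-structured Parzen Estimator
    pruner=optuna.pruners.HyperbandPruner(),  # Hyperband pruning
)
study.optimize(objective, n_trials=50)
print(study.best_params, study.best_value)
```

Minimizing validation RMSE as the study objective mirrors the abstract's selection criterion; MAE could be tracked alongside it or combined into a single scalar objective.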
format Resource Types::text::Thesis
author Dickson Neoh Tze How, Dr.
title State-of-charge estimation for lithium-ion batteries with optimized self-supervised transformer deep learning model
publishDate 2023
url https://irepository.uniten.edu.my/handle/123456789/19342