
MambaLithium: Selective state space model for remaining-useful-life, state-of-health, and state-of-charge estimation of lithium-ion batteries

Published 8 Mar 2024 in cs.CE (arXiv:2403.05430v1)

Abstract: Lithium-ion batteries occupy a pivotal position in electric vehicles and the burgeoning new-energy industry. Their performance depends heavily on three core states: remaining useful life (RUL), state of health (SOH), and state of charge (SOC). Given the remarkable success of Mamba (structured state space sequence models with a selection mechanism and scan module, S6) in sequence modeling tasks, this paper introduces MambaLithium, a selective state space model tailored for precise estimation of these critical battery states. Leveraging Mamba algorithms, MambaLithium captures the intricate aging and charging dynamics of lithium-ion batteries. By focusing on pivotal states within the battery's operational envelope, MambaLithium enhances estimation accuracy while maintaining computational robustness. Experiments on real-world battery data validate the model's superiority over current methods in predicting battery health and performance metrics. The proposed MambaLithium framework holds potential for advancing battery management systems and fostering sustainable energy storage solutions. Source code is available at https://github.com/zshicode/MambaLithium.
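The abstract does not spell out the S6 recurrence, so the following is a minimal NumPy sketch of a Mamba-style selective scan of the kind the paper builds on: the step size, B, and C matrices are made input-dependent, the continuous-time system is discretized per step, and the hidden state is updated by a sequential scan. All names, shapes, and projections here are illustrative assumptions, not the authors' implementation (see their repository for that).

```python
import numpy as np

def selective_scan(x, A, W_dt, W_B, W_C, D):
    """Mamba-style (S6) selective scan, one head, no hardware-aware tricks.

    x    : (T, d)  input sequence (e.g. per-cycle battery features)
    A    : (d, n)  negative continuous-time state matrix
    W_dt : (d, d)  projection giving the input-dependent step size
    W_B  : (d, n)  projection giving the input-dependent B_t
    W_C  : (d, n)  projection giving the input-dependent C_t
    D    : (d,)    skip connection
    """
    T, d = x.shape
    n = A.shape[1]
    h = np.zeros((d, n))
    y = np.zeros((T, d))
    for t in range(T):
        xt = x[t]                               # (d,)
        dt = np.log1p(np.exp(xt @ W_dt))        # softplus -> positive step (d,)
        Bt = xt @ W_B                           # (n,), input-dependent
        Ct = xt @ W_C                           # (n,), input-dependent
        Abar = np.exp(dt[:, None] * A)          # zero-order-hold discretization (d, n)
        h = Abar * h + (dt * xt)[:, None] * Bt[None, :]   # selective state update
        y[t] = h @ Ct + D * xt                  # readout plus skip
    return y

# Illustrative usage with random weights (a real model would learn these).
rng = np.random.default_rng(0)
T, d, n = 16, 4, 8
x = rng.standard_normal((T, d))
A = -np.exp(rng.standard_normal((d, n)))        # keep A negative for stability
y = selective_scan(x, A,
                   0.1 * rng.standard_normal((d, d)),
                   0.1 * rng.standard_normal((d, n)),
                   0.1 * rng.standard_normal((d, n)),
                   np.ones(d))
print(y.shape)  # (16, 4)
```

The input dependence of dt, B_t, and C_t is what makes the scan "selective": unlike a fixed linear SSM, the model can gate how much of each time step enters the state, which is the property the paper leverages to track aging and charging dynamics.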

