AQ-PINNs: Attention-Enhanced Quantum Physics-Informed Neural Networks for Carbon-Efficient Climate Modeling (2409.01626v1)
Abstract: The growing computational demands of AI for addressing climate change raise significant concerns about inefficiency and environmental impact, as highlighted by the Jevons paradox. We propose an attention-enhanced quantum physics-informed neural network model (AQ-PINNs) to tackle these challenges. The approach integrates quantum computing techniques into physics-informed neural networks (PINNs) for climate modeling, aiming to improve predictive accuracy in fluid dynamics governed by the Navier-Stokes equations while reducing the computational burden and carbon footprint. By harnessing a variational quantum multi-head self-attention mechanism, AQ-PINNs achieve a 51.51% reduction in model parameters compared to classical multi-head self-attention while maintaining comparable convergence and loss. They also employ quantum tensor networks to enhance representational capacity, which can lead to more efficient gradient computations and reduced susceptibility to barren plateaus. AQ-PINNs represent a crucial step towards more sustainable and effective climate modeling.
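The physics constraint at the core of the model is the incompressible Navier-Stokes system. As a concrete illustration of how a physics-informed loss is assembled for these equations, the sketch below computes the momentum and continuity residuals with automatic differentiation. It uses a plain PyTorch MLP in place of the attention-enhanced quantum network and an assumed viscosity value, so it should be read as a minimal PINN example, not the authors' AQ-PINNs implementation.

```python
# Minimal PINN residual for the 2D incompressible Navier-Stokes equations (PyTorch).
# Illustrative sketch only: the MLP stands in for the attention-enhanced quantum
# network described in the paper, and nu (kinematic viscosity) is a placeholder.
import torch
import torch.nn as nn

class MLP(nn.Module):
    """Maps (x, y, t) -> (u, v, p)."""
    def __init__(self, width=64, depth=4):
        super().__init__()
        layers, d_in = [], 3
        for _ in range(depth):
            layers += [nn.Linear(d_in, width), nn.Tanh()]
            d_in = width
        layers += [nn.Linear(d_in, 3)]
        self.net = nn.Sequential(*layers)

    def forward(self, xyt):
        return self.net(xyt)

def grad(out, x):
    """First derivatives of a scalar field w.r.t. the batched inputs."""
    return torch.autograd.grad(out, x, grad_outputs=torch.ones_like(out),
                               create_graph=True)[0]

def ns_residual_loss(model, xyt, nu=0.01):
    """Mean squared momentum and continuity residuals at collocation points xyt of shape (N, 3)."""
    xyt = xyt.requires_grad_(True)
    u, v, p = model(xyt).unbind(dim=1)

    du, dv, dp = grad(u, xyt), grad(v, xyt), grad(p, xyt)
    u_x, u_y, u_t = du[:, 0], du[:, 1], du[:, 2]
    v_x, v_y, v_t = dv[:, 0], dv[:, 1], dv[:, 2]
    p_x, p_y = dp[:, 0], dp[:, 1]

    u_xx, u_yy = grad(u_x, xyt)[:, 0], grad(u_y, xyt)[:, 1]
    v_xx, v_yy = grad(v_x, xyt)[:, 0], grad(v_y, xyt)[:, 1]

    r_u = u_t + u * u_x + v * u_y + p_x - nu * (u_xx + u_yy)   # x-momentum
    r_v = v_t + u * v_x + v * v_y + p_y - nu * (v_xx + v_yy)   # y-momentum
    r_c = u_x + v_y                                            # incompressibility
    return (r_u**2).mean() + (r_v**2).mean() + (r_c**2).mean()

# Usage: physics loss on random collocation points (combined in practice with
# data and boundary/initial-condition losses).
model = MLP()
loss_pde = ns_residual_loss(model, torch.rand(256, 3))
loss_pde.backward()
```

In the full method, the parameter savings reported in the abstract come from replacing the classical multi-head self-attention inside the network with a variational quantum counterpart; the residual construction above is unchanged by that substitution.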