Thermodynamics-informed super-resolution of scarce temporal dynamics data (2402.17506v2)

Published 27 Feb 2024 in physics.comp-ph and cs.LG

Abstract: We present a method to increase the resolution of measurements of a physical system and subsequently predict its time evolution using thermodynamics-aware neural networks. Our method uses adversarial autoencoders, which reduce the dimensionality of the full-order model to a set of latent variables constrained to match a prior, for example a normal distribution. Adversarial autoencoders are generative models and can be trained to generate high-resolution samples from low-resolution inputs, so they can address the so-called super-resolution problem. A second neural network, known as a structure-preserving neural network, is then trained to learn the physical structure of the latent variables and predict their temporal evolution. It learns the metriplectic structure of the system and applies a physical bias that ensures the first and second principles of thermodynamics are fulfilled. The integrated trajectories are decoded both to their original dimensionality and to the higher-dimensional space produced by the adversarial autoencoder, and are compared to the ground-truth solution. The method is tested on two examples of flow over a cylinder with different fluid properties.
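
The structure-preserving step described in the abstract can be illustrated with a short, hedged sketch. Below is a minimal PyTorch example of a metriplectic (GENERIC-style) update on latent variables z, such as those produced by an adversarial autoencoder: a network outputs a skew-symmetric operator L, a positive semi-definite operator M, and the gradients of energy E and entropy S, and the degeneracy conditions L∇S = 0 and M∇E = 0 are imposed as a soft penalty. The class and function names (SPNN, generic_step, degeneracy_penalty) and the architecture are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn


class SPNN(nn.Module):
    """Sketch of a structure-preserving network: from a latent state z it predicts
    a skew-symmetric L, a symmetric PSD M, and the gradients of energy and entropy."""

    def __init__(self, dim: int, hidden: int = 64):
        super().__init__()
        self.dim = dim
        self.net = nn.Sequential(
            nn.Linear(dim, hidden), nn.Tanh(),
            nn.Linear(hidden, 2 * dim * dim + 2 * dim),
        )

    def forward(self, z: torch.Tensor):
        d = self.dim
        out = self.net(z)                                   # (batch, 2*d*d + 2*d)
        A = out[:, :d * d].reshape(-1, d, d)
        B = out[:, d * d:2 * d * d].reshape(-1, d, d)
        dE = out[:, 2 * d * d:2 * d * d + d].unsqueeze(-1)  # (batch, d, 1)
        dS = out[:, 2 * d * d + d:].unsqueeze(-1)           # (batch, d, 1)
        L = A - A.transpose(-1, -2)       # skew-symmetric: reversible (Hamiltonian) part
        M = B @ B.transpose(-1, -2)       # symmetric PSD: irreversible (dissipative) part
        return L, M, dE, dS


def degeneracy_penalty(L, M, dE, dS) -> torch.Tensor:
    # Soft constraints L dS = 0 and M dE = 0, which imply energy conservation
    # and non-negative entropy production along the integrated trajectory.
    return (L @ dS).pow(2).mean() + (M @ dE).pow(2).mean()


def generic_step(model: SPNN, z: torch.Tensor, dt: float) -> torch.Tensor:
    # One explicit-Euler step of dz/dt = L(z) grad E(z) + M(z) grad S(z).
    L, M, dE, dS = model(z)
    dzdt = (L @ dE + M @ dS).squeeze(-1)  # (batch, d)
    return z + dt * dzdt
```

In training, a prediction loss on the integrated latent trajectory would typically be combined with degeneracy_penalty and with the adversarial autoencoder's reconstruction and discriminator losses; only the structure-preserving step is sketched here.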
