Hard Encoding of Physics for Learning Spatiotemporal Dynamics (2105.00557v1)

Published 2 May 2021 in cs.LG, cs.AI, and physics.comp-ph

Abstract: Modeling nonlinear spatiotemporal dynamical systems has primarily relied on partial differential equations (PDEs). However, for many underexplored processes, such as climate systems, biochemical reactions, and epidemiology, the explicit formulation of the governing PDEs remains uncertain or only partially known, and very limited measurement data is available. To tackle this challenge, we propose a novel deep learning architecture that forcibly encodes known physics knowledge to facilitate learning in a data-driven manner. This coercive physics-encoding mechanism, which is fundamentally different from penalty-based physics-informed learning, ensures that the network rigorously obeys the given physics. Instead of using nonlinear activation functions, we propose a novel elementwise product operation to achieve the nonlinearity of the model. Numerical experiments demonstrate that the resulting physics-encoded learning paradigm possesses remarkable robustness against data noise and scarcity, as well as better generalizability, compared with some state-of-the-art models for data-driven modeling.
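
The abstract's idea of replacing activation functions with an elementwise product can be illustrated with a short sketch. This is not the authors' implementation: the use of parallel convolutional branches, the branch count, the channel sizes, and the forward-Euler rollout below are assumptions chosen only to show how multiplying branch outputs yields a learnable polynomial nonlinearity.

```python
# Minimal sketch (not the paper's code): parallel convolution branches
# combined by elementwise multiplication, giving polynomial nonlinearity
# without any activation function. All hyperparameters are illustrative.
import torch
import torch.nn as nn


class ProductBlock(nn.Module):
    """Combine parallel convolutional branches via a Hadamard product."""

    def __init__(self, in_channels: int, hidden: int, n_branches: int = 2):
        super().__init__()
        self.branches = nn.ModuleList(
            nn.Conv2d(in_channels, hidden, kernel_size=3, padding=1)
            for _ in range(n_branches)
        )
        # 1x1 convolution maps the product back to the state dimension.
        self.out = nn.Conv2d(hidden, in_channels, kernel_size=1)

    def forward(self, u: torch.Tensor) -> torch.Tensor:
        prod = self.branches[0](u)
        for branch in self.branches[1:]:
            prod = prod * branch(u)  # elementwise (Hadamard) product
        return self.out(prod)


if __name__ == "__main__":
    # Treat the block as an approximation of du/dt and march in time
    # with a simple forward-Euler update (an assumed discretization).
    block = ProductBlock(in_channels=2, hidden=8)
    u = torch.randn(1, 2, 32, 32)  # batch, state channels, H, W
    dt = 0.01
    u_next = u + dt * block(u)
    print(u_next.shape)
```

Because each branch is linear in the state, multiplying two branches produces quadratic terms (and higher-order terms with more branches), which is one way such a product operation can capture nonlinear dynamics while keeping the layer parameters interpretable.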

Citations (7)
