
Incorporating NODE with Pre-trained Neural Differential Operator for Learning Dynamics (2106.04166v3)

Published 8 Jun 2021 in cs.LG

Abstract: Learning dynamics governed by differential equations is crucial for predicting and controlling systems in science and engineering. The Neural Ordinary Differential Equation (NODE), a deep learning model integrated with differential equations, has recently become popular for learning dynamics due to its robustness to irregularly sampled data and its flexibility with high-dimensional inputs. However, the training of NODE is sensitive to the precision of the numerical solver, which makes its convergence unstable, especially for ill-conditioned dynamical systems. In this paper, to reduce the reliance on the numerical solver, we propose to enhance the supervised signal in the training of NODE. Specifically, we pre-train a neural differential operator (NDO) to output estimates of the derivatives, which serve as an additional supervised signal. The NDO is pre-trained on a class of basis functions and learns the mapping from trajectory samples of these functions to their derivatives. To leverage both the trajectory signal and the estimated derivatives from the NDO, we propose an algorithm called NDO-NODE, whose loss function contains two terms: the fitness on the true trajectory samples and the fitness on the derivatives estimated by the pre-trained NDO. Experiments on various kinds of dynamics show that the proposed NDO-NODE consistently improves forecasting accuracy with a single pre-trained NDO. In particular, for stiff ODEs, we observe that NDO-NODE captures transitions in the dynamics more accurately than other regularization methods.
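
The following is a minimal sketch, in PyTorch, of the two-term NDO-NODE training loss described in the abstract: one term fits the solver-integrated trajectory to the observed samples, the other fits the learned vector field to derivative estimates produced by a frozen, pre-trained NDO. The `ndo` interface, the MLP vector field, and the weighting parameter `lambda_ndo` are illustrative assumptions, not the paper's exact implementation.

```python
import torch
import torch.nn as nn
from torchdiffeq import odeint  # standard NODE solver interface


class ODEFunc(nn.Module):
    """Parametric vector field f_theta(t, y) learned by the NODE."""
    def __init__(self, dim: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, hidden), nn.Tanh(), nn.Linear(hidden, dim)
        )

    def forward(self, t, y):
        return self.net(y)


def ndo_node_loss(func, y0, t, y_obs, ndo, lambda_ndo=0.1):
    """Two-term NDO-NODE loss (sketch).

    func   : ODEFunc, the NODE vector field
    y0     : (dim,) initial state
    t      : (T,) observation times
    y_obs  : (T, dim) observed trajectory samples
    ndo    : pre-trained neural differential operator; assumed here to map
             (t, y_obs) to (T, dim) derivative estimates
    lambda_ndo : hypothetical weight balancing the two terms
    """
    # Term 1: fitness on the true trajectory samples, via the numerical solver.
    y_pred = odeint(func, y0, t)                      # (T, dim)
    traj_loss = ((y_pred - y_obs) ** 2).mean()

    # Term 2: fitness on the NDO-estimated derivatives, evaluated directly
    # at the observed states and independent of the solver's precision.
    with torch.no_grad():                             # NDO stays frozen
        dy_est = ndo(t, y_obs)                        # (T, dim)
    dy_pred = func(t, y_obs)
    deriv_loss = ((dy_pred - dy_est) ** 2).mean()

    return traj_loss + lambda_ndo * deriv_loss
```

Because the NDO is pre-trained and kept fixed, the derivative term acts as an extra supervised signal that does not depend on the solver, which is the mechanism the abstract credits for reducing the training's reliance on solver precision.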

Citations (2)
