
Deep neural network for solving differential equations motivated by Legendre-Galerkin approximation

Published 24 Oct 2020 in math.NA, cs.LG, and cs.NA (arXiv:2010.12975v1)

Abstract: Nonlinear differential equations are challenging to solve numerically and are important for understanding the dynamics of many physical systems. Deep neural networks have been applied to help alleviate the computational cost associated with solving these systems. We explore the performance and accuracy of various neural architectures on both linear and nonlinear differential equations by creating accurate training sets with the spectral element method. Next, we implement a novel Legendre-Galerkin Deep Neural Network (LGNet) algorithm to predict solutions to differential equations. By constructing a linear combination of the Legendre basis, we predict the corresponding coefficients $\alpha_i$, which approximate the solution as a sum of smooth basis functions, $u \simeq \sum_{i=0}^{N} \alpha_i \varphi_i$. As computational examples, linear and nonlinear models with Dirichlet or Neumann boundary conditions are considered.
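
The central idea is that the network predicts the expansion coefficients $\alpha_i$ rather than pointwise solution values, and the approximate solution is then reconstructed from a smooth Legendre-type basis. The sketch below illustrates only that reconstruction step, assuming the compact combination $\varphi_i = L_i - L_{i+2}$ that is standard in Legendre-Galerkin methods for homogeneous Dirichlet conditions; the paper's exact basis, the network that produces the coefficients, and the coefficient values shown here are assumptions for illustration.

```python
import numpy as np
from numpy.polynomial import legendre as leg

def dirichlet_basis(i, x):
    # phi_i(x) = L_i(x) - L_{i+2}(x): a common Legendre-Galerkin basis
    # choice that vanishes at x = -1 and x = 1 (homogeneous Dirichlet).
    ci = np.zeros(i + 3); ci[i] = 1.0
    cj = np.zeros(i + 3); cj[i + 2] = 1.0
    return leg.legval(x, ci) - leg.legval(x, cj)

def reconstruct(alpha, x):
    # u(x) ~= sum_i alpha_i * phi_i(x), with alpha_i supplied by the network.
    return sum(a * dirichlet_basis(i, x) for i, a in enumerate(alpha))

# Hypothetical coefficients standing in for a network's output.
x = np.linspace(-1.0, 1.0, 201)
alpha = np.array([1.0, 0.5, -0.25])
u = reconstruct(alpha, x)
```

Because each $\varphi_i$ already satisfies the boundary conditions, any predicted coefficient vector yields an approximation that respects them by construction, which is the usual motivation for expanding in such a basis.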

Citations (6)
