Finite-difference least square methods for solving Hamilton-Jacobi equations using neural networks (2406.10758v4)

Published 15 Jun 2024 in math.NA, cs.NA, math.AP, and math.OC

Abstract: We present a simple algorithm to approximate the viscosity solution of Hamilton-Jacobi (HJ) equations by means of an artificial deep neural network. The algorithm uses a stochastic gradient descent-based method to minimize the least square principle defined by a monotone, consistent numerical scheme. We analyze the least square principle's critical points and derive conditions that guarantee that any critical point approximates the sought viscosity solution. The use of a deep artificial neural network on a finite difference scheme lifts the restriction of conventional finite difference methods that rely on computing functions on a fixed grid. This feature makes it possible to solve HJ equations posed in higher dimensions where conventional methods are infeasible. We demonstrate the efficacy of our algorithm through numerical studies on various canonical HJ equations across different dimensions, showcasing its potential and versatility.
