
Local Randomized Neural Networks with Discontinuous Galerkin Methods for Partial Differential Equations (2206.05577v1)

Published 11 Jun 2022 in math.NA and cs.NA

Abstract: Randomized neural networks (RNN) are a variation of neural networks in which the hidden-layer parameters are fixed to randomly assigned values and the output-layer parameters are obtained by solving a linear system by least squares. This improves efficiency without degrading the accuracy of the neural network. In this paper, we combine the idea of the local RNN (LRNN) and the discontinuous Galerkin (DG) approach for solving partial differential equations. RNNs are used to approximate the solution on the subdomains, and the DG formulation is used to glue them together. Taking the Poisson problem as a model, we propose three numerical schemes and provide convergence analyses. We then extend these ideas to time-dependent problems: taking the heat equation as a model, three space-time LRNN-DG formulations are proposed. Finally, we present numerical tests to demonstrate the performance of the methods developed herein, comparing the proposed methods with the finite element method and the usual DG method. The LRNN-DG methods can achieve better accuracy under the same degrees of freedom, signifying that this new approach has great potential for solving partial differential equations.
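The core RNN mechanism described in the abstract — a hidden layer frozen at random values, with only the output layer solved by linear least squares — can be sketched in a few lines. The following is a minimal illustration (not the paper's LRNN-DG method) that fits random tanh features to samples of a smooth function; the weight ranges, feature count, and target function are arbitrary choices for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hidden-layer parameters: fixed to random values, never trained.
n_hidden = 100
W = rng.uniform(-3.0, 3.0, size=n_hidden)  # input weights (scalar input)
b = rng.uniform(-3.0, 3.0, size=n_hidden)  # biases

def features(x):
    """Random-feature map tanh(W x + b) for a 1-D array of inputs x."""
    return np.tanh(np.outer(x, W) + b)

# Target data: samples of u(x) = sin(pi x) on [0, 1]
# (a stand-in for the PDE solution values the paper approximates).
x = np.linspace(0.0, 1.0, 200)
u = np.sin(np.pi * x)

# Output-layer parameters: a single linear least-squares solve.
A = features(x)
c, *_ = np.linalg.lstsq(A, u, rcond=None)

# Evaluate the trained RNN on a finer grid and check the error.
x_test = np.linspace(0.0, 1.0, 1000)
err = np.max(np.abs(features(x_test) @ c - np.sin(np.pi * x_test)))
print(f"max abs error: {err:.2e}")
```

Because only the output weights `c` are unknowns, "training" reduces to one linear solve, which is the efficiency gain the abstract refers to; the LRNN-DG schemes apply this idea per subdomain and couple the pieces through the DG formulation.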

Citations (19)
