Deep Adaptive Basis Galerkin Method for High-Dimensional Evolution Equations With Oscillatory Solutions
Abstract
In this paper, we study deep neural networks (DNNs) for solving high-dimensional evolution equations with oscillatory solutions. Different from deep least-squares methods that treat the time and space variables simultaneously, we propose a deep adaptive basis Galerkin (DABG) method, which employs the spectral-Galerkin method for the time variable of oscillatory solutions and the deep neural network method for the high-dimensional space variables. The proposed method leads to a linear system of differential equations with unknown DNNs that can be trained via the loss function. We establish a posteriori estimates of the solution error, which is bounded by the minimal loss function and the term $O(N^{-m})$, where $N$ is the number of basis functions and $m$ characterizes the regularity of the equation. We also show that if the true solution is a Barron-type function, the error bound converges to zero as $M=O(N^p)$ approaches infinity, where $M$ is the width of the networks used and $p$ is a positive constant. Numerical examples, including high-dimensional linear evolution equations and the nonlinear Allen-Cahn equation, are presented to demonstrate that the proposed DABG method outperforms existing DNN-based methods.
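The abstract describes an ansatz in which the solution is expanded in a temporal spectral basis with DNN coefficients in space, $u(t,x) \approx \sum_{n=0}^{N-1} \phi_n(t)\, v_n(x)$. The following is a minimal illustrative sketch of that structure, not the paper's implementation: the basis functions $\phi_n$ are taken to be Legendre polynomials, and each spatial function $v_n$ is a toy one-hidden-layer network with hypothetical sizes ($N$, $d$, $M$ below are placeholder values).

```python
import numpy as np

# Hypothetical sketch of a DABG-style ansatz: u(t, x) ~ sum_n phi_n(t) * v_n(x),
# with phi_n a spectral basis in time and v_n a neural network in space.
# All sizes and the choice of activation are illustrative, not from the paper.

rng = np.random.default_rng(0)
N = 4    # number of temporal basis functions
d = 10   # spatial dimension (the method targets high-dimensional x)
M = 32   # network width; the paper relates M = O(N^p) to the error bound

# One-hidden-layer "network" per basis function: v_n(x) = a_n . tanh(W_n x + b_n).
# In the actual method these parameters would be trained via the loss function.
W = rng.standard_normal((N, M, d))
b = rng.standard_normal((N, M))
a = rng.standard_normal((N, M)) / M

def phi(n, t):
    """Legendre polynomial P_n on [-1, 1] as the temporal spectral basis."""
    return np.polynomial.legendre.Legendre.basis(n)(t)

def u(t, x):
    """Evaluate the ansatz: temporal spectral basis times spatial networks."""
    return sum(phi(n, t) * (a[n] @ np.tanh(W[n] @ x + b[n])) for n in range(N))

print(float(u(0.5, rng.standard_normal(d))))
```

Training would then minimize a residual loss of the evolution equation over the coefficients of the networks $v_n$, which is the part the a posteriori estimate bounds together with the $O(N^{-m})$ spectral term.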
- Publication: SIAM Journal on Scientific Computing
- Pub Date: October 2022
- DOI: 10.1137/21M1468383
- arXiv: arXiv:2112.14418
- Bibcode: 2022SJSC...44A3130G
- Keywords: Mathematics - Numerical Analysis