Is the neural tangent kernel of PINNs for deep learning of general partial differential equations always convergent?
Abstract
In this paper, we study the neural tangent kernel (NTK) of physics-informed neural networks (PINNs) for general partial differential equations (PDEs). It is well known that the training of an artificial neural network can be recast as the evolution of its NTK. We analyze the NTK at initialization and derive conditions under which the NTK converges during training for general PDEs. The theoretical results show that the homogeneity of the differential operators plays a crucial role in the convergence of the NTK. Moreover, we validate these convergence conditions with PINNs on initial value problems for the sine-Gordon equation and an initial-boundary value problem for the KdV equation.
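To make the abstract's central object concrete: for a network f(x; θ), the empirical NTK is the Gram matrix of parameter gradients, K(x, x') = ∇_θ f(x) · ∇_θ f(x'). The sketch below computes it for a tiny one-hidden-layer network with analytic gradients; the network, its 1/√m output scaling, and all variable names are illustrative assumptions, not the paper's setup.

```python
import numpy as np

# Illustrative sketch: empirical NTK of f(x) = a^T tanh(w*x + b) for scalar
# input x. The architecture and scaling are assumptions for demonstration only.

rng = np.random.default_rng(0)
m = 50                                # hidden width (assumed)
w = rng.normal(size=m)                # input weights
b = rng.normal(size=m)                # biases
a = rng.normal(size=m) / np.sqrt(m)   # output weights, NTK-style scaling

def grad_theta(x):
    """Gradient of f(x) with respect to all parameters (w, b, a)."""
    h = np.tanh(w * x + b)
    dh = 1.0 - h ** 2        # derivative of tanh
    gw = a * dh * x          # df/dw
    gb = a * dh              # df/db
    ga = h                   # df/da
    return np.concatenate([gw, gb, ga])

xs = np.linspace(-1.0, 1.0, 5)
G = np.stack([grad_theta(x) for x in xs])  # shape (n_points, n_params)
K = G @ G.T                                # empirical NTK Gram matrix

# By construction K is symmetric positive semidefinite.
print(K.shape)
print(np.allclose(K, K.T))
```

For a PINN, the same construction is applied to the residual of the PDE operator rather than to f itself, which is where the homogeneity of the differential operators enters the convergence analysis.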
- Publication:
- arXiv e-prints
- Pub Date:
- December 2024
- arXiv:
- arXiv:2412.06158
- Bibcode:
- 2024arXiv241206158Z
- Keywords:
- Statistics - Machine Learning;
- Computer Science - Machine Learning;
- Mathematical Physics;
- Nonlinear Sciences - Pattern Formation and Solitons;
- Physics - Computational Physics
- E-Print:
- 18 pages, 5 figures