Neural Network Optimization Under Partial Differential Equation Constraints
Abstract
Enforcing physical constraints on solutions generated by neural networks (NNs) remains a challenge, yet it is essential to their accuracy and trustworthiness. We propose a novel spectral projection layer that efficiently enforces spatial PDE constraints using spectral methods while remaining fully differentiable, enabling end-to-end training. We train a 3D Conditional Generative Adversarial Network for turbulence super-resolution whilst guaranteeing the spatial constraint of zero divergence. Results show that the model produces realistic flow fields with more accurate flow statistics when trained with hard constraints, compared to soft-constrained and unconstrained baselines. We also present a method for applying multiple PDE constraints by modifying the loss function directly. We provide theoretical guarantees of convergence and evaluate the computational complexity of the method, and we offer an approximation that trades convergence guarantees for improved speed. Experimentally, we train constrained NNs to learn continuous representations of solutions to the linear Helmholtz equation and the nonlinear steady-state Navier-Stokes equation. We show that the model outputs better respect the underlying physics, but note that the computational complexity restricts the method's application to small NNs.
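As a rough illustration of the kind of layer described above, the sketch below shows a differentiable divergence-free projection in Fourier space for a periodic 3D velocity field. This is a minimal assumption-laden example, not the authors' implementation: the tensor layout (batch, 3 components, N³ grid), the use of PyTorch, and the function name `divergence_free_projection` are all hypothetical. Because the FFT operations are differentiable, gradients can flow through the projection when it is placed at the output of a generator network.

```python
# Minimal sketch (not the paper's code) of a differentiable divergence-free
# projection in Fourier space, assuming a periodic velocity field stored as a
# tensor of shape (batch, 3, N, N, N).
import torch


def divergence_free_projection(u: torch.Tensor) -> torch.Tensor:
    """Project a periodic velocity field onto its solenoidal (zero-divergence) part."""
    _, _, nx, ny, nz = u.shape
    u_hat = torch.fft.fftn(u, dim=(-3, -2, -1))  # 3D FFT of each velocity component

    # Integer wavenumber grids for a periodic domain (any constant 2*pi factor
    # cancels inside the projection, so it is omitted here).
    kx = torch.fft.fftfreq(nx, d=1.0 / nx, device=u.device)
    ky = torch.fft.fftfreq(ny, d=1.0 / ny, device=u.device)
    kz = torch.fft.fftfreq(nz, d=1.0 / nz, device=u.device)
    KX, KY, KZ = torch.meshgrid(kx, ky, kz, indexing="ij")
    k = torch.stack((KX, KY, KZ))            # shape (3, N, N, N)
    k2 = (k * k).sum(dim=0)
    k2[0, 0, 0] = 1.0                        # avoid division by zero at the mean mode

    # Helmholtz projection: remove the component of u_hat parallel to k,
    # which is exactly the part carrying nonzero divergence.
    k_dot_u = (k.unsqueeze(0) * u_hat).sum(dim=1, keepdim=True)
    u_hat_sol = u_hat - k.unsqueeze(0) * k_dot_u / k2

    return torch.fft.ifftn(u_hat_sol, dim=(-3, -2, -1)).real
```

Under these assumptions, the returned field satisfies the zero-divergence constraint up to spectral accuracy, and the layer can be appended to a network output without introducing trainable parameters.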
- Publication:
- APS Division of Fluid Dynamics Meeting Abstracts
- Pub Date:
- November 2019
- Bibcode:
- 2019APS..DFDC17008K