Using Machine Learning to Augment Coarse-Grid Computational Fluid Dynamics Simulations
Abstract
Simulation of turbulent flows at high Reynolds number is a computationally challenging task relevant to a large number of engineering and scientific applications in diverse fields such as climate science, aerodynamics, and combustion. Turbulent flows are typically modeled by the Navier-Stokes equations. Direct Numerical Simulation (DNS) of the Navier-Stokes equations with sufficient numerical resolution to capture all the relevant scales of the turbulent motions can be prohibitively expensive. Simulation at lower resolution on a coarse grid introduces significant errors. We introduce a machine learning (ML) technique based on a deep neural network architecture that corrects the numerical errors induced by a coarse-grid simulation of turbulent flows at high Reynolds numbers, while simultaneously recovering an estimate of the high-resolution fields. Our proposed simulation strategy is a hybrid ML-PDE solver that is capable of obtaining a meaningful high-resolution solution trajectory while solving the system PDE at a lower resolution. The approach has the potential to dramatically reduce the expense of turbulent flow simulations. As a proof-of-concept, we demonstrate our ML-PDE strategy on a two-dimensional turbulent (Rayleigh number $Ra=10^9$) Rayleigh-Bénard convection (RBC) problem.
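The hybrid ML-PDE loop described in the abstract — advance the PDE on a coarse grid, then apply a learned model that corrects the coarse solution and emits a high-resolution estimate — can be sketched as follows. This is an illustrative skeleton only: the function names (`coarse_pde_step`, `ml_correct_and_upsample`), the toy diffusion update, and the nearest-neighbour upsampling are placeholders for the paper's actual solver and trained network, which are not specified here.

```python
import numpy as np

def coarse_pde_step(u, dt=0.1):
    # Stand-in for one coarse-grid PDE update (here: a periodic
    # diffusion step; the paper's solver would be the RBC equations).
    lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
           np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4 * u)
    return u + dt * lap

def ml_correct_and_upsample(u, factor=4):
    # Placeholder for the learned corrector: a trained network would
    # (a) correct the coarse-grid numerical error and (b) estimate the
    # high-resolution field. Here we simply nearest-neighbour upsample.
    return np.kron(u, np.ones((factor, factor)))

def hybrid_rollout(u0, n_steps, factor=4):
    """Advance the coarse state and emit a high-resolution estimate at
    every step, mimicking the hybrid ML-PDE solution trajectory."""
    u = u0
    highres_trajectory = []
    for _ in range(n_steps):
        u = coarse_pde_step(u)
        highres_trajectory.append(ml_correct_and_upsample(u, factor))
    return u, highres_trajectory

coarse = np.random.default_rng(0).standard_normal((16, 16))
u_final, traj = hybrid_rollout(coarse, n_steps=5, factor=4)
print(u_final.shape, traj[0].shape)  # (16, 16) (64, 64)
```

The key point of the design is that the expensive step (high-resolution DNS) never runs inside the loop: only the cheap coarse update and a network inference are executed per step.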
 Publication:

arXiv e-prints
 Pub Date:
 September 2020
 arXiv:
 arXiv:2010.00072
 Bibcode:
 2020arXiv201000072P
 Keywords:

 Physics - Computational Physics;
 Computer Science - Machine Learning;
 Physics - Geophysics
 E-Print:
 Corrected typographical errors in the previous version related to the incorrectly formatted accented character "\'e" appearing in various places in the manuscript