Incorporating long-range physics in atomic-scale machine learning
Abstract
The most successful and popular machine learning models of atomic-scale properties derive their transferability from a locality ansatz. The properties of a large molecule or a bulk material are written as a sum over contributions that depend on the configurations within finite atom-centered environments. The obvious downside of this approach is that it cannot capture nonlocal, nonadditive effects such as those arising due to long-range electrostatics or quantum interference. We propose a solution to this problem by introducing nonlocal representations of the system, which are remapped as feature vectors that are defined locally and are equivariant in O(3). We consider, in particular, one form that has the same asymptotic behavior as the electrostatic potential. We demonstrate that this framework can capture nonlocal, long-range physics by building a model for the electrostatic energy of randomly distributed point charges, for the unrelaxed binding curves of charged organic molecular dimers, and for the electronic dielectric response of liquid water. By combining a representation of the system that is sensitive to long-range correlations with the transferability of an atom-centered additive model, this method outperforms current state-of-the-art machine-learning schemes and provides a conceptual framework to incorporate nonlocal physics into atomistic machine learning.
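As a rough illustration of the idea of an atom-centered feature with electrostatic (1/r) asymptotics, the toy sketch below computes, for each atom in a set of point charges, the Coulomb-like potential generated by all other atoms. This is only a minimal scalar analogue of the kind of long-range descriptor the abstract describes, not the paper's actual equivariant construction; the function name and example values are illustrative assumptions.

```python
import numpy as np

def longrange_features(positions, charges):
    """Toy atom-centered descriptor with 1/r asymptotics.

    For each atom i, accumulate the potential generated by all other
    atoms: V_i = sum_{j != i} q_j / |r_i - r_j|.  A local, additive
    model built on such features can reproduce long-range electrostatic
    energies, in the spirit of (but much simpler than) the paper's method.
    """
    positions = np.asarray(positions, dtype=float)
    charges = np.asarray(charges, dtype=float)
    # Pairwise displacement vectors and distances.
    diffs = positions[:, None, :] - positions[None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)
    np.fill_diagonal(dists, np.inf)  # exclude the self-interaction
    return (charges[None, :] / dists).sum(axis=1)

# Example: two opposite unit point charges separated by 2 units.
pos = [[0.0, 0.0, 0.0], [2.0, 0.0, 0.0]]
q = [1.0, -1.0]
V = longrange_features(pos, q)
print(V)  # -> [-0.5  0.5]
# The total electrostatic energy follows as 0.5 * sum_i q_i * V_i.
print(0.5 * np.dot(q, V))  # -> -0.5
```

The key point the example makes concrete is that the feature is defined per atom (so the model stays additive and transferable) while its value depends on the positions of arbitrarily distant atoms.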
Publication: Journal of Chemical Physics
Pub Date: November 2019
DOI: 10.1063/1.5128375
arXiv: arXiv:1909.04512
Bibcode: 2019JChPh.151t4105G
Keywords: Physics - Chemical Physics
 EPrint:
 doi:10.1063/1.5128375