Block sparsity and gauge mediated weight sharing for learning dynamical laws from data
Abstract
Recent years have witnessed an increased interest in recovering dynamical laws of complex systems in a largely data-driven fashion under meaningful hypotheses. In this work, we propose a method for scalably learning dynamical laws of classical dynamical systems from data. As a novel ingredient, to achieve an efficient scaling with the system size, block-sparse tensor trains - instances of tensor networks applied to function dictionaries - are used, and the self-similarity of the problem is exploited. For the latter, we propose an approach of gauge-mediated weight sharing, inspired by notions of machine learning, which significantly improves performance over previous approaches. The practical performance of the method is demonstrated numerically on three one-dimensional systems - the Fermi-Pasta-Ulam-Tsingou system, rotating magnetic dipoles, and classical particles interacting via modified Lennard-Jones potentials. We highlight the ability of the method to recover these systems, requiring 1400 samples to recover the 50-particle Fermi-Pasta-Ulam-Tsingou system to a residuum of $5\times 10^{-7}$, 900 samples to recover the 50-particle magnetic dipole chain to a residuum of $1.5\times 10^{-4}$, and 7000 samples to recover the Lennard-Jones system of 10 particles to a residuum of $1.5\times 10^{-2}$. The robustness against additive Gaussian noise is demonstrated for the magnetic dipole system.
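To illustrate the basic building block the abstract refers to - a tensor train (TT) contracted with a per-coordinate function dictionary - the following is a minimal sketch. The dictionary choice, TT ranks, and all function names here are illustrative assumptions, not the paper's actual implementation (which additionally imposes block sparsity and gauge-mediated weight sharing on the cores).

```python
import numpy as np

def dictionary(x):
    """Assumed per-coordinate feature dictionary: [1, x, x^2]."""
    return np.array([1.0, x, x**2])

def tt_evaluate(cores, sample):
    """Contract TT cores G_k of shape (r_{k-1}, d, r_k) with the
    dictionary features of each coordinate of `sample`, sweeping a
    rank vector from left to right."""
    v = np.ones(1)  # left boundary rank r_0 = 1
    for core, x in zip(cores, sample):
        phi = dictionary(x)                         # shape (d,)
        v = v @ np.einsum('idj,d->ij', core, phi)   # carry rank vector forward
    return v.item()  # right boundary rank r_K = 1 gives a scalar

# Hypothetical example: a random rank-(1, 2, 2, 1) TT over 3 coordinates,
# dictionary size d = 3, evaluated on one sample.
rng = np.random.default_rng(0)
ranks = [1, 2, 2, 1]
cores = [rng.standard_normal((ranks[k], 3, ranks[k + 1])) for k in range(3)]
value = tt_evaluate(cores, np.array([0.1, -0.4, 0.7]))
```

The point of the TT format is that the number of parameters grows linearly in the number of coordinates (one core per coordinate) rather than exponentially, which is what makes the scaling with system size tractable.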
- Publication: arXiv e-prints
- Pub Date: August 2022
- arXiv: arXiv:2208.01591
- Bibcode: 2022arXiv220801591G
- Keywords: Mathematics - Dynamical Systems; Physics - Computational Physics; Quantum Physics
- E-Print: 13 pages, 6 figures