Non-Euclidean Contractivity of Recurrent Neural Networks
Abstract
Critical questions in dynamical neuroscience and machine learning are related to the study of recurrent neural networks and their stability, robustness, entrainment, and computational efficiency. These properties can all be established through the development of a comprehensive contractivity theory for neural networks. This paper makes three sets of contributions. First, regarding $\ell_1/\ell_\infty$ logarithmic norms, we establish quasiconvexity with respect to positive diagonal weights, monotonicity results for principal submatrices, and closed-form worst-case expressions over certain matrix polytopes. Second, regarding non-smooth contraction theory, we show that the one-sided Lipschitz constant of a Lipschitz vector field equals the essential supremum of the logarithmic norm of its Jacobian. Third, we apply these general results to classes of recurrent neural circuits, including Hopfield, firing rate, Persidskii, Lur'e, and other models. For each model, we compute the optimal contraction rate and weighted non-Euclidean norm via a linear program or, in some special cases, via a Hurwitz condition on the Metzler majorant of the synaptic matrix. Our non-Euclidean analysis also establishes absolute and total contraction.
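To make the abstract's central quantities concrete, here is a minimal sketch of the weighted $\ell_\infty$ logarithmic norm $\mu_{\infty,\eta}(A) = \max_i \big(a_{ii} + \sum_{j \neq i} |a_{ij}|\,\eta_j/\eta_i\big)$ and the Metzler majorant $\lceil A \rceil$ (diagonal entries kept, off-diagonal entries replaced by their absolute values). This is an illustrative implementation, not code from the paper; the function names and the use of NumPy are assumptions.

```python
import numpy as np


def mu_inf(A, eta=None):
    """Weighted ell_infty logarithmic norm of a square matrix A.

    mu_{inf,eta}(A) = max_i ( a_ii + sum_{j != i} |a_ij| * eta_j / eta_i ),
    with eta a vector of positive diagonal weights (default: all ones).
    A negative value certifies contractivity of x' = A x in the
    corresponding weighted ell_infty norm.
    """
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    eta = np.ones(n) if eta is None else np.asarray(eta, dtype=float)
    # Scale off-diagonal magnitudes by eta_j / eta_i; keep diagonal as a_ii.
    M = np.abs(A) * (eta[None, :] / eta[:, None])
    np.fill_diagonal(M, np.diag(A))
    return float(np.max(M.sum(axis=1)))


def metzler_majorant(A):
    """Metzler majorant: a_ii on the diagonal, |a_ij| off the diagonal."""
    A = np.asarray(A, dtype=float)
    M = np.abs(A)
    np.fill_diagonal(M, np.diag(A))
    return M


# Example: a 2x2 synaptic matrix.
A = np.array([[-2.0, 1.0],
              [0.5, -1.0]])
print(mu_inf(A))                 # unweighted log norm (negative => contracting)
print(mu_inf(A, eta=[1.0, 2.0]))  # weights can change the certified rate
print(metzler_majorant(A))        # here A is already Metzler, so unchanged
```

Note that $\mu_{\infty,\eta}(A) = \mu_{\infty,\eta}(\lceil A\rceil)$ by construction, which is why, as the abstract states, a Hurwitz condition on the Metzler majorant can replace the linear program in special cases: for a Hurwitz Metzler matrix a positive weight vector $\eta$ with negative weighted log norm always exists.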
 Publication:
 arXiv e-prints
 Pub Date:
 October 2021
 arXiv:
 arXiv:2110.08298
 Bibcode:
 2021arXiv211008298D
 Keywords:

 Mathematics - Optimization and Control;
 Electrical Engineering and Systems Science - Systems and Control