Critical questions in dynamical neuroscience and machine learning concern the stability, robustness, entrainment, and computational efficiency of recurrent neural networks. These properties can all be established through the development of a comprehensive contractivity theory for neural networks. This paper makes three sets of contributions. First, regarding $\ell_1/\ell_\infty$ logarithmic norms, we establish quasiconvexity with respect to positive diagonal weights, monotonicity results for principal submatrices, and closed-form worst-case expressions over certain matrix polytopes. Second, regarding nonsmooth contraction theory, we show that the one-sided Lipschitz constant of a Lipschitz vector field equals the essential supremum of the logarithmic norm of its Jacobian. Third, we apply these general results to classes of recurrent neural circuits, including Hopfield, firing rate, Persidskii, Lur'e, and other models. For each model, we compute the optimal contraction rate and weighted non-Euclidean norm via a linear program or, in some special cases, via a Hurwitz condition on the Metzler majorant of the synaptic matrix. Our non-Euclidean analysis also establishes absolute and total contraction.
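As a minimal numerical sketch of two of the quantities above, the snippet below computes the weighted $\ell_\infty$ logarithmic norm $\mu_{\infty,[\eta]}(A) = \max_i \bigl(a_{ii} + \sum_{j \neq i} (\eta_j/\eta_i)\,|a_{ij}|\bigr)$ and the Metzler majorant $\lceil A \rceil$ (diagonal entries kept, off-diagonal entries replaced by their absolute values), whose Hurwitz property certifies contractivity in the special cases mentioned. The function names `lognorm_inf` and `metzler_majorant` are illustrative, not from the paper.

```python
import numpy as np

def lognorm_inf(A, eta=None):
    """Weighted l-infinity log norm: illustrative helper, not the paper's code.
    mu_{inf,[eta]}(A) = max_i ( a_ii + sum_{j != i} (eta_j/eta_i)*|a_ij| )."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    eta = np.ones(n) if eta is None else np.asarray(eta, dtype=float)
    # Off-diagonal terms are scaled by the weight ratio and taken in absolute value.
    M = np.abs(A) * (eta[None, :] / eta[:, None])
    np.fill_diagonal(M, np.diag(A))  # diagonal entries enter with their sign
    return float(np.max(M.sum(axis=1)))

def metzler_majorant(A):
    """Metzler majorant: |a_ij| off the diagonal, a_ii on the diagonal."""
    M = np.abs(np.asarray(A, dtype=float))
    np.fill_diagonal(M, np.diag(A))
    return M

A = np.array([[-2.0, 1.0],
              [0.5, -3.0]])
print(lognorm_inf(A))                                   # -1.0: negative, so contracting in l-infinity
print(np.max(np.linalg.eigvals(metzler_majorant(A)).real) < 0)  # True: majorant is Hurwitz
```

A negative logarithmic norm of the Jacobian (here, of the constant matrix $A$ itself) gives an explicit contraction rate; the Hurwitz test on the majorant is the weight-free sufficient condition.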