Finite-size analysis in neural network classification of critical phenomena
Abstract
We analyze the problem of supervised learning of ferromagnetic phase transitions from the statistical physics perspective. We consider two systems in two universality classes, the two-dimensional Ising model and the two-dimensional Baxter-Wu model, and perform a careful finite-size analysis of the results of supervised learning of the phases of each model. We find that the variance of the neural network (NN) output function (VOF), taken as a function of temperature, has a peak in the critical region. Qualitatively, the VOF is related to the classification rate of the NN. We find that the width of the VOF peak displays finite-size scaling governed by the correlation-length exponent ν of the universality class of the model. We check this conclusion using several NN architectures—a fully connected NN, a convolutional NN, and several members of the ResNet family—and discuss the accuracy of the extracted critical exponents ν.
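The scaling analysis described above can be illustrated with a minimal sketch: measure the width of the VOF peak at several system sizes L and extract ν from the power law width ∝ L^(−1/ν). The snippet below uses synthetic Gaussian stand-ins for the NN outputs (not the paper's actual data), with the 2D Ising value ν = 1 built in; the helper name `vof_peak_width` and all numerical choices are illustrative assumptions.

```python
import numpy as np

def vof_peak_width(temps, outputs, half_max_frac=0.5):
    """Full width at half maximum of the variance-of-output (VOF) curve."""
    var = outputs.var(axis=1)            # variance over samples at each temperature
    above = temps[var >= half_max_frac * var.max()]
    return above.max() - above.min()

# Synthetic stand-in: NN outputs whose variance peaks at Tc with a width ~ L^(-1/nu)
nu_true = 1.0                            # 2D Ising correlation-length exponent
Tc = 2.269                               # 2D Ising critical temperature (J/k_B)
rng = np.random.default_rng(0)
temps = np.linspace(1.5, 3.0, 3001)
sizes = [8, 16, 32, 64]
widths = []
for L in sizes:
    w = L ** (-1.0 / nu_true)            # intrinsic peak width at this size
    sigma = np.sqrt(0.25 * np.exp(-((temps - Tc) / w) ** 2))
    outputs = rng.normal(0.5, sigma[:, None], size=(temps.size, 2000))
    widths.append(vof_peak_width(temps, outputs))

# Finite-size scaling fit: log(width) = -(1/nu) * log(L) + const
slope, intercept = np.polyfit(np.log(sizes), np.log(widths), 1)
nu_est = -1.0 / slope
print("estimated nu:", nu_est)
```

In practice the outputs would come from the trained classifier evaluated on Monte Carlo configurations at each temperature; the log-log fit then recovers ν of the model's universality class.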
- Publication: Physical Review E
- Pub Date: September 2023
- DOI: 10.1103/PhysRevE.108.L032102
- arXiv: arXiv:2305.03342
- Bibcode: 2023PhRvE.108c2102C
- Keywords: Condensed Matter - Statistical Mechanics; Condensed Matter - Disordered Systems and Neural Networks
- E-Print: 5 pages, 2 figures + Supplementary materials