Learning rate and attractor size of the single-layer perceptron
Abstract
We study the simplest possible order-one single-layer perceptron with two inputs, using the delta rule with online learning, in order to derive closed-form expressions for the mean convergence rates. We investigate the rate of convergence in weight space of the weight vectors corresponding to each of the 14 linearly separable rules among the 16 possible Boolean functions of two inputs. These vectors follow zigzagging paths through the piecewise constant vector field to their respective attractors. Based on our studies, we conclude that a single-layer perceptron with N inputs will converge in an average number of steps given by an Nth-order polynomial in t/l, where t is the threshold and l is the size of the initial weight distribution. Exact values for these averages are provided for the five linearly separable classes with N = 2. We also demonstrate that the learning rate is determined by the attractor size, and that the attractors of a single-layer perceptron with N inputs partition ℝ^N ⊕ ℝ^N.
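To make the setup concrete, the following is a minimal sketch of online delta-rule training of a two-input perceptron, assuming a fixed threshold t, initial weights drawn uniformly from [-l, l], and the AND rule as the target (one of the 14 linearly separable rules). The learning rate eta and all numerical values are illustrative choices, not the paper's; the sketch only counts updates empirically rather than reproducing the paper's closed-form averages.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_and_count(t=1.0, l=0.1, eta=0.05, max_steps=100_000):
    """Train on the AND rule with the online delta rule; return the update count."""
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([0, 0, 0, 1])                 # AND: a linearly separable rule
    w = rng.uniform(-l, l, size=2)             # initial weights drawn from [-l, l]
    steps = 0
    while steps < max_steps:
        all_correct = True
        for x, target in zip(X, y):
            out = 1 if x @ w > t else 0        # hard-threshold unit, fixed threshold t
            if out != target:
                w += eta * (target - out) * x  # delta-rule update after each pattern
                all_correct = False
                steps += 1
        if all_correct:                        # weights have reached the attractor
            return steps
    return steps

# Mean number of updates over many random initializations; per the abstract,
# for N inputs this mean should scale as an Nth-order polynomial in t/l.
print(np.mean([train_and_count() for _ in range(1000)]))
```

Averaging over many initializations, as in the last line, is the empirical counterpart of the mean convergence times derived in closed form in the paper.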
- Publication: Physical Review E
- Pub Date: February 2007
- DOI:
- Bibcode: 2007PhRvE..75b6704S
- Keywords:
  - 07.05.Mh Neural networks, fuzzy logic, artificial intelligence
  - 05.45.-a Nonlinear dynamics and chaos
  - 84.35.+i Neural networks
  - 87.18.Sn Neural networks