Slow synaptic dynamics in a network: From exponential to power-law forgetting
Abstract
We investigate a mean-field model of interacting synapses on a directed neural network. Our interest lies in the slow adaptive dynamics of synapses, which are driven by the fast dynamics of the neurons they connect. Cooperation is modeled from the usual Hebbian perspective, while competition is modeled by an original polarity-driven rule. The emergence of a critical manifold culminating in a tricritical point is crucially dependent on the presence of synaptic competition. This leads to a universal 1/t power-law relaxation of the mean synaptic strength along the critical manifold and an equally universal 1/√t relaxation at the tricritical point, to be contrasted with the exponential relaxation that is otherwise generic. In turn, this leads to the natural emergence of long- and short-term memory from different parts of parameter space in a synaptic network, which is the most original and important result of our present investigations.
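The contrast the abstract draws between generic exponential forgetting and the universal power-law decays can be illustrated with a minimal numerical sketch. The functional forms below follow the abstract (exp(-t/τ) off-criticality, 1/t on the critical manifold, 1/√t at the tricritical point); the time constant τ and the unit prefactors are illustrative assumptions, not parameters from the paper.

```python
import math

def exponential_relaxation(t, tau=10.0):
    """Generic (off-critical) relaxation of the mean synaptic strength.

    tau is an illustrative time constant, not a value from the paper.
    """
    return math.exp(-t / tau)

def critical_relaxation(t):
    """Universal 1/t decay along the critical manifold (t > 0)."""
    return 1.0 / t

def tricritical_relaxation(t):
    """Universal 1/sqrt(t) decay at the tricritical point (t > 0)."""
    return 1.0 / math.sqrt(t)

# At long times the power laws dominate the exponential: memory traces
# on the critical manifold persist far longer than off-critical ones.
for t in (10, 100, 1000):
    print(f"t={t:5d}  exp={exponential_relaxation(t):.3e}  "
          f"1/t={critical_relaxation(t):.3e}  "
          f"1/sqrt(t)={tricritical_relaxation(t):.3e}")
```

At t = 1000 (with τ = 10) the exponential trace is of order 10⁻⁴⁴ while the critical and tricritical traces are still 10⁻³ and ~3 × 10⁻², which is the sense in which different regions of parameter space realize short- and long-term memory.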
 Publication:

Physical Review E
 Pub Date:
 September 2014
 DOI:
 10.1103/PhysRevE.90.032709
 arXiv:
 arXiv:1409.4185
 Bibcode:
 2014PhRvE..90c2709L
 Keywords:

 87.19.lv;
 87.18.Sn;
 87.10.Mn;
 05.40.-a;
 Learning and memory;
 Neural networks;
 Stochastic modeling;
 Fluctuation phenomena, random processes, noise, and Brownian motion;
 Condensed Matter - Disordered Systems and Neural Networks;
 Quantitative Biology - Neurons and Cognition
 EPrint:
 12 pages, 8 figures. Phys. Rev. E (2014) to appear