Neural computation from first principles: Using the maximum entropy method to obtain an optimal bits-per-joule neuron
Abstract
Optimization results are one means of understanding neural computation from Nature's perspective and of defining the physical limits on neuron-like engineering. Earlier work optimizes individual properties or performance criteria, occasionally a combination of two, such as energy and information. Here we use Jaynes' maximum entropy method to combine a larger set of constraints, possibly dimensionally distinct, each expressible as an expectation. The method identifies a likelihood function and a sufficient statistic arising from each such optimization; this likelihood is a first-hitting-time distribution in the exponential class. Particular constraint sets are identified that, from an optimal-inference perspective, justify earlier neurocomputational models. Interactions between constraints, mediated through the inferred likelihood, restrict the parameterization of a constraint set; e.g., the energy budget limits estimation performance, which in turn matches an axonal communication constraint. Such linkages are, for biologists, experimental predictions of the method. In addition to the related likelihood, at least one type of constraint set implies marginal distributions, and in this case a Shannon bits-per-joule statement arises.
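The derivation the abstract compresses is standard enough to sketch. Below is a minimal LaTeX rendering of the constrained maximum entropy problem, using generic constraint functions g_k and budget values c_k that are our illustrative notation, not the paper's: maximizing the differential entropy of a first-hitting-time density p(t) subject to K expectation constraints yields, by Lagrangian stationarity, a density in the exponential class in which each g_k serves as a sufficient statistic.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Illustrative constrained maximum-entropy problem (our notation, not the
% paper's): maximize the differential entropy of a first-hitting-time
% density p(t) subject to K expectation constraints and normalization.
\begin{align}
  \max_{p}\quad & -\int_0^{\infty} p(t)\,\ln p(t)\,dt \\
  \text{s.t.}\quad & \int_0^{\infty} g_k(t)\,p(t)\,dt = c_k,
    \quad k = 1,\dots,K, \\
  & \int_0^{\infty} p(t)\,dt = 1.
\end{align}
% Stationarity of the Lagrangian gives the exponential-class form, with
% each g_k a sufficient statistic and lambda_k its Lagrange multiplier:
\begin{equation}
  p^{*}(t) = \frac{1}{Z(\lambda)}
    \exp\Bigl(-\textstyle\sum_{k=1}^{K} \lambda_k\, g_k(t)\Bigr),
  \qquad
  Z(\lambda) = \int_0^{\infty}
    \exp\Bigl(-\textstyle\sum_{k} \lambda_k\, g_k(t)\Bigr)\,dt .
\end{equation}
\end{document}
```

With a single constraint on the mean first-hitting time, for instance, this recipe returns the ordinary exponential density; the abstract's point is that several dimensionally distinct constraints (e.g., energy, estimation, communication) can be combined in the same way, with the multipliers lambda_k mediating the interactions among them.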
- Publication:
- arXiv e-prints
- Pub Date:
- June 2016
- DOI:
- 10.48550/arXiv.1606.03063
- arXiv:
- arXiv:1606.03063
- Bibcode:
- 2016arXiv160603063L
- Keywords:
- Quantitative Biology - Neurons and Cognition;
- Statistics - Machine Learning
- E-Print:
- IEEE Trans. Molecular, Biological, and Multi-Scale Communications, vol. 2, Dec. 2016, pp. 154-165