Neural Propagation of Beliefs
Abstract
We continue to explore the hypothesis that neuronal populations represent and process analog variables in terms of probability density functions (PDFs). A neural assembly encoding the joint probability density over relevant analog variables can in principle answer any meaningful question about these variables by implementing the Bayesian rules of inference. Aided by an intermediate representation of the probability density based on orthogonal functions spanning an underlying low-dimensional function space, we show how neural circuits may be generated from Bayesian belief networks. The ideas and the formalism of this PDF approach are illustrated and tested with several elementary examples, and in particular through a problem in which model-driven top-down information flow influences the processing of bottom-up sensory input.
Publication: arXiv e-prints
Pub Date: February 2001
arXiv: arXiv:cond-mat/0102426
Bibcode: 2001cond.mat..2426B
Keywords: Condensed Matter - Disordered Systems and Neural Networks; Quantitative Biology
E-Print: 22 pages, 5 figures, submitted to Neural Computation, v2: significant structural changes