Faster Algorithms for Max-Product Message-Passing
Abstract
Maximum A Posteriori inference in graphical models is often solved via message-passing algorithms, such as the junction-tree algorithm or loopy belief-propagation. The exact solution to this problem is well known to be exponential in the size of the model's maximal cliques after it is triangulated, while approximate inference is typically exponential in the size of the model's factors. In this paper, we take advantage of the fact that many models have maximal cliques that are larger than their constituent factors, and also of the fact that many factors consist entirely of latent variables (i.e., they do not depend on an observation). This is a common case in a wide variety of applications, including grids, trees, and ring-structured models. In such cases, we are able to decrease the exponent of complexity for message-passing by 0.5 for both exact and approximate inference.
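For context, the following is a minimal sketch of standard max-product message passing on a chain-structured model, the kind of baseline whose per-message cost the abstract says can be reduced. This is not the paper's algorithm; the function name, potentials, and chain structure are illustrative assumptions.

```python
import numpy as np

def max_product_chain(unary, pairwise):
    """MAP assignment for a chain MRF via max-product (illustrative baseline).

    unary:    (T, K) array of log unary potentials
    pairwise: (K, K) array of log pairwise potentials, shared across edges
    Returns the maximizing state sequence as a list of length T.
    """
    T, K = unary.shape
    msg = np.zeros((T, K))              # forward max-messages
    back = np.zeros((T, K), dtype=int)  # argmax backpointers
    msg[0] = unary[0]
    for t in range(1, T):
        # Each message maximizes over the previous state: O(K^2) per edge,
        # i.e. exponential in the factor size for general models.
        scores = msg[t - 1][:, None] + pairwise   # (K, K)
        back[t] = np.argmax(scores, axis=0)
        msg[t] = unary[t] + np.max(scores, axis=0)
    # Backtrack from the best final state to recover the MAP sequence.
    states = np.empty(T, dtype=int)
    states[-1] = int(np.argmax(msg[-1]))
    for t in range(T - 1, 0, -1):
        states[t - 1] = back[t, states[t]]
    return states.tolist()
```

On a chain this is just the Viterbi recurrence; the abstract's claim concerns shaving 0.5 off the exponent of this kind of message computation when maximal cliques exceed their constituent factors.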
 Publication:

arXiv e-prints
 Pub Date:
 October 2009
 arXiv:
 arXiv:0910.3301
 Bibcode:
 2009arXiv0910.3301M
 Keywords:

 Computer Science - Artificial Intelligence;
 Computer Science - Data Structures and Algorithms;
 F.2.2;
 I.2
 E-Print:
 34 pages, 22 figures