Kernel learning approaches for summarising and combining posterior similarity matrices
Abstract
When using Markov chain Monte Carlo (MCMC) algorithms to perform inference for Bayesian clustering models, such as mixture models, the output is typically a sample of clusterings (partitions) drawn from the posterior distribution. In practice, a key challenge is how to summarise this output. Here we build upon the notion of the posterior similarity matrix (PSM) in order to suggest new approaches for summarising the output of MCMC algorithms for Bayesian clustering models. A key contribution of our work is the observation that PSMs are positive semidefinite, and hence can be used to define probabilistically motivated kernel matrices that capture the clustering structure present in the data. This observation enables us to employ a range of kernel methods to obtain summary clusterings, and otherwise exploit the information summarised by PSMs. For example, if we have multiple PSMs, each corresponding to a different dataset on a common set of statistical units, we may use standard methods for combining kernels in order to perform integrative clustering. We may moreover embed PSMs within predictive kernel models in order to perform outcome-guided data integration. We demonstrate the performance of the proposed methods through a range of simulation studies as well as two real data applications. R code is available at https://github.com/acabassi/combinepsms.
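To make the central observation concrete, the sketch below shows how a PSM is built from a posterior sample of partitions and why it is positive semidefinite, and hence usable as a kernel. This is an illustrative Python sketch only (the paper's accompanying code is in R); the function name and toy data are invented for this example.

```python
import numpy as np

def posterior_similarity_matrix(draws):
    """Compute the PSM: entry (i, j) is the fraction of MCMC draws
    in which items i and j are assigned to the same cluster.
    `draws` is an (n_draws, n_items) array of cluster labels."""
    draws = np.asarray(draws)
    n_draws, n_items = draws.shape
    psm = np.zeros((n_items, n_items))
    for labels in draws:
        # Indicator matrix of co-clustering for this draw.
        psm += (labels[:, None] == labels[None, :])
    return psm / n_draws

# Toy posterior sample of partitions over 4 items (hypothetical data).
draws = [[0, 0, 1, 1],
         [0, 0, 0, 1],
         [0, 1, 1, 1]]
psm = posterior_similarity_matrix(draws)

# Each co-clustering indicator matrix is PSD, and the PSM is an
# average of such matrices, so the PSM itself is PSD and can be
# treated as a kernel matrix.
assert np.all(np.linalg.eigvalsh(psm) >= -1e-10)

# A convex combination of PSMs (kernels) is again a valid kernel,
# which underlies kernel-based integrative clustering across datasets.
psm2 = posterior_similarity_matrix([[0, 1, 1, 0], [0, 0, 1, 1]])
combined = 0.5 * psm + 0.5 * psm2
assert np.all(np.linalg.eigvalsh(combined) >= -1e-10)
```

The same PSD argument justifies plugging PSMs, or weighted combinations of them, into any kernel method (e.g. kernel k-means or predictive kernel models).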
Publication:
arXiv e-prints
 Pub Date:
 September 2020
 DOI:
 10.48550/arXiv.2009.12852
 arXiv:
 arXiv:2009.12852
 Bibcode:
 2020arXiv200912852C
Keywords:
Statistics - Methodology;
Statistics - Machine Learning
E-Print:
 Manuscript: 27 pages, 8 figures. Supplement: 62 pages, 68 figures. For associated R code, see https://github.com/acabassi/combinepsms