Collaborative Nested Sampling: Big Data versus Complex Physical Models
Abstract
The data torrent unleashed by current and upcoming astronomical surveys demands scalable analysis methods. Many machine learning approaches scale well, but separating the instrument measurement from the physical effects of interest, dealing with variable errors, and deriving parameter uncertainties is often an afterthought. Classic forward-folding analyses with Markov chain Monte Carlo or nested sampling enable parameter estimation and model comparison, even for complex and slow-to-evaluate physical models. However, these approaches require independent runs for each data set, implying an unfeasible number of model evaluations in the Big Data regime. Here I present a new algorithm, collaborative nested sampling, for deriving parameter probability distributions for each observation. Importantly, the number of physical model evaluations scales sublinearly with the number of data sets, and no assumptions about homogeneous errors, Gaussianity, the form of the model, or heterogeneity/completeness of the observations need to be made. Collaborative nested sampling has immediate applications in speeding up analyses of large surveys, integral-field-unit observations, and Monte Carlo simulations.
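To make the headline claim concrete: when the expensive physical model depends only on the parameters, and each data set's likelihood is cheap to compute given the model prediction, a single model evaluation can be shared by every data set. The toy sketch below illustrates this sublinear scaling of model calls. It is a hypothetical, simplified illustration, not the paper's implementation: it proposes candidate points by rejection sampling from the prior rather than using the MultiNest-, PolyChord-, or RadFriends-based proposals described in the paper, and all names (`collaborative_ns`, `loglike`) are invented for this example.

```python
import numpy as np

rng = np.random.default_rng(42)

def model(theta):
    """Stand-in for a slow physical model; depends only on theta."""
    return theta

def loglike(pred, data, sigma=1.0):
    """Cheap per-dataset Gaussian log-likelihood given a model prediction."""
    return -0.5 * np.sum((data - pred) ** 2) / sigma ** 2

def collaborative_ns(datasets, nlive=50, niter=400, lo=0.0, hi=10.0):
    """Joint nested-sampling loop over several data sets: each candidate
    point costs ONE model evaluation, reused by every data set."""
    n_model_calls = 0
    pts = rng.uniform(lo, hi, nlive)        # shared initial live points
    preds = []
    for th in pts:
        preds.append(model(th))             # model evaluated once per point,
        n_model_calls += 1                  # not once per point per data set
    live = [pts.copy() for _ in datasets]   # per-dataset live point sets
    live_L = [np.array([loglike(p, d) for p in preds]) for d in datasets]
    for _ in range(niter):
        theta = rng.uniform(lo, hi)         # candidate drawn from the prior
        pred = model(theta)                 # one evaluation for all data sets
        n_model_calls += 1
        for i, d in enumerate(datasets):
            L = loglike(pred, d)
            j = int(np.argmin(live_L[i]))   # worst live point of data set i
            if L > live_L[i][j]:            # standard NS replacement rule
                live[i][j] = theta
                live_L[i][j] = L
    estimates = [float(np.mean(p)) for p in live]
    return estimates, n_model_calls

# three toy data sets with different true signal levels
datasets = [rng.normal(mu, 1.0, size=10) for mu in (2.0, 5.0, 8.0)]
estimates, ncalls = collaborative_ns(datasets)
# ncalls == nlive + niter == 450 model evaluations serve all three
# data sets together; independent runs would need 3 * 450.
```

Each data set keeps its own live-point set and likelihood threshold, so the posteriors remain per-observation; only the expensive model evaluation is amortized across them.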
 Publication:

Publications of the Astronomical Society of the Pacific
 Pub Date:
 October 2019
 DOI:
 10.1088/1538-3873/aae7fc
 arXiv:
 arXiv:1707.04476
 Bibcode:
 2019PASP..131j8005B
 Keywords:

 Statistics - Computation;
 Astrophysics - Instrumentation and Methods for Astrophysics;
 Physics - Data Analysis, Statistics and Probability;
 Statistics - Machine Learning
 E-Print:
 Resubmitted to PASP Focus on Machine Intelligence in Astronomy and Astrophysics after the first referee report. Figure 6 demonstrates the scaling for the Collaborative MultiNest, PolyChord, and RadFriends implementations. Figure 10 shows an application to MUSE IFU data. Implementation at https://github.com/JohannesBuchner/massivedatans