Collaborative Nested Sampling: Big Data versus Complex Physical Models
Abstract
The data torrent unleashed by current and upcoming astronomical surveys demands scalable analysis methods. Many machine learning approaches scale well, but separating the instrument measurement from the physical effects of interest, dealing with variable errors, and deriving parameter uncertainties is often an afterthought. Classic forward-folding analyses with Markov chain Monte Carlo or nested sampling enable parameter estimation and model comparison, even for complex and slow-to-evaluate physical models. However, these approaches require independent runs for each data set, implying an unfeasible number of model evaluations in the Big Data regime. Here I present a new algorithm, collaborative nested sampling, for deriving parameter probability distributions for each observation. Importantly, the number of physical model evaluations scales sub-linearly with the number of data sets, and no assumptions about homogeneous errors, Gaussianity, the form of the model, or heterogeneity/completeness of the observations need to be made. Collaborative nested sampling has immediate applications in speeding up analyses of large surveys, integral-field-unit observations, and Monte Carlo simulations.
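For orientation, a minimal sketch of classic single-data-set nested sampling, the baseline that the paper's collaborative variant accelerates, is given below. This is not the collaborative algorithm itself; the unit-Gaussian likelihood, the uniform prior on [-10, 10], and all numerical settings are illustrative assumptions.

```python
import math
import random

random.seed(1)

def loglike(theta):
    """log N(theta | 0, 1): a stand-in 'physical model' likelihood."""
    return -0.5 * theta**2 - 0.5 * math.log(2.0 * math.pi)

def prior_sample():
    """Draw from the uniform prior on [-10, 10]."""
    return random.uniform(-10.0, 10.0)

def logaddexp(a, b):
    """Numerically stable log(exp(a) + exp(b))."""
    if a == -math.inf:
        return b
    m = max(a, b)
    return m + math.log(math.exp(a - m) + math.exp(b - m))

def nested_sampling(nlive=100, niter=600):
    """Estimate the log-evidence logZ = log integral of L(theta) d(prior)."""
    live = [prior_sample() for _ in range(nlive)]
    logL = [loglike(t) for t in live]
    logZ = -math.inf
    logX = 0.0  # log of the remaining prior volume
    for _ in range(niter):
        # the worst live point defines the shrinking likelihood contour
        worst = min(range(nlive), key=lambda j: logL[j])
        # prior-volume shell between successive contours:
        # X shrinks geometrically by exp(-1/nlive) per iteration
        log_dX = logX + math.log1p(-math.exp(-1.0 / nlive))
        logZ = logaddexp(logZ, logL[worst] + log_dX)
        logX -= 1.0 / nlive
        # replace the worst point with a prior draw above its likelihood
        # (plain rejection sampling; real samplers use smarter proposals)
        threshold = logL[worst]
        while True:
            t = prior_sample()
            if loglike(t) > threshold:
                live[worst], logL[worst] = t, loglike(t)
                break
    # spread the remaining prior volume over the surviving live points
    for j in range(nlive):
        logZ = logaddexp(logZ, logL[j] + logX - math.log(nlive))
    return logZ

print("estimated logZ: %.2f (exact: ln(1/20) = -3.00)" % nested_sampling())
```

Each data set analyzed this way costs its own run; the paper's contribution is sharing likelihood-contour exploration across many such runs so that the number of model evaluations grows sub-linearly with the number of data sets.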
- Publication: Publications of the Astronomical Society of the Pacific
- Pub Date: October 2019
- DOI: 10.1088/1538-3873/aae7fc
- arXiv: arXiv:1707.04476
- Bibcode: 2019PASP..131j8005B
- Keywords:
  - Statistics - Computation
  - Astrophysics - Instrumentation and Methods for Astrophysics
  - Physics - Data Analysis; Statistics and Probability
  - Statistics - Machine Learning
- E-Print: Resubmitted to PASP Focus on Machine Intelligence in Astronomy and Astrophysics after the first referee report. Figure 6 demonstrates the scaling for the Collaborative MultiNest, PolyChord, and RadFriends implementations. Figure 10 shows an application to MUSE IFU data. Implementation at https://github.com/JohannesBuchner/massivedatans