Differentiable Algorithm for Marginalising Changepoints
Abstract
We present an algorithm for marginalising changepoints in time-series models that assume a fixed number of unknown changepoints. Our algorithm is differentiable with respect to its inputs, which are the values of latent random variables other than changepoints. Also, it runs in time O(mn), where n is the number of time steps and m the number of changepoints, an improvement over a naive marginalisation method with O(n^m) time complexity. We derive the algorithm by identifying quantities related to this marginalisation problem, showing that these quantities satisfy recursive relationships, and transforming the relationships to an algorithm via dynamic programming. Since our algorithm is differentiable, it can be applied to convert a model non-differentiable due to changepoints to a differentiable one, so that the resulting models can be analysed using gradient-based inference or learning techniques. We empirically show the effectiveness of our algorithm in this application by tackling the posterior inference problem on synthetic and real-world data.
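To illustrate the kind of dynamic programme the abstract describes, here is a hedged sketch (not the paper's exact pseudocode) of an O(mn) marginalisation over the positions of exactly m changepoints in n time steps. It assumes the model supplies per-step log-likelihoods `c[j][t] = log p(x_t | segment j)`, which would come from the latent variables other than changepoints; the names `marginal_loglik` and `brute_force` are illustrative, and the brute-force enumerator plays the role of the naive O(n^m)-style baseline.

```python
# Hedged sketch of a dynamic-programming marginalisation over changepoint
# positions; the recursion and the per-step score matrix c are assumptions
# for illustration, not the paper's exact formulation.
import math
from itertools import combinations

NEG_INF = float("-inf")

def logaddexp(a, b):
    """Numerically stable log(exp(a) + exp(b))."""
    if a == NEG_INF:
        return b
    if b == NEG_INF:
        return a
    hi, lo = (a, b) if a > b else (b, a)
    return hi + math.log1p(math.exp(lo - hi))

def marginal_loglik(c, m):
    """O(m*n) DP. c[j][t]: log-likelihood of step t if it lies in segment j.
    Returns log of the sum, over all placements of exactly m changepoints
    (every segment non-empty), of the product of per-step likelihoods."""
    n = len(c[0])
    F = [[NEG_INF] * n for _ in range(m + 1)]
    F[0][0] = c[0][0]
    for t in range(1, n):
        F[0][t] = c[0][t] + F[0][t - 1]  # segment 0 continues
        for j in range(1, m + 1):
            # Either segment j continues from t-1, or the j-th changepoint
            # falls between steps t-1 and t.
            F[j][t] = c[j][t] + logaddexp(F[j][t - 1], F[j - 1][t - 1])
    return F[m][n - 1]

def brute_force(c, m):
    """Naive check: enumerate all C(n-1, m) changepoint placements."""
    n = len(c[0])
    total = NEG_INF
    for cps in combinations(range(1, n), m):
        bounds = (0,) + cps + (n,)
        s = sum(c[j][t] for j in range(m + 1)
                for t in range(bounds[j], bounds[j + 1]))
        total = logaddexp(total, s)
    return total
```

Because the recursion is built from sums, products, and log-sum-exp of the inputs `c[j][t]`, the resulting quantity is differentiable in those inputs, which is what allows gradient-based inference once such a routine is written in an autodiff framework.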
 Publication:

arXiv e-prints
 Pub Date:
 November 2019
 arXiv:
 arXiv:1911.09839
 Bibcode:
 2019arXiv191109839L
 Keywords:

 Computer Science - Machine Learning;
 Statistics - Computation;
 Statistics - Machine Learning
 E-Print:
 To appear at AAAI 2020