Conditional Sampling of Variational Autoencoders via Iterated Approximate Ancestral Sampling
Abstract
Conditional sampling of variational autoencoders (VAEs) is needed in various applications, such as missing-data imputation, but is computationally intractable. A principled choice for asymptotically exact conditional sampling is Metropolis-within-Gibbs (MWG). However, we observe that the tendency of VAEs to learn a structured latent space, a commonly desired property, can cause the MWG sampler to get "stuck" far from the target distribution. This paper mitigates the limitations of MWG: we systematically outline its pitfalls in the context of VAEs, propose two original methods that address them, and demonstrate improved performance of the proposed methods on a set of sampling tasks.
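To make the MWG scheme referenced in the abstract concrete, the sketch below implements Metropolis-within-Gibbs conditional sampling for a deliberately tiny linear-Gaussian "VAE" (a hand-set linear decoder and encoder, which are illustrative assumptions, not the paper's models): a Metropolis step updates the latent `z` using the encoder as the proposal, and a Gibbs step resamples the missing coordinate from the decoder.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "VAE": linear-Gaussian decoder and a hand-set Gaussian "encoder".
# All parameter values here are illustrative assumptions.
W = np.array([1.0, 0.8])        # decoder weights: mean(x) = W * z
sigma2 = 0.1                    # decoder (observation) variance
s2_enc = 0.2                    # variance of the encoder q(z | x)

def log_p_z(z):                 # standard-normal prior on the latent z
    return -0.5 * z**2

def log_p_x_given_z(x, z):      # Gaussian decoder log-likelihood (up to const.)
    return -0.5 * np.sum((x - W * z) ** 2) / sigma2

def enc_mean(x):                # crude linear "encoder" mean
    return 0.5 * (x[0] + x[1])

def log_q_z_given_x(z, x):      # encoder log-density q(z | x) (up to const.)
    return -0.5 * (z - enc_mean(x)) ** 2 / s2_enc

def mwg_impute(x_obs, n_iter=500):
    """Metropolis-within-Gibbs: condition on x[0] = x_obs, sample x[1]."""
    x = np.array([x_obs, 0.0])  # initialise the missing entry at 0
    z = rng.normal()
    accepts, draws = 0, []
    for _ in range(n_iter):
        # Metropolis step on z, proposing from the encoder q(z | x).
        z_prop = enc_mean(x) + np.sqrt(s2_enc) * rng.normal()
        log_a = (log_p_z(z_prop) + log_p_x_given_z(x, z_prop)
                 + log_q_z_given_x(z, x)
                 - log_p_z(z) - log_p_x_given_z(x, z)
                 - log_q_z_given_x(z_prop, x))
        if np.log(rng.uniform()) < log_a:
            z, accepts = z_prop, accepts + 1
        # Gibbs step: resample the missing coordinate from the decoder.
        x[1] = W[1] * z + np.sqrt(sigma2) * rng.normal()
        draws.append(x[1])
    return np.array(draws), accepts / n_iter

samples, acc_rate = mwg_impute(x_obs=1.0)
```

In this toy setting the chain mixes easily; the paper's point is that with a mismatched encoder proposal and a highly structured latent space, the acceptance step can reject almost every proposal, leaving the chain "stuck" far from the target conditional.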
- Publication: arXiv e-prints
- Pub Date: August 2023
- DOI: 10.48550/arXiv.2308.09078
- arXiv: arXiv:2308.09078
- Bibcode: 2023arXiv230809078S
- Keywords: Computer Science - Machine Learning; Statistics - Machine Learning; 62D10; G.3
- E-Print: Published in Transactions on Machine Learning Research (TMLR), 2023