A Deterministic Sampling Method via Maximum Mean Discrepancy Flow with Adaptive Kernel
Abstract
We propose a novel deterministic sampling method to approximate a target distribution $\rho^*$ by minimizing the kernel discrepancy, also known as the Maximum Mean Discrepancy (MMD). By employing the general \emph{energetic variational inference} framework (Wang et al., 2021), we convert the problem of minimizing the MMD into solving a dynamic ODE system for the particles. We adopt the implicit Euler scheme to solve this ODE system, which leads to a proximal minimization problem in each particle-update iteration that can be solved by optimization algorithms such as L-BFGS. The proposed method is named EVI-MMD. To overcome the long-standing issue of bandwidth selection for the Gaussian kernel, we propose a novel way to specify the bandwidth dynamically. Through comprehensive numerical studies, we show that the proposed adaptive bandwidth significantly improves EVI-MMD. We apply the EVI-MMD algorithm to two types of sampling problems. In the first type, the target distribution is given by a fully specified density function. The second type is a "two-sample problem", in which only training data are available; here EVI-MMD serves as a generative learning model, producing new samples that follow the same distribution as the training data. With the recommended settings of the tuning parameters, we show that the proposed EVI-MMD method outperforms some existing methods for both types of problems.
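To make the procedure described in the abstract concrete, the sketch below illustrates the two-sample variant: each outer step is an implicit-Euler (proximal) update of the particles, solved with L-BFGS, with the kernel bandwidth recomputed at every iteration. This is only an illustrative sketch under assumptions, not the paper's implementation: the function names (`evi_mmd`, `mmd2`, `median_bandwidth`) are my own, the MMD is estimated with a biased V-statistic, and a median-heuristic bandwidth is used as a stand-in for the paper's adaptive-bandwidth rule.

```python
# Minimal sketch of an EVI-MMD-style particle update (two-sample setting),
# assuming an implicit-Euler (proximal) step solved with L-BFGS and a simple
# median-heuristic bandwidth recomputed each iteration; the paper's exact
# adaptive-bandwidth rule and energy functional may differ.
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.distance import cdist

def gaussian_gram(X, Y, bw):
    """Gaussian-kernel Gram matrix with bandwidth bw."""
    d2 = cdist(X, Y, "sqeuclidean")
    return np.exp(-d2 / (2.0 * bw ** 2))

def mmd2(X, Y, bw):
    """Squared MMD between particles X and data Y (biased V-statistic).
    The Y-Y term is constant in X but kept so the value is a full MMD^2."""
    return (gaussian_gram(X, X, bw).mean()
            - 2.0 * gaussian_gram(X, Y, bw).mean()
            + gaussian_gram(Y, Y, bw).mean())

def median_bandwidth(X, Y):
    """Median-heuristic bandwidth over the pooled sample (an assumed
    stand-in for the paper's adaptive bandwidth)."""
    Z = np.vstack([X, Y])
    d = cdist(Z, Z, "euclidean")
    return np.median(d[d > 0])

def evi_mmd(data, n_particles=100, n_iter=30, tau=0.5, seed=0):
    rng = np.random.default_rng(seed)
    dim = data.shape[1]
    X = rng.normal(size=(n_particles, dim))          # initial particles
    for _ in range(n_iter):
        bw = median_bandwidth(X, data)               # adaptive bandwidth
        X_prev = X.copy()

        def proximal_objective(x_flat):
            # Implicit Euler step: MMD^2 plus the proximal (movement) penalty.
            Xc = x_flat.reshape(n_particles, dim)
            return mmd2(Xc, data, bw) + np.sum((Xc - X_prev) ** 2) / (2.0 * tau)

        res = minimize(proximal_objective, X.ravel(),
                       method="L-BFGS-B", options={"maxiter": 25})
        X = res.x.reshape(n_particles, dim)
    return X

# Usage: approximate a 2-D Gaussian mixture given only samples from it.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    data = np.vstack([rng.normal(-2, 0.5, size=(200, 2)),
                      rng.normal(+2, 0.5, size=(200, 2))])
    particles = evi_mmd(data, n_particles=50, n_iter=20)
    print(particles.mean(axis=0), particles.std(axis=0))
```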
- Publication: arXiv e-prints
- Pub Date: November 2021
- DOI: 10.48550/arXiv.2111.10722
- arXiv: arXiv:2111.10722
- Bibcode: 2021arXiv211110722C
- Keywords: Statistics - Machine Learning; Computer Science - Machine Learning; Statistics - Computation
- E-Print: 25 pages, 9 figures