An Introduction to Hamiltonian Monte Carlo Method for Sampling
Abstract
The goal of this article is to introduce the Hamiltonian Monte Carlo (HMC) method, a Hamiltonian dynamics-inspired algorithm for sampling from a Gibbs density $\pi(x) \propto e^{-f(x)}$. We focus on the "idealized" case, where one can compute continuous trajectories exactly. We show that idealized HMC preserves $\pi$ and we establish its convergence when $f$ is strongly convex and smooth.
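The article analyzes idealized HMC, where Hamiltonian trajectories are followed exactly. As a rough illustration of the method's structure, here is a minimal sketch of practical HMC in which the trajectory is instead approximated by the standard leapfrog integrator with a Metropolis correction; the step size, trajectory length, and Gaussian example below are illustrative choices, not taken from the paper.

```python
import numpy as np

def hmc(f, grad_f, x0, step=0.1, n_leap=20, n_samples=5000, seed=0):
    """Sample from pi(x) ∝ exp(-f(x)) using leapfrog HMC.

    Note: the paper studies *idealized* HMC with exact trajectories;
    this sketch uses a leapfrog discretization plus a Metropolis
    accept/reject step to correct the discretization error.
    """
    rng = np.random.default_rng(seed)
    x = np.atleast_1d(np.asarray(x0, dtype=float))
    out = np.empty((n_samples, x.size))
    for i in range(n_samples):
        p = rng.standard_normal(x.size)       # resample momentum ~ N(0, I)
        x_new, p_new = x.copy(), p.copy()
        # Leapfrog integration of Hamiltonian dynamics for H(x,p) = f(x) + |p|^2/2
        p_new -= 0.5 * step * grad_f(x_new)   # half step for momentum
        for _ in range(n_leap - 1):
            x_new += step * p_new             # full step for position
            p_new -= step * grad_f(x_new)     # full step for momentum
        x_new += step * p_new
        p_new -= 0.5 * step * grad_f(x_new)   # final half step for momentum
        # Metropolis correction: accept with probability exp(-ΔH)
        dH = (f(x_new) + 0.5 * p_new @ p_new) - (f(x) + 0.5 * p @ p)
        if np.log(rng.random()) < -dH:
            x = x_new
        out[i] = x
    return out

# Example: f(x) = |x|^2 / 2 is 1-strongly convex and 1-smooth,
# so pi is the standard Gaussian.
samples = hmc(lambda x: 0.5 * float(x @ x), lambda x: x, x0=[3.0])
```

For this strongly convex, smooth choice of $f$, the chain forgets its starting point quickly and the empirical mean and standard deviation of the samples approach those of the standard Gaussian.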
 Publication:
 arXiv e-prints
 Pub Date:
 August 2021
 arXiv:
 arXiv:2108.12107
 Bibcode:
 2021arXiv210812107V
 Keywords:

 Computer Science - Data Structures and Algorithms;
 Computer Science - Machine Learning;
 Mathematics - Probability;
 Statistics - Computation;
 Statistics - Machine Learning
 E-Print:
 This exposition supplements the author's talk at the Bootcamp of the semester on Geometric Methods in Optimization and Sampling at the Simons Institute for the Theory of Computing.