In many modern applications, the difficulty of evaluating the posterior density makes even a single MCMC step slow. This difficulty can be caused by intractable likelihood functions, but it also arises in routine problems with large data sets. Many researchers have responded by running approximate versions of MCMC algorithms. In this note, we develop quantitative bounds that establish the ergodicity of these approximate samplers. We then use these bounds to study the bias-variance trade-off of approximate MCMC algorithms. We apply our results to simple versions of recently proposed algorithms, including a variant of the "austerity" framework of Korattikara et al.
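To make the setting concrete, the following is a minimal sketch (not the paper's algorithm) of one flavor of approximate MCMC for large data sets: a Metropolis-Hastings step whose accept/reject test uses a random subsample of the data rather than the full likelihood. All names, the Gaussian toy model, and the tuning constants (`step`, `batch_size`) are illustrative assumptions; the austerity framework of Korattikara et al. additionally uses a sequential test rather than this fixed-size subsample.

```python
import numpy as np

def log_likelihood(theta, x):
    # Toy model: Gaussian data with unknown mean and unit variance
    # (illustrative assumption, not from the source).
    return -0.5 * (x - theta) ** 2

def approx_mh_step(theta, data, rng, step=0.5, batch_size=100):
    """One approximate Metropolis-Hastings step.

    The acceptance test evaluates the log-likelihood ratio on a random
    minibatch, rescaled to the full data size.  Each step is cheap, but
    the chain's invariant distribution is a perturbation of the true
    posterior -- this is the bias side of the bias-variance trade-off.
    """
    prop = theta + step * rng.standard_normal()
    batch = rng.choice(data, size=batch_size, replace=False)
    scale = len(data) / batch_size  # rescale minibatch sum to full data
    log_ratio = scale * np.sum(
        log_likelihood(prop, batch) - log_likelihood(theta, batch)
    )
    if np.log(rng.uniform()) < log_ratio:
        return prop
    return theta

rng = np.random.default_rng(0)
data = rng.normal(1.0, 1.0, size=10_000)

theta = 0.0
for _ in range(2_000):
    theta = approx_mh_step(theta, data, rng)
```

Because the minibatch log-ratio is a noisy estimate, the accept/reject decision is randomized beyond the usual uniform draw; quantitative ergodicity bounds of the kind developed in this note are what justify (and limit) such approximations.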