Many tools are available to bound the convergence rate of Markov chains in total variation (TV) distance. Such results can be used to establish central limit theorems (CLTs) that enable error evaluations of Monte Carlo estimates in practice. However, convergence analysis based on TV distance often does not scale to high-dimensional Markov chains (Qin and Hobert, 2018; Rajaratnam and Sparks, 2015). Alternatively, robust bounds in Wasserstein distance are often easier to obtain, thanks to coupling arguments. Our work concerns the implications of such convergence results; in particular, do they lead to CLTs for the corresponding Markov chains? One indirect and typically non-trivial approach is to first convert Wasserstein bounds into total variation bounds. Instead, we provide two CLTs that depend directly on (sub-geometric) convergence rates in Wasserstein distance. Our CLTs hold for Lipschitz functions under certain moment conditions. Finally, we apply these CLTs to four sets of Markov chain examples: a class of nonlinear autoregressive processes, an exponential integrator version of the Metropolis-adjusted Langevin algorithm (EI-MALA), the unadjusted Langevin algorithm (ULA), and a special autoregressive model that generates reducible chains.
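To make the practical use of such a CLT concrete, here is a minimal, illustrative sketch (not taken from the paper): it runs the unadjusted Langevin algorithm (ULA) on a standard Gaussian target, forms a Monte Carlo estimate of the mean, and attaches a CLT-justified standard error via batch means. The target, step size, chain length, and batch size are all assumptions chosen for illustration.

```python
import math
import random

def ula_chain(grad_log_pi, x0, step, n):
    """Unadjusted Langevin algorithm (ULA):
    x_{k+1} = x_k + step * grad_log_pi(x_k) + sqrt(2*step) * N(0, 1).
    Returns the list of iterates (length n + 1)."""
    xs = [x0]
    x = x0
    for _ in range(n):
        x = x + step * grad_log_pi(x) + math.sqrt(2 * step) * random.gauss(0, 1)
        xs.append(x)
    return xs

random.seed(0)
# Illustrative target: standard Gaussian, so grad log pi(x) = -x and E[X] = 0.
chain = ula_chain(lambda x: -x, x0=0.0, step=0.1, n=50000)
burned = chain[5000:]                      # discard burn-in
est = sum(burned) / len(burned)            # Monte Carlo estimate of E[X]

# Batch-means standard error: a CLT for the chain justifies est +/- 2*se.
b = 100
batch_means = [sum(burned[i:i + b]) / b for i in range(0, len(burned) - b + 1, b)]
m = len(batch_means)
se = math.sqrt(sum((bm - est) ** 2 for bm in batch_means) / (m - 1) / m)
print(est, se)
```

The batch-means estimator is one standard way to estimate the asymptotic variance appearing in a Markov chain CLT; the identity function used here is Lipschitz, matching the class of functions the CLTs in the paper cover.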