Abstract
We study the Unadjusted Langevin Algorithm (ULA) for sampling from a probability distribution ν = e^{-f}. We prove a convergence guarantee in Kullback-Leibler (KL) divergence assuming ν satisfies a log-Sobolev inequality and the Hessian of f is bounded. Notably, we do not assume convexity or bounds on higher derivatives. We also prove convergence guarantees in Rényi divergence of order q > 1 assuming the limit of ULA satisfies either the log-Sobolev inequality or the Poincaré inequality.
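As background (not part of the abstract), the standard ULA update is x_{k+1} = x_k - η ∇f(x_k) + sqrt(2η) ξ_k with ξ_k ~ N(0, I). A minimal sketch below runs ULA on the standard Gaussian target f(x) = ||x||²/2, for which ν satisfies a log-Sobolev inequality and ∇²f = I is bounded; the step size and iteration counts are illustrative choices, not values from the paper.

```python
import numpy as np

def ula(grad_f, x0, step, n_iters, rng):
    """Unadjusted Langevin Algorithm:
    x_{k+1} = x_k - step * grad_f(x_k) + sqrt(2 * step) * xi_k,  xi_k ~ N(0, I).
    Returns the full trajectory of iterates."""
    x = np.array(x0, dtype=float)
    samples = np.empty((n_iters, x.size))
    for k in range(n_iters):
        xi = rng.standard_normal(x.size)
        x = x - step * grad_f(x) + np.sqrt(2.0 * step) * xi
        samples[k] = x
    return samples

# Example target: standard Gaussian, f(x) = ||x||^2 / 2, so grad_f(x) = x.
rng = np.random.default_rng(0)
samples = ula(grad_f=lambda x: x, x0=np.zeros(1), step=0.05, n_iters=20000, rng=rng)

# Discard burn-in; the empirical mean and variance should approach 0 and 1
# (up to the O(step) discretization bias that the paper's bounds quantify).
burn = samples[5000:, 0]
print(burn.mean(), burn.var())
```

Because ULA discretizes the Langevin diffusion without a Metropolis correction, its stationary distribution is a biased approximation of ν; for this Gaussian target the limiting variance is 1/(1 - step/2), slightly above 1, which is the kind of asymptotic bias the KL and Rényi guarantees control.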