An Analog of Nesterov Acceleration for MCMC

Wednesday, December 9, 2020 - 4:00pm to 4:30pm

Event Calendar Category

LIDS & Stats Tea

Speaker Name

Xiang Cheng



Zoom meeting id

966 5151 6867

We formulate gradient-based Markov chain Monte Carlo sampling as an optimization problem over the space of probability measures, with the Kullback-Leibler divergence as the objective functional. We show that an underdamped form of the Langevin algorithm performs accelerated gradient descent on this objective. To characterize the convergence of the algorithm, we construct a Lyapunov functional and exploit the hypocoercivity of the underdamped Langevin dynamics. As an application, we show that the Langevin algorithm achieves accelerated rates for a class of nonconvex functions.
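For intuition, the underdamped Langevin dynamics mentioned above augments the position x with a velocity v, so that (in x) the chain targets a density proportional to exp(-f(x)). A minimal sketch of one standard discretization (a plain Euler-Maruyama step; the friction parameter gamma, step size, and function names here are illustrative assumptions, not the speaker's exact scheme):

```python
import numpy as np

def underdamped_langevin(grad_f, x0, v0, step=0.01, gamma=2.0, n_steps=20000, rng=None):
    """Euler-Maruyama discretization of the underdamped Langevin SDE
        dx = v dt,   dv = -gamma * v dt - grad_f(x) dt + sqrt(2 * gamma) dB_t,
    whose x-marginal stationary density is proportional to exp(-f(x))."""
    rng = np.random.default_rng(0) if rng is None else rng
    x = np.asarray(x0, dtype=float).copy()
    v = np.asarray(v0, dtype=float).copy()
    xs = np.empty((n_steps, x.size))
    for i in range(n_steps):
        noise = rng.standard_normal(x.size)
        # Velocity update: friction, gradient force, and injected Brownian noise.
        v = v - step * (gamma * v + grad_f(x)) + np.sqrt(2.0 * gamma * step) * noise
        # Position update driven by the velocity.
        x = x + step * v
        xs[i] = x
    return xs

# Toy target: standard Gaussian, f(x) = x^2 / 2, so grad_f(x) = x.
samples = underdamped_langevin(lambda x: x, x0=[3.0], v0=[0.0])
print(samples[1000:].mean(), samples[1000:].std())
```

The velocity variable plays the role of "momentum," which is the sense in which the algorithm parallels Nesterov acceleration in optimization; after a burn-in, the x-samples should be approximately standard normal in this toy example.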


Xiang is a postdoc in LIDS, hosted by Suvrit Sra. He is interested in the analysis of stochastic processes and their applications in machine learning. Xiang received his PhD in Computer Science from UC Berkeley, advised by Peter Bartlett and Michael I. Jordan.