Monday, April 10, 2023 - 3:00pm
Event Calendar Category
Other LIDS Events
Speaker Name
Jason Altschuler
Affiliation
NYU - Center for Data Science
Building and Room Number
32-D677
Shifted divergences provide a principled way of making information-theoretic divergences (e.g., KL) geometrically aware via optimal-transport smoothing. In this talk, I will argue that shifted divergences provide a powerful approach toward unifying optimization, sampling, differential privacy, and beyond. For concreteness, I will demonstrate these connections via three recent highlights: (1) the fastest high-accuracy algorithm for sampling from log-concave distributions; (2) resolving the mixing time of the Langevin Algorithm to its stationary distribution for log-concave sampling; (3) resolving the differential privacy of Noisy-SGD, the standard algorithm for private optimization in both theory and practice. A recurring theme is a certain notion of algorithmic stability; the central technique for establishing it is shifted divergences.
Based on joint work with Kunal Talwar, and with Sinho Chewi.
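For context (this gloss is ours, not part of the announcement): one standard formalization of a shifted divergence in this line of work is the shifted Renyi divergence, which smooths the first argument over an optimal-transport ball of radius z >= 0:

\[
D_\alpha^{(z)}(\mu \,\|\, \nu) \;=\; \inf_{\mu' \,:\, W_\infty(\mu', \mu) \le z} D_\alpha(\mu' \,\|\, \nu),
\]

where \(D_\alpha\) is the Renyi divergence of order \(\alpha\) (recovering KL as \(\alpha \to 1\)) and \(W_\infty\) is the infinity-Wasserstein distance. The shift z is the geometric knob: it trades information-theoretic closeness against optimal-transport closeness.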
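As a concrete reference point for highlight (2), here is a minimal sketch of the (unadjusted) Langevin Algorithm whose mixing the talk addresses. The update rule is the standard discretization of Langevin dynamics; the function names and parameter values are illustrative assumptions, not taken from the talk.

import numpy as np

def langevin_algorithm(grad_f, x0, eta=1e-2, n_iters=10_000, seed=0):
    """Langevin Algorithm targeting pi proportional to exp(-f):
    x_{k+1} = x_k - eta * grad f(x_k) + sqrt(2 * eta) * N(0, I).
    Returns the final iterate as an approximate sample from pi."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        x = x - eta * grad_f(x) + np.sqrt(2.0 * eta) * rng.standard_normal(x.shape)
    return x

# Example: f(x) = ||x||^2 / 2 (standard Gaussian target), so grad f(x) = x.
sample = langevin_algorithm(grad_f=lambda x: x, x0=np.zeros(3))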
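And for highlight (3), a minimal sketch of Noisy-SGD in its common form (per-example gradient clipping plus Gaussian noise, as in DP-SGD). Again, the interface and hyperparameters are assumptions for illustration only.

import numpy as np

def noisy_sgd(per_example_grad, data, x0, eta=0.1, clip=1.0, sigma=1.0,
              n_iters=1_000, seed=0):
    """Noisy-SGD: at each step, clip a per-example gradient to norm <= clip,
    then perturb it with Gaussian noise of scale sigma * clip."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        g = per_example_grad(x, data[rng.integers(len(data))])
        g = g * min(1.0, clip / (np.linalg.norm(g) + 1e-12))  # gradient clipping
        x = x - eta * (g + sigma * clip * rng.standard_normal(x.shape))
    return x

# Example: f_i(x) = ||x - d_i||^2 / 2, so the per-example gradient is x - d_i.
data = np.random.default_rng(1).standard_normal((100, 2))
theta = noisy_sgd(lambda x, d: x - d, data=data, x0=np.zeros(2))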
Jason Altschuler is a CDS Faculty Fellow at NYU for 2022-2023 and will be an assistant professor in the Department of Statistics and Data Science at UPenn starting July 2023. He received his PhD from MIT and, before that, his undergraduate degree from Princeton. His research interests lie broadly at the intersection of optimization, probability, and machine learning, with a recent focus on computational aspects of problems related to optimal transport.
https://www.mit.edu/~jasonalt/