Wednesday, February 23, 2022 - 4:00pm to 4:30pm
Event Calendar Category
LIDS & Stats Tea
It has long been thought that the high-dimensional data encountered in many practical machine learning tasks have low-dimensional structure, i.e., that the manifold hypothesis holds. A natural question, then, is how to estimate the intrinsic dimension of a given population distribution from a finite sample. In this talk, I will introduce a new estimator of the intrinsic dimension and provide finite-sample, non-asymptotic guarantees under weak assumptions on the geometry of the support. If time permits, I will then describe how to apply these techniques to obtain new sample complexity bounds for Generative Adversarial Networks (GANs) that depend only on the intrinsic dimension of the data.
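The abstract does not specify the speaker's new estimator, but the underlying estimation problem can be illustrated with a classical baseline: the Levina–Bickel maximum-likelihood estimator, which infers dimension from the growth rate of nearest-neighbor distances. The sketch below (function name and parameters are illustrative, not from the talk) recovers the dimension of a low-dimensional set embedded in a higher-dimensional ambient space:

```python
import numpy as np

def mle_intrinsic_dimension(X, k=10):
    """Levina-Bickel MLE estimate of intrinsic dimension.

    X: (n, D) array of n samples in ambient dimension D.
    k: number of nearest neighbors used per point.
    """
    # Pairwise Euclidean distances (n x n).
    diffs = X[:, None, :] - X[None, :, :]
    dists = np.sqrt(np.maximum((diffs ** 2).sum(axis=-1), 0.0))
    # Sort each row; column 0 is the point itself (distance 0), so skip it.
    knn = np.sort(dists, axis=1)[:, 1:k + 1]          # (n, k)
    # Per-point inverse estimate: mean of log(T_k / T_j) over j = 1..k-1,
    # where T_j is the distance to the j-th nearest neighbor.
    log_ratios = np.log(knn[:, -1:] / knn[:, :-1])    # (n, k-1)
    inv_estimates = log_ratios.mean(axis=1)           # 1 / d_hat per point
    return 1.0 / inv_estimates.mean()

rng = np.random.default_rng(0)
# A 2-dimensional linear "manifold" embedded in R^10.
Z = rng.normal(size=(500, 2))
X = Z @ rng.normal(size=(2, 10))
print(mle_intrinsic_dimension(X))  # should be close to 2, not 10
```

The key point mirrored in the talk's setting is that the estimate tracks the intrinsic dimension (here 2), not the ambient dimension (here 10), and is computed from a finite sample alone.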
Adam Block is a third-year PhD student in Mathematics, advised by Sasha Rakhlin. He is primarily interested in algorithms and regret bounds for learning in settings that mitigate worst-case statistical or computational lower bounds.