Sample Complexity Bounds for Estimating Probability Divergences under Invariances

Wednesday, October 25, 2023 - 4:00pm

Event Calendar Category

LIDS & Stats Tea

Speaker Name

Behrooz Tahmasebi

Affiliation

CSAIL

Building and Room Number

LIDS Lounge

Group-invariant probability distributions appear in many generative models for data in machine learning, such as graphs, point clouds, and images. In practice, one often needs to estimate divergences between such distributions. In this work, we study how inherent invariances, with respect to any smooth action of a Lie group on a manifold, improve the sample complexity of estimating the Wasserstein distance, Sobolev Integral Probability Metrics (Sobolev IPMs), and Maximum Mean Discrepancy (MMD), as well as the complexity of density estimation (in the $L^2$ and $L^\infty$ distances). Our results indicate a two-fold gain: (1) a reduction in sample complexity by a multiplicative factor corresponding to the group size (for finite groups) or the normalized volume of the quotient space (for groups of positive dimension); (2) an improvement in the exponent of the convergence rate (for groups of positive dimension). These results are completely new for groups of positive dimension and extend recent bounds for finite group actions.
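
For intuition, the following is an illustrative sketch of the kind of gain at stake; the rates shown are representative only, and the exact statements, constants, and regularity conditions are those of the work itself. On a $d$-dimensional manifold, the empirical measure $\hat{\mu}_n$ of $n$ i.i.d. samples estimates $\mu$ in Wasserstein-1 distance at the classical rate $\mathbb{E}[W_1(\mu, \hat{\mu}_n)] \lesssim n^{-1/d}$ (for $d \geq 3$). Under invariance to a finite group $G$, gain (1) behaves as if the sample were enlarged by the group size, giving a bound of order $(|G|\, n)^{-1/d}$; for a Lie group with $\dim G > 0$, gain (2) replaces the ambient dimension by that of the quotient space, yielding the faster exponent in $n^{-1/(d - \dim G)}$.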

Behrooz Tahmasebi is a Ph.D. student in EECS at MIT, advised by Prof. Stefanie Jegelka. His research interests include deep learning theory, learning with group invariances, and learning with graphs.