Non-parametric threshold for smoothed empirical Wasserstein distance

Wednesday, March 30, 2022 - 4:00pm to 4:30pm

Event Calendar Category

LIDS & Stats Tea

Speaker Name

Zeyu Jia

Affiliation

LIDS

Building and Room Number

LIDS Lounge

Abstract

Consider an empirical measure P_n induced by n iid samples from a d-dimensional K-subgaussian distribution P. We show that when K < σ, the squared Wasserstein distance W_2²(P_n ∗ N(0, σ²I_d), P ∗ N(0, σ²I_d)) converges at the parametric rate O(1/n), and when K > σ, there exists a K-subgaussian distribution P such that W_2²(P_n ∗ N(0, σ²I_d), P ∗ N(0, σ²I_d)) = ω(1/n). This resolves an open problem of [GGNWP20] and closes the gap between the regime where the parametric rate is attainable and the regime where it is not, providing a complete characterization of when subgaussian P enjoys the parametric rate. In addition, when σ < K, we establish more delicate results on the convergence rate of the squared W_2 distance. In dimension one, we provide matching upper and lower bounds showing that the rate interpolates from Θ(1/√n) to Θ(1/n) as σ/K goes from 0 to 1. Moreover, we also establish that D_KL(P_n ∗ N(0, σ²I_d) ‖ P ∗ N(0, σ²I_d)) = Õ(1/n). These results reveal a dichotomy between the convergence rates of the squared W_2 distance and the KL divergence, and hence the failure of the T2-transportation inequality when σ < K, thereby also resolving the open problem of [WW+16] on whether K < σ is necessary for the log-Sobolev inequality to hold for P ∗ N(0, σ²).
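For intuition, here is a minimal one-dimensional simulation sketch (not from the talk; the Gaussian choice P = N(0, K²), the parameter values, and the helper smoothed_w2_sq are all illustrative assumptions) that Monte Carlo estimates W_2²(P_n ∗ N(0, σ²), P ∗ N(0, σ²)) via the 1-D quantile coupling, so one can eyeball the ~1/n decay in the K < σ regime:

```python
# A minimal sketch (not from the talk): estimate the smoothed squared
# Wasserstein distance W_2^2(P_n * N(0, sigma^2), P * N(0, sigma^2)) in d = 1,
# taking P = N(0, K^2) as one convenient K-subgaussian distribution.
import numpy as np

rng = np.random.default_rng(0)

def smoothed_w2_sq(n, K, sigma, m=200_000):
    """Crude Monte Carlo proxy for W_2^2 between the smoothed empirical
    measure P_n * N(0, sigma^2) and the smoothed population P * N(0, sigma^2),
    via the 1-D quantile coupling on m draws from each measure."""
    x = rng.normal(0.0, K, size=n)                    # n iid samples from P
    # Draw m points from P_n * N(0, sigma^2): resample the data, add noise.
    a = rng.choice(x, size=m) + rng.normal(0.0, sigma, size=m)
    # For Gaussian P, the smoothed population is exactly N(0, K^2 + sigma^2).
    b = rng.normal(0.0, np.sqrt(K**2 + sigma**2), size=m)
    # In 1-D the optimal coupling matches sorted samples (quantiles),
    # so the mean squared gap between order statistics estimates W_2^2.
    return np.mean((np.sort(a) - np.sort(b)) ** 2)

# Rough look at the rate: with K = 0.5 < sigma = 1.0 (the parametric regime),
# the estimates should shrink roughly like 1/n as n grows.
for n in [100, 1_000, 10_000]:
    print(n, smoothed_w2_sq(n, K=0.5, sigma=1.0))
```

Note that the m-sample quantile estimate is itself a noisy, upward-biased proxy for the true distance between the two smoothed measures, so this sketch only probes the qualitative decay, not the exact constants.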

Biography

Zeyu Jia is a second-year PhD student in EECS and LIDS. He is supervised by Prof. Sasha Rakhlin and Prof. Yury Polyanskiy. He obtained his bachelor’s degree in mathematics from Peking University in 2020. His general research interest is learning theory, especially reinforcement learning theory; he is also interested in statistics and decision making.