Localization, Uniform Convexity, and Star Aggregation

Wednesday, February 10, 2021 - 4:00pm to 4:30pm

Event Calendar Category

LIDS & Stats Tea

Speaker Name

Suhas Vijaykumar

MIT Sloan

Zoom meeting id

941 5418 9695


Offset Rademacher complexities have been shown to imply sharp, data-dependent upper bounds for the square loss in a broad class of problems including improper statistical learning and online learning. We show that in the statistical setting, the offset complexity upper bound can be generalized to any loss satisfying a certain uniform curvature condition; this condition is shown to also capture exponential concavity and self-concordance, uniting several apparently disparate results.

By a unified geometric argument, these bounds translate to improper learning in a non-convex class using Audibert's "star algorithm." As applications, we recover the optimal rates for proper and improper learning with the p-loss for all p > 1, closing the gap for p > 2, and show that improper variants of ERM can attain fast rates for logistic regression and other generalized linear models.
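As a rough illustration of the two-step procedure behind Audibert's star algorithm, the sketch below (a hypothetical, discretized toy implementation, not the speaker's code) first runs ERM over a finite class of prediction vectors, then re-runs ERM over a discretized star hull formed by segments joining each class member to the first ERM output:

```python
import numpy as np

def erm(preds, y):
    """Return the index of the prediction vector minimizing empirical square loss."""
    losses = [np.mean((p - y) ** 2) for p in preds]
    return int(np.argmin(losses))

def star_algorithm(preds, y, n_lambdas=50):
    """Toy two-step star aggregation over prediction vectors:
    1. g_hat = ERM over the class.
    2. ERM over the star hull {lam * f + (1 - lam) * g_hat : f in class, lam in [0, 1]},
       with the segment parameter lam discretized on a grid (an illustrative choice)."""
    g_hat = preds[erm(preds, y)]
    lambdas = np.linspace(0.0, 1.0, n_lambdas)
    star = [lam * f + (1.0 - lam) * g_hat for f in preds for lam in lambdas]
    return star[erm(star, y)]
```

Because the star hull contains every class member (lam = 1) as well as g_hat itself (lam = 0), the second ERM step can only improve on the empirical loss of the first; the aggregated predictor may lie outside the original class, which is what makes the procedure improper.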



Suhas Vijaykumar is a PhD student in Economics and Statistics. His research focuses on quantifying the uncertainty that arises in algorithmic predictions, and on parallels between statistical and online learning. In his spare time, he enjoys playing the guitar and rock climbing.