February 2, 2022
Speaker: Sung Min (Sam) Park (CSAIL)
Current supervised machine learning models rely on an abundance of training data. Yet, understanding the underlying structure and biases of this data—and how they impact models—remains challenging. We present a new conceptual framework,...
February 9, 2022
Speaker: Tiancheng Yu (LIDS)
This paper resolves a longstanding open question of designing near-optimal algorithms for learning imperfect-information extensive-form games from bandit feedback. We present the first line of algorithms that require only $\widetilde{\mathcal...
February 16, 2022
Speaker: Eren Can Kizildag (LIDS)
It has been shown very recently that the symmetric binary perceptron (SBP) exhibits an extreme form of clustering at all positive densities: almost all of its solutions are singletons separated by large distances. This suggests that finding...
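For context, one standard formulation of the SBP solution space is sketched below (the margin $\kappa$ and constraint density $\alpha$ are generic symbols; the talk's exact parameterization may differ):

$$
S(\kappa) \;=\; \Bigl\{ \sigma \in \{-1,+1\}^n \;:\; \bigl|\langle g_i, \sigma \rangle\bigr| \le \kappa \sqrt{n} \ \text{ for all } 1 \le i \le m \Bigr\}, \qquad g_i \overset{\text{i.i.d.}}{\sim} \mathcal{N}(0, I_n), \quad m = \alpha n .
$$

In this notation, the clustering statement says that for any fixed $\alpha > 0$, almost every $\sigma \in S(\kappa)$ is isolated, i.e., far in Hamming distance from every other solution.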
February 23, 2022
Speaker: Adam Block (IDSS)
It has long been thought that high-dimensional data encountered in many practical machine learning tasks have low-dimensional structure, i.e., that the manifold hypothesis holds. A natural question, then, is how to estimate the intrinsic dimension of...
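As a concrete point of reference (a classical baseline, not necessarily the estimator studied in the talk), here is a minimal sketch of the Levina-Bickel maximum-likelihood intrinsic dimension estimator based on k-nearest-neighbor distances; the dataset and the choice of k are placeholders.

    import numpy as np
    from scipy.spatial import cKDTree

    def mle_intrinsic_dimension(X, k=10):
        """Levina-Bickel MLE of intrinsic dimension from k-nearest-neighbor distances."""
        tree = cKDTree(X)
        dist, _ = tree.query(X, k=k + 1)           # column 0 is the point itself (distance 0)
        knn = dist[:, 1:]                          # distances to the 1st, ..., k-th neighbors
        # Per-point estimate: (k - 1) / sum_j log(T_k / T_j), j = 1, ..., k-1.
        logs = np.log(knn[:, -1][:, None] / knn[:, :-1])
        m_hat = (k - 1) / logs.sum(axis=1)
        return m_hat.mean()

    # Toy check: points on a 2-dimensional subspace embedded in R^20.
    rng = np.random.default_rng(0)
    Z = rng.normal(size=(2000, 2)) @ rng.normal(size=(2, 20))
    print(mle_intrinsic_dimension(Z))              # should come out close to 2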
March 2, 2022
Speaker: Raaz Dwivedi (LIDS & Harvard)
We consider the problem of counterfactual inference in sequentially designed experiments wherein a collection of $N$ units each undergo a sequence of interventions for $T$ time periods, based on policies that sequentially...
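One standard way to formalize this setup (generic potential-outcomes notation, not necessarily the talk's) is

$$
Y_{i,t} \;=\; Y_{i,t}\bigl(A_{i,t}\bigr), \qquad A_{i,t} \sim \pi_t\bigl(\cdot \mid \text{history of unit } i \text{ through time } t-1\bigr), \qquad 1 \le i \le N, \ 1 \le t \le T,
$$

where $Y_{i,t}(a)$ denotes the potential outcome of unit $i$ at time $t$ under intervention $a$; the counterfactual question concerns $Y_{i,t}(a)$ for interventions $a$ other than the one actually assigned by the adaptive policy.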
March 9, 2022
Speaker: Isaac Grosof (CMU)
In this talk, I investigate four queueing models that are key to understanding the behavior of modern computing systems. Each was previously considered intractable to analyze. However, we discovered a subtle similarity between these models,...
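The four models themselves are not spelled out here, but as a generic illustration of the kind of multiserver system involved, below is a minimal discrete-event simulation of an M/M/k queue under first-come-first-served scheduling; the arrival rate, service rate, and number of servers are placeholder parameters.

    import heapq
    import numpy as np

    def mmk_mean_response_time(lam, mu, k, n_jobs=200_000, seed=0):
        """Simulate an M/M/k FCFS queue and return the mean response (sojourn) time."""
        rng = np.random.default_rng(seed)
        arrivals = np.cumsum(rng.exponential(1 / lam, n_jobs))   # Poisson arrival times
        services = rng.exponential(1 / mu, n_jobs)               # exponential job sizes
        free_at = [0.0] * k                                      # when each server next frees up
        heapq.heapify(free_at)
        total = 0.0
        for arr, svc in zip(arrivals, services):
            start = max(arr, heapq.heappop(free_at))   # FCFS: take the earliest-free server
            done = start + svc
            heapq.heappush(free_at, done)
            total += done - arr                        # response time of this job
        return total / n_jobs

    # Sanity check against the M/M/1 formula E[T] = 1 / (mu - lambda).
    print(mmk_mean_response_time(lam=0.7, mu=1.0, k=1))   # about 1 / (1 - 0.7) = 3.33
    print(mmk_mean_response_time(lam=1.4, mu=1.0, k=2))   # two servers at the same load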
March 16, 2022
Speaker: Horia Mania (LIDS)
We revisit a model for time-varying linear regression that assumes the unknown parameters evolve according to a linear dynamical system. Counterintuitively, we show that when the underlying dynamics are stable, the parameters of this model...
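To make the data model concrete, here is a minimal sketch (under assumed notation; the matrix A, the covariates, and the noise levels are placeholders, and the talk's estimator and analysis may differ) of time-varying linear regression with parameters driven by a stable linear dynamical system, tracked by a standard Kalman-filter recursion.

    import numpy as np

    rng = np.random.default_rng(0)
    d, T, noise = 2, 500, 0.1

    # Parameters theta_t evolve according to a stable linear dynamical system.
    A = np.array([[0.99, 0.05],
                  [-0.05, 0.99]])            # spectral radius < 1, i.e., stable dynamics
    theta = np.array([1.0, -1.0])

    # Kalman-filter tracking from scalar observations y_t = <x_t, theta_t> + eps_t.
    est, P = np.zeros(d), np.eye(d)          # state estimate and its covariance
    Q, R = 1e-6 * np.eye(d), noise ** 2      # assumed process / observation noise levels
    errs = []
    for t in range(T):
        x = rng.normal(size=d)               # covariates at time t
        y = x @ theta + noise * rng.normal()
        est, P = A @ est, A @ P @ A.T + Q    # predict: parameters follow the dynamics
        S = x @ P @ x + R                    # update with the scalar measurement y
        K = P @ x / S
        est = est + K * (y - x @ est)
        P = P - np.outer(K, x @ P)
        errs.append(np.linalg.norm(est - theta))
        theta = A @ theta                    # true parameters move on
    print(errs[0], errs[-1])                 # tracking error is much smaller at the end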
March 30, 2022
Speaker: Zeyu Jia (LIDS)
Consider an empirical measure $P_n$ induced by $n$ i.i.d. samples from a $d$-dimensional $K$-subgaussian distribution $P$. We show that when $K < \sigma$, the Wasserstein distance $W_2(P_n \ast \mathcal{N}(0, \sigma^2 I_d), P \ast \mathcal{N}(0, \sigma^2 I_d))$ converges at the parametric rate $O(1/n)$, and when...
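As a rough numerical illustration of the smoothed Wasserstein distance in one dimension (a crude Monte-Carlo approximation for intuition only, not part of the talk), one can compare large samples drawn from $P_n \ast \mathcal{N}(0,\sigma^2)$ and $P \ast \mathcal{N}(0,\sigma^2)$, using the fact that the 1-D $W_2$ distance between two equal-size empirical measures reduces to sorting. Here $P = \mathcal{N}(0, K^2)$ and the values of $K$, $\sigma$, and $m$ are placeholders.

    import numpy as np

    def smoothed_w2(n, K=0.5, sigma=1.0, m=200_000, seed=0):
        """Crude Monte-Carlo estimate of W2(P_n * N(0, sigma^2), P * N(0, sigma^2)) in d = 1,
        with P = N(0, K^2) and K < sigma."""
        rng = np.random.default_rng(seed)
        data = rng.normal(0, K, n)                           # the n samples defining P_n
        # Sample from P_n * N: pick a data point uniformly, add Gaussian noise.
        a = data[rng.integers(0, n, m)] + rng.normal(0, sigma, m)
        # Sample from P * N: a fresh draw from N(0, K^2 + sigma^2).
        b = rng.normal(0, np.sqrt(K ** 2 + sigma ** 2), m)
        # In d = 1, W2 between equal-size empirical measures is the RMS gap of sorted samples.
        return np.sqrt(np.mean((np.sort(a) - np.sort(b)) ** 2))

    for n in (100, 1000, 10000):
        print(n, smoothed_w2(n))   # resolution is limited by the Monte-Carlo sample size m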
April 6, 2022
Speaker: Thien Le (LIDS)
The implicit bias induced by the training of neural networks has become a topic of rigorous study. In the limit of gradient flow and gradient descent with appropriate step size, it has been shown that when one trains a deep linear network...
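To fix ideas (a toy sketch, not the talk's exact setting or result), the snippet below trains a depth-3 linear network $W_1 W_2 W_3$ with plain gradient descent from small random initialization on an underdetermined least-squares problem; implicit-bias results ask which of the many interpolating end-to-end maps such dynamics converge to. The dimensions, step size, and the minimum-norm reference point are assumptions made for illustration.

    import numpy as np

    rng = np.random.default_rng(0)
    n, d, h = 5, 20, 20                           # underdetermined: many interpolating solutions
    X = rng.normal(size=(n, d))
    w_star = rng.normal(size=d) / np.sqrt(d)
    y = X @ w_star                                # noiseless linear targets

    # Depth-3 linear network f(x) = x @ W1 @ W2 @ W3, trained with plain gradient descent.
    W1 = 0.1 * rng.normal(size=(d, h))
    W2 = 0.1 * rng.normal(size=(h, h))
    W3 = 0.1 * rng.normal(size=(h, 1))
    lr = 1e-2
    for _ in range(30_000):
        r = X @ W1 @ W2 @ W3 - y[:, None]         # residuals, shape (n, 1)
        G = X.T @ r / n                           # gradient of the loss w.r.t. the end-to-end map
        g1, g2, g3 = G @ (W2 @ W3).T, W1.T @ G @ W3.T, (W1 @ W2).T @ G
        W1, W2, W3 = W1 - lr * g1, W2 - lr * g2, W3 - lr * g3

    # The object implicit-bias analyses study: the end-to-end linear predictor W1 @ W2 @ W3.
    w_end = (W1 @ W2 @ W3).ravel()
    w_min_norm = np.linalg.lstsq(X, y, rcond=None)[0]   # minimum-norm interpolant, for reference
    print(np.linalg.norm(X @ w_end - y))                # training residual should be near zero
    print(np.linalg.norm(w_end - w_min_norm))           # distance to the min-norm reference point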
April 13, 2022
Speaker: Xinyi Wu (IDSS)
Identifying communities is a central problem in network science. While many methods exist, one major challenge in community detection is the often overlapping nature of communities such that nodes belong to multiple groups simultaneously....
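As a point of reference (a classical baseline for overlapping communities, not the method of the talk), clique percolation lets nodes belong to several communities at once; the sketch below runs networkx's implementation on a toy graph of two dense groups sharing one node.

    import networkx as nx
    from networkx.algorithms.community import k_clique_communities

    # Toy graph: two 5-node cliques sharing node 4, so the communities overlap.
    G = nx.complete_graph(5)                                                  # nodes 0-4
    G.add_edges_from((i, j) for i in range(4, 9) for j in range(i + 1, 9))    # nodes 4-8
    communities = list(k_clique_communities(G, 3))    # 3-clique percolation
    print(communities)                                # node 4 appears in both communities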
April 20, 2022
Speaker: Sarah Cen (LIDS)
By filtering the content that users see, social media platforms have the ability to influence users' perceptions and decisions, from their dining choices to their voting preferences. This influence has drawn scrutiny, with many calling for...
April 27, 2022
Speaker: Björn Lütjens (AeroAstro)
Would a global carbon tax reduce the flood risk at MIT? The answer to this question of local impact and risk is critical for policy making or climate-resilient infrastructure development. But localized climate models are computationally too...