# LIDS & Stats Tea

Tea talks are 20-minute informal chalk talks for sharing ideas and making others aware of topics that may be of interest to the LIDS and Stats audience. If you are interested in presenting in an upcoming seminar, please email lids_stats_tea[at]mit[dot]edu.

February 2, 2022

### Datamodels: Understanding Model Predictions as Functions of Data

Sung Min (Sam) Park (CSAIL)

Current supervised machine learning models rely on an abundance of training data. Yet, understanding the underlying structure and biases of this data—and how they impact models—remains challenging. We present a new conceptual framework, datamodels,...

February 9, 2022

### Near-Optimal Learning of Extensive-Form Games with Imperfect Information

Tiancheng Yu (LIDS)

March 9, 2022

### WCFS Queues: A New Analysis Framework

Isaac Grosof (CMU)

In this talk, I investigate four queueing models that are key to understanding the behavior of modern computing systems. Each was previously considered intractable to analyze. However, we discovered a subtle similarity between these models, which we...

March 16, 2022

### Time varying regression with hidden linear dynamics

Horia Mania (LIDS)

We revisit a model for time-varying linear regression that assumes the unknown parameters evolve according to a linear dynamical system. Counterintuitively, we show that when the underlying dynamics are stable, the parameters of this model can be...

March 30, 2022

### Non-parametric threshold for smoothed empirical Wasserstein distance

Zeyu Jia (LIDS)

Consider an empirical measure P_n induced by n i.i.d. samples from a d-dimensional K-subgaussian distribution P. We show that when K < σ, the Wasserstein distance W_2(P_n ∗ N(0, σ²I_d), P ∗ N(0, σ²I_d)) converges at the parametric rate O(1/n), and when K >...

April 6, 2022

### Training invariances and the low-rank phenomenon: beyond linear networks

Thien Le (LIDS)

The implicit bias induced by the training of neural networks has become a topic of rigorous study. In the limit of gradient flow and gradient descent with appropriate step size, it has been shown that when one trains a deep linear network with...

April 13, 2022

### Higher-order network information improves overlapping community detection

Xinyi Wu (IDSS)

Identifying communities is a central problem in network science. While many methods exist, one major challenge in community detection is the often overlapping nature of communities such that nodes belong to multiple groups simultaneously. Link-based...

April 20, 2022

### Auditing algorithmic filtering on social media

Sarah Cen (LIDS)

By filtering the content that users see, social media platforms have the ability to influence users' perceptions and decisions, from their dining choices to their voting preferences. This influence has drawn scrutiny, with many calling for...

April 27, 2022

### The Climate Pocket: Exploring Local Climate Impacts with Physics-informed Machine Learning

Björn Lütjens (AeroAstro)

Would a global carbon tax reduce the flood risk at MIT? The answer to this question of local impact and risk is critical for policy making and climate-resilient infrastructure development. But localized climate models are computationally too...

May 4, 2022