LIDS & Stats Tea

Tea talks are 20-minute-long informal chalk talks meant to share ideas and make others aware of topics that may be of interest to the LIDS and Stats audience. If you are interested in presenting at an upcoming seminar, please email lids_stats_tea[at]mit[dot]edu.

February 7, 2018

On Unlimited Sampling

Ayush Bhandari (MIT Media Lab)

Shannon's sampling theorem provides a link between the continuous and the discrete realms, stating that bandlimited signals are uniquely determined by their values on a discrete set. This theorem is realized in practice using so-called analog-to-...
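The classical reconstruction behind the theorem can be illustrated with a small sketch (this is only the textbook sinc-interpolation formula, not the unlimited-sampling method of the talk): a sinusoid sampled above its Nyquist rate is rebuilt at an off-grid time instant from its samples.

```python
import numpy as np

# Illustrative sketch of Shannon reconstruction (not the talk's method):
# a 3 Hz sinusoid is bandlimited, so sampling at fs = 10 Hz > 2 * 3 Hz
# lets us recover off-grid values via sinc interpolation,
#   x(t) = sum_n x[nT] * sinc((t - nT) / T).
fs = 10.0                       # sampling rate (Hz)
T = 1.0 / fs
n = np.arange(-200, 201)        # truncated "infinite" sample grid
samples = np.sin(2 * np.pi * 3 * n * T)

def sinc_reconstruct(t, samples, n, T):
    """np.sinc is the normalized sinc, sin(pi u) / (pi u)."""
    return np.sum(samples * np.sinc((t - n * T) / T))

t0 = 0.0137                     # an off-grid time instant
approx = sinc_reconstruct(t0, samples, n, T)
exact = np.sin(2 * np.pi * 3 * t0)
print(abs(approx - exact))      # small truncation error
```

The residual error here comes only from truncating the infinite interpolation sum to 401 samples.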

February 14, 2018

Experiment-of-experiments Designs for Causal Inference with Network Interference

Jean Pouget-Abadie (Harvard)

Tech companies rely on online experiments to understand the impact of changes to their ecosystem or product. The classic assumption on which these experiments depend is that of no interference amongst users, that is, that the outcome of a user...

February 21, 2018

A Voronoi-Based Approach to Multi-Agent Pursuit-Evasion Games

Alyssa Pierson (CSAIL)

Multi-agent pursuit-evasion games are relevant to a number of applications of emerging importance, such as security and surveillance, search and rescue, and wildlife monitoring. Originally inspired by the "cops and robbers" game, the goal is to design pursuer...

February 28, 2018

Breaking the n^(-1/2) Barrier for Permutation-Based Ranking Models

Cheng Mao (Mathematics Department, MIT)

There has been a recent surge of interest in studying permutation-based models, such as the noisy sorting (NS) model and the strong stochastic transitivity (SST) model, for ranking from pairwise comparisons. Although permutation-based ranking models...

March 7, 2018

Private Sequential Learning

Zhi Xu (LIDS)

We formulate a private learning model to study an intrinsic tradeoff between privacy and query complexity in sequential learning. Our model involves a learner who aims to determine a scalar value, v*, by sequentially querying an external database...
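To make the privacy/query-complexity tension concrete, here is a hedged illustration of the natural non-private baseline only, not the paper's algorithm: bisection search locates v* with few threshold queries, but the query transcript itself reveals v* to any observer.

```python
# Illustration of the non-private baseline (not the talk's scheme):
# bisection locates a scalar v* in [0, 1] with ~log2(1/eps) queries of
# the form "is v* > q?", but an adversary who observes the queries can
# read v* off the transcript -- the leak a private scheme must avoid.
def bisection_locate(oracle, lo=0.0, hi=1.0, eps=1e-6):
    """Find v* to within eps; record the queries the adversary sees."""
    queries = []
    while hi - lo > eps:
        q = (lo + hi) / 2
        queries.append(q)          # visible to the adversary
        if oracle(q):              # True iff v* > q
            lo = q
        else:
            hi = q
    return (lo + hi) / 2, queries

v_star = 0.7231
estimate, transcript = bisection_locate(lambda q: v_star > q)
print(abs(estimate - v_star))       # learner is accurate...
print(abs(transcript[-1] - v_star)) # ...but the last query pins down v*
```

The final queries cluster around v*, so accuracy for the learner implies accuracy for the eavesdropper; quantifying the extra queries needed to break this link is the tradeoff the talk studies.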

March 14, 2018

High Dimensional Linear Regression using Lattice Basis Reduction

Ilias Zadik (ORC)

In this talk, we focus on the high dimensional linear regression problem where the goal is to efficiently recover an unknown vector β* from n noisy linear observations Y = Xβ* + W ∈ ℝ^n, for known X ∈ ℝ^(n×p) and unknown W ∈ ℝ^n. Unlike most of the...

March 21, 2018

Relaxed Locally Correctable Codes

Govind Ramnarayan (CSAIL)

Locally decodable codes (resp. locally correctable codes), or LDCs (resp. LCCs), are codes for which individual symbols of the message (resp. codeword) can be recovered by reading just a few bits from a noisy codeword, which is corrupted with ...

April 4, 2018

Minimal I-MAP MCMC for Scalable Structure Discovery in Causal DAG Models

Raj Agrawal (LIDS)

Learning a Bayesian network (BN) from data can be useful for decision-making or discovering causal relationships, but traditional methods can fail in modern applications, which often exhibit a larger number of observed variables than data points....

April 11, 2018

INSPECTRE: Privately Estimating the Unseen

Gautam Kamath (CSAIL)

Gautam Kamath will discuss some of his recent work on estimating distribution properties, with the additional constraint of guaranteeing the privacy of the sample. Some properties of interest include entropy, support size, and support coverage, which...

April 18, 2018

Time Series Analysis via Matrix Estimation

Anish Agarwal (LIDS)

We consider the task of interpolating and forecasting a time series in the presence of noise and missing data. As the main contribution of this work, we introduce an algorithm that transforms the observed time series into a matrix, utilizes singular...
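The general recipe the abstract describes can be sketched in a few lines (a toy version under simplifying assumptions, not the paper's algorithm): fold the noisy series into a matrix, denoise it with a truncated SVD, and unfold.

```python
import numpy as np

# Toy sketch of the matrix-estimation recipe (details differ in the talk):
# fold a noisy time series into a matrix, project onto its top singular
# directions, and unfold to get a denoised series.
rng = np.random.default_rng(1)
t = np.arange(400)
signal = np.sin(2 * np.pi * t / 50)              # smooth latent series
observed = signal + 0.3 * rng.standard_normal(400)

M = observed.reshape(20, 20)                     # "Page matrix" folding
U, s, Vt = np.linalg.svd(M, full_matrices=False)
k = 2                                            # a sinusoid folds to rank 2
M_hat = (U[:, :k] * s[:k]) @ Vt[:k]              # truncated-SVD denoising
denoised = M_hat.ravel()

err_before = np.mean((observed - signal) ** 2)
err_after = np.mean((denoised - signal) ** 2)
print(err_after < err_before)                    # low-rank projection helps
```

The folding works here because each row of M is a contiguous segment of a sinusoid, which by the angle-addition formula makes the noiseless matrix exactly rank 2; the rank-2 projection then discards most of the noise.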

April 25, 2018


Suhas Kowshik (LIDS)

May 9, 2018


Omer Tanovic (LIDS)

May 16, 2018


Igor Gilitschenski (CSAIL)