LIDS & Stats Tea

Tea talks are 20-minute informal chalk-talks for sharing ideas and raising awareness of topics that may interest the LIDS and Stats audience. If you are interested in presenting in an upcoming seminar, please email lids_stats_tea[at]mit[dot]edu

February 8, 2017

Maximum likelihood estimation of determinantal point processes

Victor-Emmanuel Brunel (Math)

Determinantal point processes (DPPs) are a very useful and elegant tool for modeling repulsive interactions, and hence have become very popular in data science and machine learning, among other fields. From a learning perspective, many estimators of...
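As background for the abstract above, a common DPP parameterization (not necessarily the one used in the talk) is the L-ensemble, which assigns a subset S the probability det(L_S)/det(L + I); the log of this quantity is the per-observation log-likelihood that an MLE would maximize. A minimal sketch, with an arbitrarily chosen 2x2 kernel:

```python
import numpy as np
from itertools import chain, combinations

def dpp_log_likelihood(L, S):
    """Log-probability of subset S under the L-ensemble DPP: det(L_S)/det(L+I)."""
    n = L.shape[0]
    L_S = L[np.ix_(S, S)]
    # The determinant of the empty (0x0) submatrix is 1 by convention.
    _, logdet_S = np.linalg.slogdet(L_S) if len(S) > 0 else (1.0, 0.0)
    _, logdet_norm = np.linalg.slogdet(L + np.eye(n))
    return logdet_S - logdet_norm

# Sanity check: probabilities over all subsets of {0, 1} sum to one.
L = np.array([[1.0, 0.4], [0.4, 0.7]])
subsets = chain.from_iterable(combinations(range(2), k) for k in range(3))
total = sum(np.exp(dpp_log_likelihood(L, list(S))) for S in subsets)
# total is approximately 1.0
```

The repulsiveness shows up in the determinant: similar items (large off-diagonal entries of L) make det(L_S) small, so subsets of similar items are unlikely.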

February 15, 2017

Censored Demand Estimation in Retail

Jehangir Amjad (LIDS)

Our goal is to estimate the true demand of a product at a given store location and time period in the retail environment based on a single noisy and potentially censored observation. We introduce a framework to make inference from multiple time...

February 22, 2017

Peter Krafft - talk canceled

Peter Krafft (Media Lab)

Today's LIDS & Stats Tea talk has been canceled - our apologies for any inconvenience.

March 1, 2017

Communication complexity and lifting theorems

Shalev Ben-David (CSAIL)

I'll talk about some recent developments in the field of communication complexity. Specifically, I'll explain the concept of a lifting theorem, which connects the communication complexity model to the more tractable query complexity model.

March 8, 2017

The minimax algorithm for a sequential prediction game with square loss

Alan Malek (IDSS)

Consider the following game-theoretic model of sequential prediction: at each round 1 through T, the learner plays an action, the opponent observes this action and plays a response, and the learner incurs the squared difference as a loss. The...
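The protocol described in the abstract can be sketched as a simple loop; note that the running-mean learner below is only a stand-in for illustration, not the minimax strategy the talk derives, and the outcome range [0, 1] is an assumption:

```python
# Sketch of the sequential prediction protocol with square loss.
# A simple running-mean learner stands in for the minimax strategy;
# outcomes y_t are assumed to lie in [0, 1].

def play_game(learner, outcomes):
    """Run T rounds; return the learner's cumulative square loss."""
    history, total_loss = [], 0.0
    for y in outcomes:
        a = learner(history)          # learner commits to an action first
        total_loss += (a - y) ** 2    # opponent's response incurs square loss
        history.append(y)
    return total_loss

def running_mean(history):
    """Predict the average of past outcomes (0.5 with no history)."""
    return sum(history) / len(history) if history else 0.5

loss = play_game(running_mean, [0.0, 1.0, 1.0, 0.0])
```

The key feature of the model is the order of play: the opponent sees the learner's action before responding, which is what makes the minimax analysis nontrivial.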

March 22, 2017

Principal Differences Analysis: Interpretable Characterization of Differences between Distributions

Jonas Mueller (CSAIL)

I will introduce principal differences analysis (PDA), a method for analyzing differences between high-dimensional distributions which operates by finding the projection that maximizes the statistical divergence between the resulting univariate...

April 5, 2017

High Dimensional Linear Regression: Mean Squared Error and Phase Transitions

Ilias Zadik (ORC)

In this talk we will focus on the sparse high dimensional regression Y=X\beta^{*}+W, where X is an n\times p matrix with i.i.d. standard normal entries, W is an n\times 1 vector with i.i.d. N(0,\sigma^{2}) entries, and \beta^{*} is a p\times 1 binary...
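The model in the abstract is easy to simulate; the sketch below generates one instance with arbitrarily chosen dimensions n, p, sparsity k, and noise level sigma (none of these values come from the talk):

```python
import numpy as np

# Generate one instance of the abstract's model Y = X beta* + W:
# X is n x p with i.i.d. standard normal entries, W is i.i.d. N(0, sigma^2),
# and beta* is a p x 1 binary vector with k nonzero entries.
rng = np.random.default_rng(0)
n, p, k, sigma = 50, 200, 5, 0.1   # illustrative values only

X = rng.standard_normal((n, p))
beta_star = np.zeros(p)
beta_star[rng.choice(p, size=k, replace=False)] = 1.0  # k-sparse binary signal
W = sigma * rng.standard_normal(n)
Y = X @ beta_star + W
```

With p much larger than n, recovering the support of beta* from (X, Y) is the statistical question; phase-transition results of the kind the title mentions describe how the achievable mean squared error changes as n, p, k, and sigma scale.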

April 12, 2017

Graph Signal Processing with Applications to Network Topology Inference

Santiago Segarra (IDSS)

Advancing a holistic theory of networks necessitates fundamental breakthroughs in modeling, identification, and controllability of distributed network processes – often conceptualized as signals defined on the vertices of a graph. Under the assumption...

April 19, 2017

Ludwig Schmidt

Ludwig Schmidt (CSAIL)

April 26, 2017

Peter Krafft

Peter Krafft (Media Lab)

May 3, 2017

Igor Kadota

Igor Kadota (LIDS)

May 10, 2017

Diego Cifuentes

Diego Cifuentes (LIDS)

May 17, 2017

Christos Thrampoulidis

Christos Thrampoulidis (LIDS)