Wednesday, October 16, 2024 - 4:00pm
Event Calendar Category
LIDS & Stats Tea
Speaker Name
Yassir Jedra
Affiliation
LIDS
Building and Room Number
32-D650
Building and Room Number
LIDS Lounge
“Exploiting Observation Bias to Improve Matrix Completion”
We consider a variant of matrix completion where entries are revealed in a biased manner. Our aim is to understand the extent to which such bias can be exploited to improve predictions. To that end, we propose a natural model where the observation pattern and the outcome of interest are driven by the same set of underlying latent (or unobserved) factors. We devise Mask Nearest Neighbor (MNN), a novel two-stage matrix completion algorithm: first, it recovers (distances between) the latent factors by applying matrix estimation to the fully observed noisy binary matrix corresponding to the observation pattern; second, it uses the recovered latent factors as features and the sparsely observed noisy outcomes as labels to perform non-parametric supervised learning. Our analysis reveals that MNN enjoys entry-wise finite-sample error rates that are competitive with the corresponding parametric rates for supervised learning. Despite having no access to the latent factors and dealing with biased observations, MNN achieves this competitive performance by exploiting only the information shared between the bias and the outcomes. Finally, in an empirical evaluation on a real-world dataset, we find that MNN's estimates have a 28x smaller mean squared error than those of traditional matrix completion methods, suggesting the utility of the proposed model and method.
This is based on the following work: https://arxiv.org/html/2306.04775v2
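For readers curious about the mechanics, the following is a minimal Python sketch of a two-stage estimator in the spirit of the abstract's description. It is not the authors' implementation: the truncated-SVD estimate of the latent row factors, the rank, and the number of neighbors are illustrative choices made here for concreteness; see the linked preprint for the actual MNN algorithm and its guarantees.

```python
# Minimal sketch of a two-stage "mask nearest neighbor" style estimator.
# This follows the high-level description in the abstract only; it is NOT
# the authors' implementation, and all names/parameters are illustrative.
import numpy as np

def mnn_sketch(Y, mask, rank=5, n_neighbors=10):
    """Estimate missing entries of Y using the observation mask itself.

    Stage 1: treat the fully observed binary mask as a noisy low-rank
    matrix and recover (distances between) latent row factors via a
    truncated SVD (an illustrative matrix-estimation step).
    Stage 2: for each target entry, average the observed outcomes from the
    rows whose estimated latent factors are nearest to the target row
    (non-parametric, nearest-neighbor supervised learning).
    """
    n, m = Y.shape

    # --- Stage 1: spectral estimate of row latent factors from the mask ---
    U, s, _ = np.linalg.svd(mask.astype(float), full_matrices=False)
    row_factors = U[:, :rank] * s[:rank]           # n x rank embedding
    # Pairwise distances between estimated latent row factors.
    diffs = row_factors[:, None, :] - row_factors[None, :, :]
    dist = np.linalg.norm(diffs, axis=-1)          # n x n distance matrix

    # --- Stage 2: nearest-neighbor averaging of observed outcomes ---
    Y_hat = np.zeros_like(Y, dtype=float)
    for i in range(n):
        neighbors = np.argsort(dist[i])            # closest rows first
        for j in range(m):
            # Use observed entries in column j from the nearest rows.
            donors = [k for k in neighbors if mask[k, j]][:n_neighbors]
            if donors:
                Y_hat[i, j] = Y[donors, j].mean()
    return Y_hat
```

The key point the sketch tries to convey is that the binary observation pattern is itself treated as data: because it shares latent structure with the outcomes, distances estimated from the mask can be reused to decide which observed outcomes are informative about each missing entry.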
Yassir Jedra is currently a postdoctoral researcher in LIDS at MIT, working with Devavrat Shah. Previously, he completed his PhD in Electrical Engineering at KTH Royal Institute of Technology, where he was advised by Alexandre Proutiere. During his PhD, he was also part of the WASP AI program. Before that, he received an MSc degree in Applied and Computational Mathematics within the Engineering Physics program at KTH, and both his BSc and MSc degrees in Mathematics and Computer Science from ENSIMAG -- Grenoble-INP, in 2015 and 2018, respectively.
His research interests broadly revolve around developing algorithmic foundations for sequential learning and control of dynamical systems under uncertainty. In particular, he has been working on topics in reinforcement learning, multi-armed bandits, system identification and adaptive control, with the aim of understanding how structural characterizations (of practical relevance) can be exploited to enable fast and efficient learning. Recently, he has also been working on topics at the intersection of recommender systems and causal inference.
ABOUT LIDS and STATS TEA TALKS:
Tea talks are informal 20-minute talks for sharing ideas and raising awareness of topics of interest to the LIDS and Stats communities. Talks are followed by light refreshments and stimulating conversation.
Email lids_stats_teas[at]mit[dot]edu for information about LIDS & Stats Tea Talks
Sign up to present at LIDS & Stats Tea Talks
LIDS & Stats Tea Talks Committee: Maison Clouatre, Subham Saha, Ashkan Soleymani, Jia Wan