Latent Variable Model Estimation via Collaborative Filtering

Monday, July 31, 2017 - 2:00pm to Tuesday, August 1, 2017 - 1:55pm

Event Calendar Category

LIDS Thesis Defense

Speaker Name

Christina E. Lee

Affiliation

Laboratory for Information & Decision Systems

Building and Room number

32-G449

Abstract

Similarity-based collaborative filtering for matrix completion is a popular heuristic that has been used widely across industry in recent decades to build recommendation systems, due to its simplicity and scalability. However, despite its popularity, there has been little theoretical foundation explaining its widespread success. In this thesis, we prove theoretical guarantees for collaborative filtering under a nonparametric latent variable model, which arises from the natural property of "exchangeability", i.e. invariance under relabeling of the dataset. The analysis suggests that similarity-based collaborative filtering can be viewed as kernel regression for latent variable models, where the features are not directly observed and the kernel must be estimated from the data. In addition, while classical collaborative filtering typically requires a dense dataset, this thesis proposes a new collaborative filtering algorithm which compares larger-radius neighborhoods of data to compute similarities, and shows that the estimate converges even for very sparse datasets, with implications for sparse graphon estimation. The algorithms can be applied in a variety of settings, such as recommendations for online markets, analysis of social networks, or denoising crowdsourced labels.
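To make the classical heuristic concrete, the following is a minimal illustrative sketch of similarity-based collaborative filtering (a plain user-user nearest-neighbor estimate, not the thesis's larger-radius-neighborhood algorithm); the function name, similarity measure, and parameters are assumptions chosen for illustration.

```python
import numpy as np

def similarity_cf_estimate(R, mask, u, i, top_k=2):
    """Estimate the missing entry R[u, i] by averaging the ratings of the
    users most similar to u, where similarity is computed from commonly
    observed entries. Illustrative sketch only.

    R    : ratings matrix (2D float array)
    mask : boolean array, True where R is observed
    """
    n = R.shape[0]
    sims = []
    for v in range(n):
        if v == u or not mask[v, i]:
            continue
        # entries observed by both users, excluding the target column i
        common = mask[u] & mask[v]
        common[i] = False
        if common.sum() < 1:
            continue
        # similarity = negative mean squared difference on the overlap
        diff = R[u, common] - R[v, common]
        sims.append((-np.mean(diff ** 2), v))
    if not sims:
        return np.nan
    sims.sort(reverse=True)
    neighbors = [v for _, v in sims[:top_k]]
    # kernel-regression flavor: a (here unweighted) average over neighbors
    return float(np.mean([R[v, i] for v in neighbors]))

# Example: user 0 agrees with user 1 on the observed overlap,
# so the estimate borrows user 1's rating for item 2.
R = np.array([[5.0, 4.0, 0.0],
              [5.0, 4.0, 3.0],
              [1.0, 2.0, 5.0]])
mask = np.array([[True, True, False],
                 [True, True, True],
                 [True, True, True]])
estimate = similarity_cf_estimate(R, mask, 0, 2, top_k=1)
print(estimate)  # 3.0
```

Note that this classical variant needs each pair of users to share many observed entries; the thesis's contribution is precisely to relax this density requirement by comparing larger-radius neighborhoods instead of direct overlaps.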

THESIS COMMITTEE
Prof. Asuman Ozdaglar (Thesis Supervisor)
Prof. Devavrat Shah (Thesis Supervisor)
Dr. Christian Borgs
Dr. Jennifer Chayes