Theoretical Study of Two Prediction-Centric Problems: Graphical Model Learning and Recommendations

Wednesday, August 30, 2017 - 10:00am to Thursday, August 31, 2017 - 9:55am

Event Calendar Category

LIDS Thesis Defense

Speaker Name

Mina Karzand

Affiliation

Laboratory for Information and Decision Systems

Building and Room number

36-428

Abstract

PART I. Learning a tree-structured Ising model:

We study the problem of learning a tree Ising model from samples such that subsequent predictions based on partial observations are accurate. Virtually all previous work on graphical model learning has focused on recovering the true underlying graph. We define a distance ("small set TV" or ssTV) between distributions P and Q by taking the maximum, over all subsets S of a given size, of the total variation between the marginals of P and Q on S; this distance captures the accuracy of the prediction task of interest.
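
For concreteness, the ssTV distance can be restated as the following formula (a minimal restatement of the definition above; the symbol k for the subset size is introduced here and does not appear in the abstract):

\mathrm{ssTV}_k(P, Q) \;=\; \max_{S:\,|S| = k} \; \| P_S - Q_S \|_{\mathrm{TV}},

where P_S and Q_S denote the marginals of P and Q on the subset S, and the maximum ranges over all subsets S of size k.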

We derive non-asymptotic bounds on the number of samples needed to get a distribution (from the same class) with small ssTV relative to the one generating the samples. An implication is that far fewer samples are needed for accurate predictions than for recovering the underlying tree.

PART II. Optimal online algorithms for a latent variable model of recommendation systems:

We consider an online model for recommendation systems, in which each user is recommended an item at each time step and provides 'like' or 'dislike' feedback. The user preferences are specified via a latent variable model: both users and items are clustered into types. The model captures structure in both the item and user spaces, and our focus is on the simultaneous use of both structures. When the type preference matrix is randomly generated, we provide a sharp analysis of the best possible regret obtainable by any algorithm.
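
As an illustrative sketch of this latent variable model (not the algorithm analyzed in the thesis), the setup can be simulated as follows; all names and sizes here (num_users, num_user_types, type_pref, and so on) are hypothetical choices made only for the example.

import numpy as np

# Hypothetical simulation of the latent variable model described above.
rng = np.random.default_rng(0)

num_users, num_items = 100, 50
num_user_types, num_item_types = 4, 3

# Each user and each item is assigned a latent type (cluster).
user_type = rng.integers(num_user_types, size=num_users)
item_type = rng.integers(num_item_types, size=num_items)

# Randomly generated type preference matrix: +1 ("like") or -1 ("dislike")
# for each (user type, item type) pair.
type_pref = rng.choice([-1, +1], size=(num_user_types, num_item_types))

def feedback(user, item):
    """Feedback a user gives when recommended an item: +1 like, -1 dislike."""
    return type_pref[user_type[user], item_type[item]]

# Online interaction: at each time step, recommend one item to each user and
# observe the feedback. A real algorithm would choose recommendations
# adaptively from past feedback; here a random baseline is used.
for t in range(5):
    recs = rng.integers(num_items, size=num_users)
    obs = np.array([feedback(u, recs[u]) for u in range(num_users)])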

Thesis Committee:
Prof. Lizhong Zheng (advisor), Prof. Guy Bresler (advisor)
Prof. Muriel Médard, Prof. Gregory Wornell