Thesis Defense: Divergence Covering

Wednesday, December 8, 2021 - 11:00am to 1:00pm

Event Calendar Category

LIDS Thesis Defense

Speaker Name

Jennifer Tang

Affiliation

EECS & LIDS

Building and Room number

32-D677

Zoom meeting id

95986368226

Join Zoom meeting

https://mit.zoom.us/j/95986368226

Abstract

Thesis Committee: Prof. Yury Polyanskiy (Supervisor), Prof. Meir Feder, Prof. Alexander Rakhlin, Prof. Gregory Wornell

A longstanding problem of interest is that of finding covering numbers. An important measure of dissimilarity between probability distributions is Kullback-Leibler (KL) divergence. Both topics have been studied extensively in various contexts, and in this thesis we focus on the problem that arises when the two are combined. This combination yields techniques for deriving useful bounds on a number of important problems in information theory.
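For concreteness, one standard way to formalize the two objects being combined (the notation below is ours, not necessarily the thesis's) is

  D(P \| Q) = \sum_x P(x) \log \frac{P(x)}{Q(x)},

  M(\Delta_k, \varepsilon) = \min\big\{ M : \exists\, Q_1, \dots, Q_M \in \Delta_k \text{ with } \sup_{P \in \Delta_k} \min_{1 \le i \le M} D(P \| Q_i) \le \varepsilon \big\},

where \Delta_k denotes the simplex of probability distributions on an alphabet of size k.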

Our goal is to explore covering the probability simplex in terms of KL divergence. Several properties of KL divergence (e.g., it is not symmetric, does not satisfy a triangle inequality, and can blow up to infinity) make it unintuitive and difficult to analyze with traditional methods. We study covering discrete large-alphabet probability distributions under both worst-case and average-case divergence distance and examine the implications of the resulting divergence covering numbers. One implication of worst-case divergence covering is determining how to communicate probability distributions under limited communication bandwidth. Another is in universal compression and universal prediction, where the divergence covering number provides upper bounds on minimax risk. A third application is computing the capacity of the noisy permutation channel. We then use average-case divergence covering to study efficient algorithms for quantizing large-alphabet distributions in order to save storage space.
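To sketch why a covering yields compression bounds (the generic covering argument, not necessarily the thesis's sharpest one): if Q_1, \dots, Q_M cover \Delta_k to worst-case divergence \varepsilon, then coding an i.i.d. length-n source against the uniform mixture of product distributions gives

  D\Big(P^{\otimes n} \,\Big\|\, \tfrac{1}{M} \sum_i Q_i^{\otimes n}\Big) \le \log M + n \min_i D(P \| Q_i) \le \log M(\Delta_k, \varepsilon) + n\varepsilon,

so the minimax redundancy is at most \inf_{\varepsilon > 0} \big( \log M(\Delta_k, \varepsilon) + n\varepsilon \big).

As a toy numerical illustration (our sketch, not a construction from the thesis; all names and parameters are illustrative), the following Python snippet builds a naive covering of a small simplex from add-one-smoothed grid points and estimates its worst-case divergence by random sampling:

    import itertools
    import numpy as np

    def kl(p, q):
        # KL divergence D(p || q) in nats; terms with p[x] == 0 contribute 0.
        mask = p > 0
        return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

    def grid_centers(k, n):
        # All distributions (c_1/n, ..., c_k/n) with integer counts summing
        # to n, add-one smoothed so every center has full support (this keeps
        # the divergence to every center finite).
        centers = []
        for counts in itertools.product(range(n + 1), repeat=k - 1):
            rest = n - sum(counts)
            if rest >= 0:
                c = np.array(counts + (rest,), dtype=float)
                centers.append((c + 1.0) / (n + k))
        return np.array(centers)

    rng = np.random.default_rng(0)
    k, n = 3, 10
    centers = grid_centers(k, n)
    worst = 0.0
    for _ in range(2000):  # random probing only lower-bounds the true sup
        p = rng.dirichlet(np.ones(k))
        worst = max(worst, min(kl(p, q) for q in centers))
    print(f"{len(centers)} centers, empirical worst-case divergence ~ {worst:.4f}")

Trading off the grid resolution n against the number of centers is exactly the covering-number question; how this tradeoff scales with the alphabet size is the kind of question the thesis addresses.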