Comms-IT Seminar: Modern Estimation of Information Theoretic Functionals: Theory, Algorithms, and Applications

Wednesday, November 15, 2017 - 1:30pm

Speaker Name

Jiantao Jiao

Affiliation

Stanford University

Building and Room number

36-428

Abstract

Modern inferential tasks, ranging from graphical model learning to image registration to inference of gene regulatory networks, frequently involve estimation of information-theoretic functionals such as entropy, mutual information, Kullback–Leibler divergence, and total variation distance. This talk will focus on recent progress in our understanding of the performance, structure, and deployment of near-optimal estimators for such functionals. We present general methods for constructing near-optimal estimators in three observation models: (1) i.i.d. data from a discrete distribution, (2) a sample path from a finite-state Markov chain, and (3) i.i.d. data from a continuous distribution. In the first model, we discuss two distinct approaches, one aimed at estimating the functional directly and the other at reconstructing the sorted probability vector; both give rise to near-optimal functional estimators. For estimating the entropy rate of a Markov chain, we provide a non-asymptotic analysis of the sample complexity in terms of the size of the state space and the mixing time. We illustrate our findings by estimating the entropy rate of the English language from the Penn Treebank and the Google 1 Billion Word Dataset, which serve as benchmarks for perplexity in language modeling. For differential entropy estimation, we characterize the minimax behavior over Besov balls and show that a fixed-k nearest neighbor estimator adaptively achieves the minimax rates up to logarithmic factors, without knowledge of the smoothness of the density. In all three models, the near-optimal estimators are efficiently computable and exhibit a "sample size enlargement" phenomenon: they attain with n samples what prior methods would have needed n log n samples to achieve.
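For concreteness, the baseline against which the "sample size enlargement" is measured in the discrete model is the classical plug-in (maximum likelihood) estimator, optionally with the first-order Miller–Madow bias correction. In the published line of work underlying the talk, the approach that targets the functional directly improves on this baseline by replacing x log x with its best polynomial approximation on symbols with small counts; the sketch below shows only the classical baseline, with illustrative function names, not the talk's near-optimal estimator.

```python
from collections import Counter
import math

def plugin_entropy(samples):
    """Plug-in (maximum likelihood) entropy estimate in nats:
    evaluate Shannon entropy at the empirical frequencies."""
    n = len(samples)
    return -sum((c / n) * math.log(c / n) for c in Counter(samples).values())

def miller_madow_entropy(samples):
    """Plug-in estimate plus the classical Miller-Madow bias
    correction (S_hat - 1) / (2n), where S_hat is the number of
    distinct symbols actually observed in the sample."""
    n = len(samples)
    return plugin_entropy(samples) + (len(set(samples)) - 1) / (2 * n)
```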
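For the Markov model, the natural baseline is again a plug-in: estimate the transition matrix from empirical transition counts and evaluate the entropy rate formula H = sum_i pi_i sum_j P_ij log(1/P_ij) at the estimate, weighting each row's entropy by the empirical state frequencies. A minimal sketch of that baseline (illustrative names; not the talk's estimator):

```python
import numpy as np

def empirical_entropy_rate(path, num_states):
    """Plug-in estimate of a Markov chain's entropy rate in nats,
    from an integer-valued sample path with states 0..num_states-1."""
    counts = np.zeros((num_states, num_states))
    for a, b in zip(path[:-1], path[1:]):
        counts[a, b] += 1
    row_sums = counts.sum(axis=1)
    pi_hat = row_sums / row_sums.sum()  # empirical state frequencies
    rate = 0.0
    for i in range(num_states):
        if row_sums[i] == 0:
            continue  # state never left; contributes nothing observable
        p = counts[i] / row_sums[i]     # estimated transition row
        p = p[p > 0]
        rate -= pi_hat[i] * np.sum(p * np.log(p))
    return rate
```

The talk's non-asymptotic analysis concerns how the sample complexity of this estimation problem scales with the number of states and the chain's mixing time.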
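The fixed-k nearest neighbor estimator for differential entropy referenced above is, in its classical Kozachenko–Leonenko form, h_hat = psi(n) - psi(k) + log V_d + (d/n) * sum_i log eps_i, where eps_i is the distance from sample i to its k-th nearest neighbor and V_d is the volume of the unit ball in R^d. A minimal sketch of that classical form, assuming the samples are distinct so every eps_i > 0:

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def kl_entropy(x, k=3):
    """Kozachenko-Leonenko fixed-k nearest neighbor estimate of
    differential entropy (in nats) from an (n, d) sample array x."""
    n, d = x.shape
    tree = cKDTree(x)
    # Query k+1 neighbors because each point's nearest neighbor is itself;
    # keep the distance to the k-th genuine neighbor.
    eps = tree.query(x, k=k + 1)[0][:, -1]
    # log volume of the unit ball in R^d: pi^(d/2) / Gamma(d/2 + 1)
    log_vd = 0.5 * d * np.log(np.pi) - gammaln(0.5 * d + 1)
    return digamma(n) - digamma(k) + log_vd + d * np.mean(np.log(eps))
```

As a sanity check, on a few thousand draws from a one-dimensional standard normal the estimate should land near 0.5 * log(2 * pi * e), roughly 1.42 nats.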

Based on collaborations with Weihao Gao, Yanjun Han, Chuan-Zheng Lee, Kartik Venkat, Pramod Viswanath, Tsachy Weissman, Yihong Wu, and Tiancheng Yu.

Biography

Jiantao Jiao is a Ph.D. student in the Department of Electrical Engineering at Stanford University. He received the B.Eng. degree in Electronic Engineering from Tsinghua University, Beijing, China, in 2012, and the M.Eng. degree in Electrical Engineering from Stanford University in 2014. He is a recipient of the Presidential Award of Tsinghua University and the Stanford Graduate Fellowship. He was a semi-plenary speaker at ISIT 2015 and a co-recipient of the ISITA 2016 Student Paper Award. He co-designed and co-taught the graduate course EE378A (Statistical Signal Processing) at Stanford University in 2016 and 2017 with his advisor, Tsachy Weissman. His research interests are in statistical machine learning and data science, and their applications in medical imaging, genomics, and natural language processing. He is a co-founder of Qingfan (www.qingfan.com), an online platform that democratizes technical training and job opportunities for anyone with access to the internet.