Minimum Rates of Approximate Sufficient Statistics

Friday, October 12, 2018 - 2:00pm

Event Calendar Category

Other LIDS Events

Speaker Name

Vincent Tan


National University of Singapore

Given a sufficient statistic for a parametric family of distributions, one can estimate the parameter without access to the data. However, the memory or code size for storing the sufficient statistic may still be prohibitive. Indeed, for n independent samples drawn from a k-nomial distribution with d = k−1 degrees of freedom, the length of the code scales as d log n + O(1). In many applications, we may not need to reconstruct the generating distribution exactly. By adopting a Shannon-theoretic approach in which we allow a small error in estimating the generating distribution, we construct various approximate sufficient statistics and show that the code length can be reduced to (d/2) log n + O(1). We consider errors measured according to the relative entropy and variational distance criteria. For the code constructions, we leverage Rissanen's minimum description length principle, which yields a non-vanishing error measured according to the relative entropy. For the converse parts, we use Clarke and Barron's formula for the relative entropy between a parametrized distribution and the corresponding mixture distribution. However, this method yields only a weak converse for the variational distance. We develop new techniques to achieve vanishing errors, and we also prove strong converses. The latter means that even if the code is allowed to have a non-vanishing error, its length must still be at least (d/2) log n.
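The gap between the two rates can be made concrete with a back-of-the-envelope counting argument (this is an illustrative sketch, not the construction from the talk): indexing every possible empirical distribution (type) of n samples costs about d log n bits, whereas quantizing each of the d free probabilities to a grid of spacing ~1/√n, the natural statistical accuracy of the estimate, costs about (d/2) log n bits. The function names and the choice n = 10^6, k = 3 below are hypothetical.

```python
import math

def exact_type_bits(n, k):
    """Bits to index every empirical distribution (type) of n samples
    from a k-ary alphabet. The number of types is C(n + k - 1, k - 1),
    which grows like n^(k-1), so this scales as d * log n with d = k - 1."""
    num_types = math.comb(n + k - 1, k - 1)
    return math.log2(num_types)

def quantized_bits(n, k):
    """Bits to index a grid with ~sqrt(n) levels per free coordinate.
    Quantizing each of the d = k - 1 free probabilities at resolution
    1/sqrt(n) gives a code length of (d / 2) * log n bits."""
    d = k - 1
    return d * 0.5 * math.log2(n)

n, k = 10**6, 3  # hypothetical sample size and alphabet size
d = k - 1
print(f"exact type:   {exact_type_bits(n, k):.1f} bits (~ d log n = {d * math.log2(n):.1f})")
print(f"sqrt(n) grid: {quantized_bits(n, k):.1f} bits (= (d/2) log n)")
```

For n = 10^6 and a trinomial (d = 2), the exact type needs roughly 39 bits while the √n-grid needs roughly 20, matching the factor-of-two savings in the rate that the talk establishes rigorously.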
This is joint work with Prof. Masahito Hayashi (Nagoya University) and was published in the February 2018 issue of the IEEE Transactions on Information Theory.


Vincent Y. F. Tan was born in Singapore in 1981. He is currently an Associate Professor in the Department of Electrical and Computer Engineering (ECE) and the Department of Mathematics at the National University of Singapore (NUS). He received the B.A. and M.Eng. degrees in Electrical and Information Sciences from Cambridge University in 2005. He received the Ph.D. degree in Electrical Engineering and Computer Science (EECS) from the Massachusetts Institute of Technology in 2011. His research interests include information theory, machine learning and statistical signal processing. He is currently a Distinguished Lecturer of the IEEE Information Theory Society (2018/19).