Tuesday, November 24, 2015 - 4:00pm to Wednesday, November 25, 2015 - 3:55pm
LIDS Seminar Series
In this talk I will describe a new method for bounding mutual information. It consists of two steps. First, we show that under regularity conditions, two high-dimensional random variables that are close in expected distance necessarily have similar Shannon entropy; in other words, entropy is Lipschitz-continuous in the Wasserstein distance. Second, given a complicated random variable, we couple it to a simpler one with independent coordinates. The existence of good couplings follows from classical transportation-cost inequalities due to Marton and Talagrand. Surprisingly, putting these two steps together allows us to improve the state-of-the-art outer bounds for the capacity region of interference channels. In particular, we resolve Costa's "missing corner-point" conjecture (1985).
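The two steps of the abstract can be rendered schematically as follows. This is a sketch with constants and regularity conditions omitted, not the exact statements from the talk; the constant $C_n$ and the choice of the Gaussian reference measure are illustrative assumptions.

```latex
% Step 1 (schematic): entropy is Lipschitz in Wasserstein distance.
% For random vectors X, Y on R^n satisfying suitable regularity conditions,
%   |h(X) - h(Y)| \le C_n \, W_2(P_X, P_Y),
% where h denotes differential entropy, W_2 the quadratic Wasserstein
% distance, and C_n a dimension-dependent constant.

% Step 2 (Talagrand's transportation-cost inequality, stated for the
% standard Gaussian measure \gamma on R^n):
%   W_2^2(P, \gamma) \le 2 \, D(P \,\|\, \gamma).
% Thus a distribution close to \gamma in relative entropy admits a good
% coupling to \gamma, i.e. is close in W_2 as well.
```

Chaining the two displays, a bound on relative entropy yields a bound on Wasserstein distance, which in turn controls the difference of Shannon entropies, which is the mechanism behind the mutual-information bounds described above.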
Joint work with Yihong Wu (UIUC).
Yury Polyanskiy is an Associate Professor of Electrical Engineering and Computer Science and a member of LIDS at MIT. Yury received an M.S. degree in applied mathematics and physics from the Moscow Institute of Physics and Technology in 2005 and a Ph.D. degree in electrical engineering from Princeton University in 2010. From 2000 to 2005 he led the development of embedded software in the Department of Oilfield Surface Equipment, Borets Company. Yury won the 2013 NSF CAREER Award and the 2011 IEEE Information Theory Society Paper Award.