Monday, October 22, 2018 - 4:00pm to Tuesday, October 23, 2018 - 4:55pm
Event Calendar Category
LIDS Seminar Series
Modern data sets are often distributed across multiple machines and processors, and bandwidth and energy limitations in networks and multiprocessor systems can impose significant bottlenecks on algorithm performance. Motivated by this trend, we consider the problem of estimating high-dimensional distributions and parameters in a distributed network, where each node observes an independent sample from the underlying distribution and can communicate it to a central processor by writing at most k bits on a public blackboard. We obtain matching upper and lower bounds on the minimax risk of estimating the underlying distribution or parameter under several common statistical models. Our results show that the impact of the communication constraint can be qualitatively different depending on the tail behavior of the score function associated with each model. The key ingredient in our proofs is a geometric characterization of Fisher information from quantized samples.
Joint work with Leighton Barnes, Yanjun Han, and Tsachy Weissman.
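To make the communication model concrete, here is a minimal sketch of the extreme case k = 1 for Gaussian mean estimation: each node writes a single sign bit to the blackboard, and the central processor inverts the induced Bernoulli probability. The sign-quantization scheme and all names below are illustrative assumptions for this toy setting, not the estimators analyzed in the talk.

```python
# Toy model: node i observes X_i ~ N(theta, 1) and may write k = 1 bit.
# With a sign bit, P(bit = 1) = P(X >= 0) = Phi(theta), so the central
# processor can recover theta by inverting the standard normal CDF.
import random
from statistics import NormalDist

def node_message(x: float) -> int:
    """Each node writes one bit: whether its sample is non-negative."""
    return 1 if x >= 0 else 0

def central_estimate(bits: list[int]) -> float:
    """Invert P(X >= 0) = Phi(theta) for X ~ N(theta, 1)."""
    p = sum(bits) / len(bits)
    p = min(max(p, 1e-6), 1 - 1e-6)  # keep inv_cdf away from 0 and 1
    return NormalDist().inv_cdf(p)

random.seed(0)
theta = 0.4                                           # unknown parameter
samples = [random.gauss(theta, 1.0) for _ in range(100_000)]
bits = [node_message(x) for x in samples]             # 1 bit per node
theta_hat = central_estimate(bits)
```

Even this crude one-bit scheme is consistent, but it inflates the variance relative to the unquantized sample mean; quantifying that gap as a function of k and of the score function's tails is the subject of the talk.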
Ayfer Ozgur received her Ph.D. in 2009 from the Information Processing Group at EPFL, Switzerland, where she was a post-doctoral scholar in 2010 and 2011. She has been an Assistant Professor in the Electrical Engineering Department at Stanford University since 2012. Her research interests include distributed communication and learning, wireless systems, and information theory. Dr. Ozgur received the EPFL Best Ph.D. Thesis Award in 2010, an NSF CAREER Award in 2013, and the Okawa Foundation Research Grant and the IEEE Communication Theory Technical Committee (CTTC) Early Achievement Award in 2018.