Tuesday, November 22, 2016 - 4:00pm to Wednesday, November 23, 2016 - 3:55pm
Event Calendar Category: LIDS Seminar Series
The complexity of network optimization depends on the network topology, the nature of the objective function, and what information (local or global) is available to the decision makers. In this talk, we introduce a notion of network locality and explore some of its properties and applications to algorithms. In particular, we investigate message-passing, a simple, iterative, distributed, general-purpose paradigm for problems with a graph structure. We establish a general framework to analyze the convergence of message-passing for convex problems with constraints. As an application, we analyze the behavior of message-passing when used to solve systems of linear equations involving the Laplacian matrices of graphs, and to compute electric flows. These are two fundamental primitives that arise in several domains, such as computer science, electrical engineering, operations research, and machine learning. We show that the behavior of message-passing in these instances is connected to certain diffusion properties of random walks on graphs.
Joint work with Sekhar Tatikonda.
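To give a flavor of the connection between local iterative updates, Laplacian systems, and random walks, here is a minimal sketch (not the speaker's algorithm) of a Jacobi-style iteration for a grounded graph Laplacian system. Each node repeatedly averages its neighbors' values with random-walk weights and adds its local current injection; the graph, weights, and function names below are illustrative assumptions.

```python
# A hypothetical sketch of a Jacobi-style local iteration for solving
# L x = b, where L is the graph Laplacian, with one "grounded" node
# pinned at potential 0. Each update is a one-step random-walk average
# of the neighbors plus the local injection, which is what ties this
# family of methods to diffusion properties of random walks.

def jacobi_laplacian(neighbors, b, grounded, iters=500):
    """Approximately solve L x = b with x[grounded] fixed to 0.

    neighbors: dict mapping node -> list of (neighbor, weight) pairs
    b: dict mapping node -> current injection (must sum to zero)
    """
    x = {v: 0.0 for v in neighbors}
    for _ in range(iters):
        new_x = {}
        for v, nbrs in neighbors.items():
            if v == grounded:
                new_x[v] = 0.0  # grounded node: potential fixed at 0
                continue
            deg = sum(w for _, w in nbrs)
            # Local update: x_v <- (b_v + sum_u w_uv * x_u) / deg(v),
            # i.e. a weighted neighbor average plus the injection at v.
            new_x[v] = (b[v] + sum(w * x[u] for u, w in nbrs)) / deg
        x = new_x
    return x

# Toy electric-flow instance: a unit-weight path 0-1-2-3; inject one
# unit of current at node 3 and extract it at the grounded node 0.
nbrs = {0: [(1, 1.0)], 1: [(0, 1.0), (2, 1.0)],
        2: [(1, 1.0), (3, 1.0)], 3: [(2, 1.0)]}
b = {0: -1.0, 1: 0.0, 2: 0.0, 3: 1.0}
x = jacobi_laplacian(nbrs, b, grounded=0)
# Potentials drop by one unit per unit resistor along the path:
# x is approximately {0: 0, 1: 1, 2: 2, 3: 3}
```

Grounding one node makes the reduced system irreducibly diagonally dominant, for which this Jacobi-style iteration converges; without grounding, the Laplacian is singular and the iteration only determines the solution up to an additive constant.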
Patrick Rebeschini is a Lecturer in the CS Department at Yale University, where he works on developing methodologies and theoretical foundations for machine learning. He held a two-year Postdoctoral Associate position at the Yale Institute for Network Science, hosted by Sekhar Tatikonda, and holds a Ph.D. in Operations Research and Financial Engineering from Princeton University, where he worked in applied probability under the supervision of Ramon van Handel.