Tuesday, May 12, 2020 - 3:00pm to Wednesday, May 13, 2020 - 3:55pm
Event Calendar Category
LIDS Seminar Series
Speaker Name
Alexander Olshevsky
Affiliation
Boston University
Zoom meeting id
214 770 417
Join Zoom meeting
https://mit.zoom.us/j/214770417
Distributed optimization has attracted a lot of attention recently as machine learning methods are increasingly trained in parallel over clusters of nodes. Unfortunately, many distributed optimization algorithms suffer from scalability problems, with worst-case performance degrading as the size of the network increases.
We present an overview of several recent works that manage to overcome this barrier. We will describe distributed first-order methods for convex optimization with associated performance bounds that do not depend on the network at all after a transient period. The final result is that a network of n nodes converges to a global minimizer of the objective n times faster than a single node.
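To make the setting concrete, below is a minimal sketch of distributed gradient descent with consensus averaging, the basic template underlying distributed first-order methods. It is not the speaker's exact algorithm; the quadratic local objectives, ring network, and step size are illustrative assumptions. Each node repeatedly mixes its iterate with its neighbors' and takes a step along its own local gradient.

```python
# A minimal sketch of distributed gradient descent with consensus
# averaging -- the generic template behind distributed first-order
# methods, not the specific algorithm from the talk. The quadratic
# objectives and the ring network below are illustrative assumptions.
import numpy as np

n, d, T = 8, 5, 3000              # nodes, dimension, iterations
rng = np.random.default_rng(0)

# Node i privately holds f_i(x) = 0.5 * ||A_i x - b_i||^2;
# the network jointly minimizes f(x) = sum_i f_i(x).
A = rng.normal(size=(n, 10, d))
b = rng.normal(size=(n, 10))

def grad(i, x):
    """Gradient of node i's local objective at x."""
    return A[i].T @ (A[i] @ x - b[i])

# Doubly stochastic mixing matrix for a ring network (Metropolis
# weights): each node averages equally with itself and its two neighbors.
W = np.zeros((n, n))
for i in range(n):
    W[i, (i - 1) % n] = W[i, (i + 1) % n] = W[i, i] = 1.0 / 3.0

x = np.zeros((n, d))              # row i is node i's current iterate
alpha = 0.005                     # constant step size (assumed)
for _ in range(T):
    g = np.array([grad(i, x[i]) for i in range(n)])
    x = W @ x - alpha * g         # mix with neighbors, then step locally

# Centralized reference solution of the same least-squares problem.
x_star, *_ = np.linalg.lstsq(A.reshape(-1, d), b.reshape(-1), rcond=None)
print("max deviation from global minimizer:", np.abs(x - x_star).max())
```

With a constant step size, this basic scheme only converges to a neighborhood of the minimizer whose radius shrinks with the step size, and its rate deteriorates with the network's connectivity; the methods surveyed in the talk are designed so that, after a transient period, per-node performance no longer depends on the network.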
Alex Olshevsky received the B.S. degree in applied mathematics and the B.S. degree in electrical engineering from the Georgia Institute of Technology, Atlanta, GA, USA, both in 2004, and the M.S. and Ph.D. degrees in electrical engineering and computer science from the Massachusetts Institute of Technology, Cambridge, MA, USA, in 2006 and 2010, respectively. He was a postdoctoral scholar at Princeton University from 2010 to 2012, and an Assistant Professor at the University of Illinois at Urbana-Champaign from 2012 to 2016. He is currently an Associate Professor with the ECE department at Boston University.
Dr. Olshevsky is a recipient of the NSF CAREER Award, the Air Force Young Investigator Award, the INFORMS Computing Society Prize for the best paper on the interface of operations research and computer science, a SIAM Award for the annual paper from the SIAM Journal on Control and Optimization chosen to be reprinted in SIAM Review, and an IMIA award for the best paper on clinical medical informatics in 2019.