Monday, August 1, 2022 - 10:00am to 11:30am
LIDS Thesis Defense
EECS & LIDS
Semidefinite relaxation is a powerful tool for approximating otherwise intractable non-convex problems, but it tends to run into scalability issues. The goal of this thesis is to explore the power of semidefinite relaxations and to address these scalability issues for special classes of problems with polynomial structure.
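The basic lifting behind semidefinite relaxation can be sketched in a few lines (my illustration, not code from the thesis): the non-convex problem of maximizing x^T A x over unit vectors x becomes linear in the lifted variable X = xx^T, and relaxing to the convex set of PSD matrices with unit trace yields an upper bound, here simply the top eigenvalue of A.

```python
import numpy as np

# Minimal sketch of semidefinite lifting (illustrative, not from the thesis):
# max x^T A x over unit x  ==  max <A, X> over X = x x^T (rank-1, PSD, trace 1).
# Dropping the rank constraint leaves the convex set {X PSD, trace(X) = 1},
# whose optimum is the largest eigenvalue of A; for this problem the
# relaxation happens to be tight.
rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = (B + B.T) / 2                              # symmetric data matrix

relaxation_value = np.linalg.eigvalsh(A)[-1]   # optimum over the lifted set

# Any feasible point of the original problem lower-bounds the relaxation.
x = rng.standard_normal(4)
x /= np.linalg.norm(x)
feasible_value = x @ A @ x
```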
First, we consider semidefinite relaxations of functions of quadratic maps, with applications to approximating permanents of PSD matrices, products of quadratic forms, and a generalization of Max Cut. The optimization problems and their convex relaxations share a product structure that is crucial to the analysis of approximation quality. We show that these problems are all connected by a unified analysis that recovers tight approximation factors. In turn, this leads to better approximation bounds for the permanent of PSD matrices, intermediate relaxations trading off accuracy against computational cost, and constant-factor approximation bounds for other convex objectives.
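For the Max Cut case specifically, a feasible point of the relaxation can be rounded back to a cut by a random hyperplane, in the spirit of Goemans and Williamson. The sketch below (my illustration; the unit-vector embedding for the 5-cycle is hand-picked rather than obtained from an actual SDP solver) shows that the rounded cut never exceeds the relaxation's bound.

```python
import numpy as np

# Random-hyperplane rounding for the Max Cut SDP relaxation (illustrative
# sketch).  The rows of V are unit vectors, a feasible point of the
# relaxation with diag(V V^T) = 1; here they form a regular pentagon, the
# known optimal embedding for the 5-cycle, chosen by hand instead of solving
# the SDP.
rng = np.random.default_rng(1)
n = 5
edges = [(i, (i + 1) % n) for i in range(n)]
angles = np.array([4 * np.pi * i / 5 for i in range(n)])
V = np.column_stack([np.cos(angles), np.sin(angles)])   # rows are unit vectors

# SDP objective: sum over edges of (1 - <v_i, v_j>) / 2.
sdp_value = sum((1 - V[i] @ V[j]) / 2 for i, j in edges)

# Round: a random hyperplane through the origin splits the vertices in two.
best_cut = 0
for _ in range(200):
    g = rng.standard_normal(2)
    signs = np.sign(V @ g)
    cut = sum(int(signs[i] != signs[j]) for i, j in edges)
    best_cut = max(best_cut, cut)
```

The rounded cut value is an integer lower bound on the true Max Cut (which is 4 for the 5-cycle), while the SDP value of about 4.52 is an upper bound.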
Second, we study the global landscape of low-rank sum-of-squares problems, using a non-convex Burer-Monteiro formulation that decreases the computational cost at the risk of getting stuck in local minima. We show that in the univariate case, where the SDP solution is guaranteed to have rank 2, this formulation has no spurious local minima. This contrasts with previous work showing that for general SDPs, in addition to genericity conditions, the rank must be roughly the square root of the degree of the polynomial for spurious local minima to be absent. Together with a near-linear-time computation of the gradient, this enables very fast first-order methods that scale to polynomials with millions of variables.
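The Burer-Monteiro idea can be sketched on a generic SDP with unit-diagonal constraint (a minimal illustration under my own choice of instance, not the thesis's sum-of-squares formulation): replace the PSD variable X by a rank-2 factorization X = RR^T, so the constraint diag(X) = 1 just says the rows of R are unit vectors, and run projected gradient ascent where the projection is a row normalization.

```python
import numpy as np

# Minimal Burer-Monteiro sketch (illustrative):
#   max <A, X>  s.t.  diag(X) = 1, X PSD
# is replaced by the non-convex problem over R of shape (n, 2) with X = R R^T.
# The feasible set becomes {rows of R have unit norm}, so projected gradient
# ascent alternates a gradient step with a row normalization.
rng = np.random.default_rng(2)
n = 8
B = rng.standard_normal((n, n))
A = (B + B.T) / 2

R = rng.standard_normal((n, 2))
R /= np.linalg.norm(R, axis=1, keepdims=True)

step = 0.05
for _ in range(500):
    R += step * 2 * (A @ R)                         # grad of <A, R R^T> is 2 A R
    R /= np.linalg.norm(R, axis=1, keepdims=True)   # project rows to unit norm

bm_value = np.trace(A @ R @ R.T)
```

The factorized iterate stays feasible for the original SDP by construction, so its objective value never exceeds the convex relaxation's optimum; the landscape results in the thesis address when such a low-rank formulation avoids spurious local minima.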
Pablo Parrilo, EECS (advisor)
Suvrit Sra, EECS
Ankur Moitra, Mathematics