Thursday, September 28, 2017 - 1:30pm
LIDS Thesis Defense
We study efficient average-case (approximation) algorithms for combinatorial optimization problems, and explore the algorithmic obstacles for a variety of discrete optimization problems arising in the theory of random graphs, statistics, and machine learning. In particular, we consider average-case optimization for three NP-hard combinatorial optimization problems: Large Sub-matrix Selection in a Gaussian random matrix, Maximum Cut (Max-Cut) of sparse random graphs, and Matrix Completion with entries sampled according to sparse random regular graphs.
We propose various algorithms for these problems, some of which improve on the current state of the art. For Large Sub-matrix Selection, we show the existence of a very simple algorithm which, with high probability, produces a k-by-k sub-matrix of average value (4/3)·sqrt(2 log n / k) in an n-by-n random matrix with i.i.d. standard Gaussian entries, a factor of 4/3 larger than the value achieved by the best previously known algorithms for this problem. For the Low-rank Matrix Completion problem, we propose a new matrix completion algorithm using a novel sampling scheme based on a union of independent sparse random regular bipartite graphs, which reconstructs an epsilon-approximation of the underlying n-by-n matrix using O(n log^2(1/epsilon)) samples in linear time.
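To give a flavor of what a "very simple algorithm" for Large Sub-matrix Selection can look like, here is a sketch of an alternating greedy search: fix k columns, pick the k rows with the largest sums over those columns, then re-pick the columns, and repeat. This is an illustrative procedure of this general flavor, not necessarily the exact algorithm from the thesis, and the function name and parameters are my own.

```python
import numpy as np

def alternating_submatrix(A, k, iters=50, seed=0):
    """Alternating greedy search for a k-by-k submatrix with large average value.

    Illustrative sketch only (not the thesis algorithm): alternately choose
    the k rows with the largest sums over the current columns, then the k
    columns with the largest sums over the current rows, until the column
    set stabilizes.
    """
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    cols = rng.choice(n, size=k, replace=False)  # random initial column set
    for _ in range(iters):
        rows = np.argsort(A[:, cols].sum(axis=1))[-k:]      # best k rows
        new_cols = np.argsort(A[rows, :].sum(axis=0))[-k:]  # best k columns
        if set(new_cols) == set(cols):
            break  # converged to a locally optimal row/column pair
        cols = new_cols
    return rows, cols, A[np.ix_(rows, cols)].mean()

# On an n-by-n standard Gaussian matrix, this finds a submatrix whose
# average value is well above that of a random k-by-k block.
n, k = 500, 5
A = np.random.default_rng(1).standard_normal((n, n))
rows, cols, avg = alternating_submatrix(A, k)
```

Each alternating step can only increase the submatrix sum, so the procedure converges to a local optimum; the interesting question, addressed in the defense, is how the value of such local optima compares to the global one.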
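The sampling scheme for matrix completion can be illustrated by one natural way to build a sparse random regular bipartite support: take the union of d independent random perfect matchings between rows and columns, so that every row and column is observed at most d times. This is a hypothetical sketch of a construction of this type, with names of my own choosing, not necessarily the exact scheme from the thesis.

```python
import numpy as np

def regular_bipartite_mask(n, d, seed=0):
    """Build an n-by-n boolean sampling mask whose support is a union of
    d independent random perfect matchings between rows and columns.

    Illustrative sketch: each matching is a uniformly random permutation,
    contributing one sampled entry per row and per column, so every row
    and column has degree at most d (matchings may overlap). The total
    number of samples is at most n*d, i.e. linear in n for fixed d.
    """
    rng = np.random.default_rng(seed)
    mask = np.zeros((n, n), dtype=bool)
    for _ in range(d):
        perm = rng.permutation(n)          # one random perfect matching
        mask[np.arange(n), perm] = True    # observe entry (i, perm[i])
    return mask

mask = regular_bipartite_mask(1000, 3)
```

The point of such a mask is that the observation pattern is a sparse, near-regular bipartite graph, which is the kind of structure the proposed completion algorithm exploits.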
I will also use the problem of Large Sub-matrix Selection as an example to illustrate one essential phase transition: the Overlap Gap Property (OGP), which appears in many combinatorial optimization problems over random instances, such as maximum clique in random graphs, maximum independent set in random graphs, and random k-SAT. I will discuss its connection with average-case hardness.