Proximal-Primal-Dual Algorithms for Constrained Nonconvex Optimization Problems

Wednesday, October 20, 2021 - 4:00pm to 4:30pm

Event Calendar Category

LIDS & Stats Tea

Speaker Name

Jiawei Zhang



Building and Room Number

LIDS Lounge


In this talk, we introduce our recent work on proximal-primal-dual algorithms for constrained nonconvex optimization. The augmented Lagrangian method (ALM) and the alternating direction method of multipliers (ADMM) are popular for solving constrained optimization problems. They exhibit excellent numerical behavior and have strong convergence guarantees for convex problems. However, their convergence for nonconvex problems is not well understood, and they can oscillate even on simple nonconvex quadratic programs. To address this issue, we propose a smoothed prox-ALM and a smoothed prox-ADMM for solving nonconvex problems. Using a novel prox-primal-dual framework and new error bounds, we prove that for nonconvex problems with polyhedral constraints our algorithms achieve an $\mathcal{O}(1/\epsilon^2)$ iteration complexity, matching the lower iteration complexity bound for first-order algorithms in nonconvex optimization. Our algorithms can also be extended to nonconvex distributed optimization problems and nonconvex min-max problems.
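To fix ideas, here is a minimal sketch of the classical ALM iteration that the talk builds on (not the smoothed prox-ALM itself): for a toy equality-constrained convex quadratic, each iteration exactly minimizes the augmented Lagrangian in the primal variable, then takes a dual ascent step on the multipliers. The problem data below are illustrative assumptions.

```python
# Classical ALM sketch for the toy problem
#   min_x 0.5 * ||x - c||^2   subject to   A x = b
# Each iteration: (1) minimize the augmented Lagrangian in x,
# (2) dual ascent step lam <- lam + rho * (A x - b).
import numpy as np

def alm(A, b, c, rho=10.0, iters=100):
    m, n = A.shape
    lam = np.zeros(m)                   # dual multipliers
    H = np.eye(n) + rho * A.T @ A       # Hessian of the augmented Lagrangian in x
    for _ in range(iters):
        # Primal step: exact minimizer of the augmented Lagrangian
        x = np.linalg.solve(H, c - A.T @ lam + rho * A.T @ b)
        # Dual step: gradient ascent on the constraint residual
        lam = lam + rho * (A @ x - b)
    return x, lam

# Illustrative data: project c = (3, 0) onto the line x1 + x2 = 1
A = np.array([[1.0, 1.0]])
b = np.array([1.0])
c = np.array([3.0, 0.0])
x, lam = alm(A, b, c)
print(x)  # converges to the feasible point closest to c: approximately [2., -1.]
```

For this convex quadratic, ALM is the proximal point method on the dual and converges linearly; the nonconvex case, where such guarantees break down and smoothing is needed, is exactly the subject of the talk.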


Jiawei Zhang is a postdoctoral associate at LIDS. He received his Ph.D. from the Chinese University of Hong Kong (Shenzhen), advised by Professor Zhi-Quan (Tom) Luo. His research interests include optimization theory and algorithms for nonconvex problems.