Consensus-Based Optimization

Wednesday, May 5, 2021 - 11:00am to 12:00pm

Event Calendar Category

LIDS Seminar Series

Speaker Name

Massimo Fornasier

Affiliation

Technical University of Munich

Join Zoom meeting

https://mit.zoom.us/j/95607814981

Consensus-based optimization (CBO) is a multi-agent, metaheuristic, derivative-free optimization method that can globally minimize nonconvex, nonsmooth functions and is amenable to theoretical analysis. The optimizing agents (particles) move over the optimization domain driven by a drift towards an instantaneous consensus point, which is computed as a convex combination of particle locations weighted by the cost function according to Laplace’s principle; this consensus point serves as an approximation to a global minimizer. The dynamics is further perturbed by a random vector field that favors exploration and whose variance is a function of the distance of the particles to the consensus point. In particular, as soon as consensus is reached, the stochastic component vanishes. Based on the experimentally supported intuition that CBO always performs a gradient descent of the squared Euclidean distance to the global minimizer, we present a novel technique for proving convergence to the global minimizer in mean-field law for a rich class of objective functions. The result reveals the internal mechanisms of CBO that are responsible for the success of the method. In particular, we prove that CBO performs a convexification of a very large class of optimization problems as the number of optimizing agents goes to infinity. We further present formulations of CBO over compact hypersurfaces and a proof of convergence to global minimizers for nonconvex, nonsmooth optimization problems on the hypersphere. We conclude the talk with several numerical experiments showing that CBO scales well with the dimension and is extremely versatile. To quantify the performance of this novel approach, we show that CBO performs essentially as well as ad hoc, state-of-the-art methods that use higher-order information on challenging problems in signal processing and machine learning, namely phase retrieval and robust subspace detection.
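As a rough illustration of the particle dynamics described above, the sketch below discretizes an isotropic CBO update: each particle drifts toward a Gibbs-weighted consensus point (an approximation of the minimizer via Laplace's principle) and receives noise proportional to its distance from that point. The parameter values, the step size, and the Rastrigin-type test objective are illustrative assumptions, not choices taken from the talk.

```python
import numpy as np

def cbo_minimize(f, dim, n_particles=100, n_steps=2000,
                 lam=1.0, sigma=0.7, alpha=50.0, dt=0.01, seed=0):
    """Minimal sketch of (isotropic) consensus-based optimization.

    Particles drift toward a consensus point -- a convex combination of
    particle positions with Gibbs-type weights exp(-alpha * f(x)), which
    approximates the global minimizer by Laplace's principle -- and are
    perturbed by noise whose strength scales with the distance to that
    point, so the stochastic component vanishes once consensus is reached.
    """
    rng = np.random.default_rng(seed)
    x = rng.uniform(-3.0, 3.0, size=(n_particles, dim))  # initial particle cloud

    for _ in range(n_steps):
        values = np.apply_along_axis(f, 1, x)
        # Gibbs weights, shifted by the minimum value for numerical stability
        w = np.exp(-alpha * (values - values.min()))
        consensus = (w[:, None] * x).sum(axis=0) / w.sum()

        diff = x - consensus
        dist = np.linalg.norm(diff, axis=1, keepdims=True)
        noise = rng.standard_normal(size=x.shape)

        # drift toward the consensus point + exploration noise
        x = x - lam * dt * diff + sigma * np.sqrt(dt) * dist * noise

    return consensus

if __name__ == "__main__":
    # Nonconvex Rastrigin-type test function with global minimizer at the origin
    rastrigin = lambda v: np.sum(v**2 + 2.5 * (1.0 - np.cos(2.0 * np.pi * v)))
    print(cbo_minimize(rastrigin, dim=10))
```

Note that the method only evaluates the objective, never its gradient; the anisotropic variants and the constrained (hypersphere) formulations mentioned in the abstract modify the noise term and project the update onto the constraint set, respectively.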

The research of Massimo Fornasier embraces a broad spectrum of problems in mathematical modeling, analysis, and numerical analysis. Fornasier is particularly interested in the concept of compression as it appears in different forms in data analysis, image and signal processing, and the adaptive numerical solution of partial differential equations and high-dimensional optimization problems. Fornasier received his doctoral degree in computational mathematics in 2003 from the University of Padua, Italy. From 2003 to 2006 he was a postdoctoral research fellow at the University of Vienna and the University of Rome La Sapienza, after which he joined the Johann Radon Institute for Computational and Applied Mathematics (RICAM) of the Austrian Academy of Sciences, where he served as a senior research scientist until March 2011. From 2006 to 2007 he was also an associate researcher with the Program in Applied and Computational Mathematics at Princeton University, USA. In 2011 Fornasier was appointed Chair of Applied Numerical Analysis at TUM. He is a member of VQR, a panel responsible for evaluating the quality of research in Italy. He is also a member of the editorial boards of Networks and Heterogeneous Media, Journal of Fourier Analysis and Applications, and Calcolo.