Prof. Russell Luke (Universität Göttingen):
ERROR BOUNDS AND QUANTIFIABLE CONVERGENCE OF PROXIMAL METHODS FOR NONSMOOTH/(NON)CONVEX OPTIMIZATION
Time and place
Thursday, 4.2.16, 17:00-18:00, Hörsaal II, Albertstr. 23b
Abstract
For iterative methods in nonconvex optimization, a central question is when to stop. And once the decision to stop has been made, what is the relation, if any, between the point the algorithm delivers and the desired solution(s) of the optimization problem? Quantifying the convergence of algorithms is the key to providing error bounds for stopping criteria, and at the heart of quantifiable convergence rates lies the theory of regularity, not only of the underlying functions and operators, but also of the critical points of the optimization model. We survey progress over the last several years on sufficient conditions for local linear convergence of fundamental algorithms applied to nonconvex problems, and discuss challenges and prospects for further progress. The theory is local by nature and contains the convex case as an instance in which the local neighborhood extends to the whole space. We demonstrate the use of the tools we have developed in applications to image processing and matrix completion.
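To make the stopping-criterion question concrete, the following is a minimal, illustrative proximal-gradient sketch (not the speaker's specific algorithms; the function names, the fixed-point-residual test, and the LASSO-type example are assumptions chosen for illustration). Error bounds of the kind surveyed in the talk are what relate the residual used to stop the loop to the actual distance from the returned point to a solution.

```python
import numpy as np

def prox_grad(grad_f, prox_g, x0, step, tol=1e-8, max_iter=10_000):
    """Illustrative proximal-gradient loop for min_x f(x) + g(x).

    grad_f : gradient of the smooth part f
    prox_g : proximal map of the nonsmooth part g (at the given step size)
    Stops when the fixed-point residual ||x_{k+1} - x_k|| drops below `tol`;
    how this residual bounds the distance to a true solution is exactly
    what error-bound/regularity theory quantifies.
    """
    x = x0
    for k in range(max_iter):
        x_new = prox_g(x - step * grad_f(x), step)
        if np.linalg.norm(x_new - x) <= tol:
            return x_new, k
        x = x_new
    return x, max_iter

# Hypothetical LASSO-type example: g(x) = lam * ||x||_1, prox is soft-thresholding
rng = np.random.default_rng(0)
A = rng.normal(size=(20, 50))
b = rng.normal(size=20)
lam = 0.1
grad_f = lambda x: A.T @ (A @ x - b)
prox_g = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t * lam, 0.0)
step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L with L the Lipschitz constant of grad f
x_hat, iters = prox_grad(grad_f, prox_g, np.zeros(50), step)
```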