Michael Overton (Courant Institute of Mathematical Sciences, NYU)

Nonsmooth, Nonconvex Optimization: Algorithms and Examples
Tuesday 9 July 2019 at 14.00, JCMB 5323

Abstract

In many applications one wishes to minimize an objective function that is not convex and is not differentiable at its minimizers. We discuss two algorithms for minimization of general nonsmooth, nonconvex functions. Gradient Sampling is a simple method that, although computationally intensive, has a nice convergence theory. The method is robust and the convergence theory has been extended to constrained problems. BFGS is a well-known method, developed for smooth problems, but which is remarkably effective for nonsmooth problems too. Although our theoretical results in the nonsmooth case are quite limited, we have made extensive empirical observations and have had broad success with BFGS in nonsmooth applications. Limited Memory BFGS is a popular extension for large-scale problems, but we show that, in contrast to BFGS, it sometimes converges to non-optimal nonsmooth points. Throughout the talk we illustrate the ideas through examples, some very easy and some very challenging.
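The core idea of Gradient Sampling can be sketched in a few lines: at each iterate, sample gradients at randomly perturbed nearby points, take the minimum-norm element of their convex hull as a stabilized descent direction, and do a backtracking line search. The sketch below is a minimal illustration on a simple nonsmooth convex test function, not the full algorithm from the talk; the function `f`, the sampling radius, and all parameter values are assumptions chosen for the example.

```python
import numpy as np
from scipy.optimize import minimize

def f(x):
    # simple nonsmooth test function, nondifferentiable at its minimizer (the origin)
    return abs(x[0]) + 2.0 * abs(x[1])

def grad(x):
    # gradient of f where it is differentiable (almost everywhere)
    return np.array([np.sign(x[0]), 2.0 * np.sign(x[1])])

def min_norm_in_hull(G):
    """Minimum-norm element of conv{rows of G}: a small QP over the simplex."""
    m = G.shape[0]
    res = minimize(lambda w: 0.5 * np.dot(w @ G, w @ G),
                   np.ones(m) / m,
                   bounds=[(0.0, 1.0)] * m,
                   constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0}],
                   method="SLSQP")
    return res.x @ G

def gradient_sampling(x0, eps=0.1, m=6, iters=50, seed=0):
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        # sample gradients at x and at m random points in the eps-ball around x
        pts = x + eps * rng.uniform(-1.0, 1.0, size=(m, x.size))
        G = np.vstack([grad(x)] + [grad(p) for p in pts])
        g = min_norm_in_hull(G)          # stabilized descent direction
        if np.linalg.norm(g) < 1e-8:     # approximately eps-stationary:
            eps *= 0.5                   # shrink the sampling radius
            continue
        d = -g / np.linalg.norm(g)
        # backtracking Armijo line search
        t = 1.0
        while f(x + t * d) > f(x) - 1e-4 * t * np.linalg.norm(g) and t > 1e-12:
            t *= 0.5
        x = x + t * d
    return x

x = gradient_sampling([1.3, -0.9])
print(x, f(x))
```

Taking the minimum-norm element of the sampled gradients' convex hull is what lets the method make progress across a nondifferentiability: near a kink, gradients from opposite sides both enter the hull, so the computed direction does not oscillate the way a raw gradient step would.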
