### Michael L. Overton (Courant Institute, New York University)

#### Nonsmooth, nonconvex optimization

*Monday 16 July 2012 at 15.00, JCMB 5215 - Joint with NAIS*

##### Abstract

There are many algorithms for minimization when the objective function is
differentiable, convex, or has some other known structure, but few options
when none of the above hold, particularly when the objective function is
nonsmooth at minimizers, as is often the case in applications. We discuss two
algorithms for minimization of nonsmooth, nonconvex functions. Gradient
Sampling is a simple method that, although computationally intensive, has a
nice convergence theory. The method is robust and the convergence theory has
recently been extended to constrained problems. BFGS is a well-known method,
developed for smooth problems, but it is remarkably effective for nonsmooth
problems too. Although our theoretical results in the nonsmooth case are quite
limited, we have made some striking empirical observations and have had
broad success in applications. Limited Memory BFGS is a popular extension for
large problems, and it is also applicable to the nonsmooth case, although our
experience with it is more mixed.
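The gradient sampling idea mentioned above can be sketched in a few lines. The following is a simplified illustration only, not the algorithm as presented in the talk: the test function `f`, the sampling parameters, and the radius-shrinking rule are all invented for this sketch. The key step is finding the minimum-norm element of the convex hull of sampled gradients, posed here as a small quadratic program over the simplex.

```python
import numpy as np
from scipy.optimize import minimize

def f(x):
    # A toy objective that is nonsmooth at its minimizer (0, 0).
    return 2.0 * abs(x[0]) + abs(x[1])

def grad_f(x):
    # Gradient of f wherever it is differentiable (almost everywhere).
    return np.array([2.0 * np.sign(x[0]), np.sign(x[1])])

def min_norm_in_hull(G):
    # Minimum-norm element of the convex hull of the rows of G,
    # via the small QP: minimize ||G^T w||^2 over the unit simplex.
    m = G.shape[0]
    res = minimize(lambda w: np.dot(G.T @ w, G.T @ w),
                   np.ones(m) / m,
                   method="SLSQP",
                   bounds=[(0.0, 1.0)] * m,
                   constraints={"type": "eq", "fun": lambda w: w.sum() - 1.0})
    return G.T @ res.x

def gradient_sampling(x0, eps=0.5, m=10, iters=50, seed=0):
    # Hypothetical parameter choices; real implementations are more careful.
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        # Sample gradients at randomly perturbed points near x.
        pts = x + eps * rng.uniform(-1.0, 1.0, size=(m, x.size))
        G = np.vstack([grad_f(p) for p in pts] + [grad_f(x)])
        g = min_norm_in_hull(G)
        if np.linalg.norm(g) < eps:
            eps *= 0.5   # near-stationary at this sampling scale: shrink radius
            continue
        d = -g / np.linalg.norm(g)
        # Backtracking line search for sufficient decrease.
        t = 1.0
        while f(x + t * d) > f(x) - 1e-4 * t * np.linalg.norm(g) and t > 1e-12:
            t *= 0.5
        if f(x + t * d) < f(x):
            x = x + t * d
    return x

x0 = [1.3, -0.9]
x_star = gradient_sampling(x0)

# Plain BFGS applied to the same nonsmooth f also tends to make good
# progress, in line with the empirical behavior the abstract describes.
res = minimize(f, x0, method="BFGS")
```

In a serious implementation the minimum-norm subproblem would be solved by a dedicated QP solver, and the sampling radius and stationarity tolerance would be driven to zero together; the sketch above only aims to show the structure of one iteration.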
