Technical Report ERGO 09-013

On the complexity of steepest descent, Newton's and regularized Newton's methods for nonconvex unconstrained optimization
Coralia Cartis, Nicholas I.M. Gould and Philippe L. Toint

Abstract:

It is shown that the steepest-descent and Newton's methods for unconstrained nonconvex optimization under standard assumptions may both require a number of iterations and function evaluations arbitrarily close to O(ε^{-2}) to drive the norm of the gradient below ε. This shows that the upper bound of O(ε^{-2}) evaluations known for steepest descent is tight, and that Newton's method may be as slow as steepest descent in the worst case. The improved complexity bound of O(ε^{-3/2}) evaluations known for cubically regularised Newton methods is also shown to be tight.
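To make concrete what "driving the norm of the gradient below ε" and the associated evaluation counts mean, here is a minimal, hypothetical Python sketch of a steepest-descent loop with Armijo backtracking. The callables f and grad, the test function, and all parameter values are illustrative assumptions; this is not the construction or the linesearch variant analysed in the report.

```python
import numpy as np

def steepest_descent(f, grad, x0, eps=1e-6, max_iter=10000):
    """Illustrative steepest descent with Armijo backtracking.
    Stops once ||grad f(x)|| <= eps and reports how many function
    and gradient evaluations were spent getting there."""
    x = np.asarray(x0, dtype=float)
    f_evals = grad_evals = 0
    fx = f(x); f_evals += 1
    for _ in range(max_iter):
        g = grad(x); grad_evals += 1
        gnorm = np.linalg.norm(g)
        if gnorm <= eps:                      # termination test from the abstract
            break
        t = 1.0
        # Armijo backtracking: shrink the step until sufficient decrease holds
        while True:
            x_trial = x - t * g
            f_trial = f(x_trial); f_evals += 1
            if f_trial <= fx - 1e-4 * t * gnorm**2:
                break
            t *= 0.5
        x, fx = x_trial, f_trial
    return x, f_evals, grad_evals

# A simple nonconvex example (hypothetical): f(x, y) = x^4 - 2x^2 + y^2
f = lambda z: z[0]**4 - 2*z[0]**2 + z[1]**2
g = lambda z: np.array([4*z[0]**3 - 4*z[0], 2*z[1]])
x_star, nf, ng = steepest_descent(f, g, x0=[1.5, 1.0])
print(x_star, nf, ng)
```

The counters nf and ng correspond to the function- and gradient-evaluation counts whose worst-case growth, as a function of ε, is the subject of the report.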

Download:

ERGO-09-013.pdf

History:

Written: 15 October 2009

Status:

Published in SIAM Journal on Optimization.