# Efficient global optimization: refinements and extensions

### Invited seminar at the School of Mathematics of Cardiff University, 26th February 2003

### J. A. J. Hall, K. I. M. McKinnon and T. Mayer

#### Abstract

In many practical optimization problems, the number of function
evaluations is severely limited by time or cost. This practical
consideration has driven the development of efficient methods for
global optimization which require only small numbers of function
evaluations. This talk will consider, in particular, the method of
Jones *et al.*, in which the objective is modelled by a linear
predictor that interpolates the function at a set of sample points. A
corresponding standard error function models the uncertainty in the
predictor at points not yet sampled. The best new sample point is then
determined by optimizing a merit function of the predictor and
standard error. The talk will focus on efficient methods for
optimizing the merit function and on extending efficient global
optimization methods to the case where the gradient of the objective
at sample points is available, as well as discussing the assumptions
about the nature of the objective function which underpin the method.
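The sampling step described above can be sketched in code. What follows is a minimal illustrative sketch, not the authors' implementation: it uses a zero-mean Gaussian-process interpolator with a squared-exponential kernel as the linear predictor, and the familiar expected-improvement criterion as the merit function. The kernel, its length-scale, the toy objective, and the candidate grid are all assumptions made for illustration.

```python
import numpy as np
from math import erf

def gp_predict(X, y, Xq, length=0.3, jitter=1e-10):
    """Linear predictor interpolating y at sample points X, plus the
    standard error of the prediction at query points Xq."""
    def kern(A, B):
        return np.exp(-0.5 * (A[:, None] - B[None, :]) ** 2 / length**2)
    K = kern(X, X) + jitter * np.eye(len(X))   # jitter for numerical stability
    Ks = kern(X, Xq)
    alpha = np.linalg.solve(K, y)
    mean = Ks.T @ alpha                        # interpolates y at the samples
    v = np.linalg.solve(K, Ks)
    var = np.clip(1.0 - np.sum(Ks * v, axis=0), 0.0, None)
    return mean, np.sqrt(var)                  # std error vanishes at samples

def expected_improvement(mean, std, f_best):
    """Merit function of predictor and standard error (for minimization)."""
    s = np.maximum(std, 1e-12)
    z = (f_best - mean) / s
    Phi = np.array([0.5 * (1.0 + erf(t / np.sqrt(2.0))) for t in z])
    phi = np.exp(-0.5 * z**2) / np.sqrt(2.0 * np.pi)
    return (f_best - mean) * Phi + s * phi

# Toy 1-D objective (an assumption for illustration), sampled at four points;
# the next sample point is the maximizer of the merit function over a grid.
f = lambda x: np.sin(3.0 * x) + x
X = np.array([0.0, 0.3, 0.6, 1.0])
y = f(X)
Xq = np.linspace(0.0, 1.0, 201)
mean, std = gp_predict(X, y, Xq)
ei = expected_improvement(mean, std, y.min())
x_next = Xq[np.argmax(ei)]
```

In the talk's setting the merit function is itself optimized carefully rather than evaluated on a grid; the grid here only makes the sketch self-contained.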

**Slides:**

- Postscript: EGO_26.02.03.ps
- Compressed Postscript: EGO_26.02.03.ps.Z
- G-Zipped Postscript: EGO_26.02.03.ps.gz
- PDF: EGO_26.02.03.pdf

Note that the Postscript version has finer resolution.
