Istvan Deak (Corvinus University, Budapest)

Applications of successive regression approximations in stochastic programming
Wednesday 11 March 2009 at 15.30, JCMB 6206

Abstract

A new method, called Successive Regression Approximations (SRA), was proposed for solving optimization problems with noisy functions. In optimization practice one often resorts to some kind of approximation; linear and quadratic polynomials, orthogonal functions, and Taylor series are the most frequently applied ones. For noisy functions, derivatives are generally not available, so SRA relies only on function values.
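To give a feel for the general idea of regression-based, derivative-free optimization (a minimal sketch only, not Deak's actual SRA algorithm), the following Python fragment repeatedly fits a quadratic regression to all noisy function values gathered so far and moves to the minimizer of the fitted model. The noisy objective, the initial design points, and the fallback rule are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_f(x, sigma=0.1):
    """Noisy observation of f(x) = (x - 2)^2 (illustrative objective)."""
    return (x - 2.0) ** 2 + sigma * rng.standard_normal()

# initial design points and their noisy evaluations (assumed for the sketch)
xs = [0.0, 1.0, 4.0]
ys = [noisy_f(x) for x in xs]

for k in range(20):
    # least-squares fit of a quadratic a*x^2 + b*x + c to all samples so far
    X = np.column_stack([np.square(xs), xs, np.ones(len(xs))])
    a, b, c = np.linalg.lstsq(X, np.array(ys), rcond=None)[0]
    if a > 1e-8:
        x_next = -b / (2.0 * a)        # minimizer of the fitted quadratic
    else:
        # model not convex yet: probe a random point instead (ad hoc fallback)
        x_next = rng.uniform(min(xs) - 1.0, max(xs) + 1.0)
    xs.append(x_next)
    ys.append(noisy_f(x_next))

print(f"estimated minimizer ~ {xs[-1]:.3f}")   # true minimizer is 2.0
```

The point of the sketch is only that each iteration uses noisy function values alone (no derivatives) and that the regression model is successively refined as observations accumulate.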

SRA was tested on several problems, and the related computational results were published in a series of papers. These problems include solving a one-dimensional equation, probabilistic constrained and two-stage stochastic programming problems (including large-scale ones), a combined model of Prekopa, and quadratic stochastic programming problems. SRA is also suitable for solving linear programming problems with a random technology matrix.
