High performance solution of linear optimization problems
ALOP Autumn School, Trier, 31 August 2017
Julian Hall
Abstract
These two lectures and the associated lab session present theory and techniques underlying important high performance solution methods for linear optimization problems. Since performance depends principally on the underlying numerical linear algebra, it is this which is explored in greatest detail. The first lecture focuses on serial techniques for the simplex method, in particular those used to exploit hyper-sparsity. The second lecture discusses parallel techniques for the simplex method as well as alternative methods for linear optimization. The lab session will give the opportunity to experiment with high performance serial numerical linear algebra techniques, as well as to study interior point methods and applications of linear optimization in machine learning and data science.
Slides:
- Lecture 1: Serial techniques for the simplex method: ALOP1.pdf
- Lecture 2: Parallel techniques for the simplex method and alternative methods for linear optimization: ALOP2.pdf
Lab materials:
The aim of this lab is to introduce some concepts and techniques fundamental to high performance numerical linear algebra, to formulate and solve two optimization problems, and to explore an interior point method implementation in MATLAB.
The lab assumes some basic knowledge of MATLAB and how to find out more using the help facility. Some instructions on how to time calculations and produce formatted output are given here.
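As a brief illustration (not part of the official lab materials), timing in MATLAB is typically done with tic and toc, and formatted output with fprintf; the matrix dimension n below is chosen arbitrarily:

```matlab
% Time a dense matrix-vector product and report the elapsed time.
n = 2000;             % illustrative problem size
A = rand(n);          % dense n-by-n matrix with uniform random entries
x = rand(n, 1);       % random vector
tic;                  % start the timer
y = A*x;              % dense matrix-vector product
t = toc;              % elapsed time in seconds
fprintf('Matrix-vector product of dimension %d took %8.2e seconds\n', n, t);
```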
- Worksheet (pdf)
- Worksheet Comments [NLA only!] (pdf)
- Numerical linear algebra
- Functions
- Matrices
- Solution: nla.m
- Support vector machine
- svm_training.mat
- svmplot.m
- Solution: svm.m
- Chebychev centre problem
- Solution: chebychev.m
- Interior point methods
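For reference, the Chebychev centre problem listed above has a standard linear programming formulation: the largest Euclidean ball with centre $x_c$ and radius $r$ contained in the polyhedron $\{x : a_i^T x \le b_i,\ i = 1,\dots,m\}$ is found by solving

```latex
\begin{array}{ll}
\max_{x_c,\, r} & r \\[2pt]
\text{subject to} & a_i^T x_c + r\,\|a_i\|_2 \le b_i, \qquad i = 1,\dots,m, \\[2pt]
& r \ge 0,
\end{array}
```

which is linear in the variables $(x_c, r)$ since the norms $\|a_i\|_2$ are data.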
Papers:
Parallelizing the dual revised simplex method
Q. Huangfu and J. A. J. Hall
Technical Report ERGO-14-011
Accepted for publication in Mathematical Programming Computation

Novel update techniques for the revised simplex method
Q. Huangfu and J. A. J. Hall
Technical Report ERGO-13-001
Computational Optimization and Applications 60(3), 587-608, 2015. DOI: 10.1007/s10589-014-9689-1
Awarded the prize for the best paper of 2015 in Computational Optimization and Applications

Parallel distributed-memory simplex for large-scale stochastic LP problems
M. Lubin, J. A. J. Hall, C. G. Petra and M. Anitescu
Technical Report ERGO-12-005
Computational Optimization and Applications 55(3), 571-596, 2013. DOI: 10.1007/s10589-013-9542-y
Awarded the COIN-OR INFORMS 2013 Cup on 7 October 2013
Awarded the prize for the best paper of 2013 in Computational Optimization and Applications

Hyper-sparsity in the revised simplex method and how to exploit it
J. A. J. Hall and K. I. M. McKinnon
Technical Report, October 2002
Computational Optimization and Applications 32(3), 259-283, 2005. DOI: 10.1007/s10589-005-4802-0
Awarded the prize for the best paper of 2005 in Computational Optimization and Applications