Kristian Woodsend and J. Gondzio
Abstract
Support vector machine training can be represented as a large
quadratic program. We present an efficient and numerically stable
algorithm for this problem using interior point methods, which
requires only O(n) operations per iteration. By exploiting
the separability of the Hessian, we provide a unified approach,
from an optimization perspective, to 1-norm classification,
2-norm classification and epsilon-insensitive regression.
Numerical experiments indicate that, in contrast to existing
decomposition methods, the algorithm is largely unaffected
by noisy data, for both linear and non-linear kernels, and
show that our implementation outperforms all known implementations
by a large margin. We also discuss the effect of using multiple
correctors, and of monitoring the angle of the normal
to the hyperplane as a termination criterion.
Key words: Support vector machines, Interior point method, Separable quadratic program, Large scale.
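The O(n)-per-iteration claim rests on the Hessian of the reformulated QP being diagonal plus low-rank, so each interior point Newton system can be solved with the Sherman-Morrison-Woodbury identity instead of a dense factorization. The sketch below is our own illustration of that linear-algebra step, not the authors' code; the matrix sizes and variable names are assumptions for demonstration purposes.

```python
import numpy as np

# Illustrative sketch: solving a Newton system of the form
#   (D + V V^T) x = b,
# where D = diag(d) is n x n and V is n x k with k << n.
# The Sherman-Morrison-Woodbury identity reduces the cost from
# O(n^3) for a dense solve to O(n k^2), linear in n.

def smw_solve(d, V, b):
    """Solve (diag(d) + V V^T) x = b via Sherman-Morrison-Woodbury."""
    Dinv_b = b / d                              # D^{-1} b, O(n)
    Dinv_V = V / d[:, None]                     # D^{-1} V, O(n k)
    S = np.eye(V.shape[1]) + V.T @ Dinv_V       # small k x k capacitance matrix
    return Dinv_b - Dinv_V @ np.linalg.solve(S, V.T @ Dinv_b)

# Usage: result matches a dense solve without ever forming the n x n matrix.
rng = np.random.default_rng(0)
n, k = 2000, 5
d = rng.uniform(1.0, 2.0, n)        # positive diagonal, as in an IPM step
V = rng.standard_normal((n, k))
b = rng.standard_normal(n)

x = smw_solve(d, V, b)
x_dense = np.linalg.solve(np.diag(d) + V @ V.T, b)
print(np.allclose(x, x_dense))
```

For a linear kernel the low-rank factor corresponds to the (label-scaled) data matrix, which is what makes the per-iteration cost linear in the number of training points.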