K. Fountoulakis and J. Gondzio
Abstract
In this paper a robust second-order method is developed for the
solution of strongly convex L1-regularized problems. The main aim
is to make the proposed method as inexpensive as possible, while
ensuring that even difficult problems can be solved efficiently. The proposed
method is a primal-dual Newton Conjugate Gradients (pdNCG) method.
Convergence properties of pdNCG are studied and worst-case iteration
complexity is established. Numerical results are presented
on synthetic sparse least-squares problems and real-world
machine learning problems.
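For reference, the problem class addressed here is commonly written in the following form (a sketch; the symbol \tau for the regularization weight is illustrative notation, not taken from the paper):

```latex
% Strongly convex L1-regularized problem (assumed standard form):
\min_{x \in \mathbb{R}^n} \; f(x) + \tau \|x\|_1,
% where f : R^n -> R is smooth and strongly convex, and \tau > 0
% controls the strength of the sparsity-inducing L1 penalty.
```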
Key words: L1-regularization, strongly convex optimization, second-order methods, iteration complexity, Newton Conjugate Gradients method.