K. Fountoulakis and J. Gondzio
In this paper, a second-order method for the solution of large-scale strongly convex L1-regularized problems is developed. The proposed method is a Newton-CG (conjugate gradients) algorithm with backtracking line search, embedded in a doubly-continuation scheme. The worst-case iteration complexity of the proposed Newton-CG method is established, and, based on this analysis, the worst-case iteration complexity of the doubly-continuation scheme is obtained. Numerical results on large-scale problems show that the proposed second-order doubly-continuation Newton-CG algorithm competes favourably with state-of-the-art first-order methods. In addition, L1-regularized sparse least-squares problems are discussed for which a parallel block coordinate descent method stagnates.
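A generic Newton-CG iteration with backtracking (Armijo) line search, of the kind named above, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the test objective (a least-squares term plus a smoothed L1 penalty `sqrt(x^2 + mu^2) - mu`), the CG forcing term, and all parameter values are assumptions made for the example.

```python
import numpy as np

def cg_solve(Av, b, tol=1e-10, max_iter=200):
    """Conjugate gradients for A x = b, with A given as a
    matrix-vector product Av(v) (A assumed symmetric positive definite)."""
    x = np.zeros_like(b)
    r = b - Av(x)
    p = r.copy()
    rs = r.dot(r)
    for _ in range(max_iter):
        Ap = Av(p)
        a = rs / p.dot(Ap)
        x += a * p
        r -= a * Ap
        rs_new = r.dot(r)
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

def newton_cg(f, grad, hess_vec, x0, tol=1e-8, max_iter=50,
              armijo=1e-4, shrink=0.5):
    """Inexact Newton method: solve H(x) d = -g(x) approximately with CG,
    then backtrack until the Armijo sufficient-decrease condition holds."""
    x = x0.copy()
    for _ in range(max_iter):
        g = grad(x)
        gnorm = np.linalg.norm(g)
        if gnorm < tol:
            break
        # Inexact CG solve; the residual tolerance tightens as g shrinks
        # (an assumed forcing-term choice, for illustration only).
        d = cg_solve(lambda v: hess_vec(x, v), -g,
                     tol=min(0.5, np.sqrt(gnorm)) * gnorm)
        # Backtracking line search on the step length t.
        t, fx, slope = 1.0, f(x), g.dot(d)
        while f(x + t * d) > fx + armijo * t * slope:
            t *= shrink
        x = x + t * d
    return x
```

As a usage example, one can minimize a strongly convex smoothed L1-regularized least-squares objective `0.5*||A x - b||^2 + lam * sum(sqrt(x_i^2 + mu^2) - mu)`, supplying its gradient and Hessian-vector product to `newton_cg`; the smoothing parameter `mu` plays a role analogous to the continuation parameters discussed in the paper, though the actual doubly-continuation scheme is not reproduced here.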
Key words: L1-regularization, strongly convex optimization, second-order methods, iteration complexity, continuation, Newton conjugate-gradients method.