Katya Scheinberg (Lehigh University, USA)

Accelerating first-order methods for convex composite optimization problems
Tuesday 26 June 2012 at 14.30, JCMB 6206

Abstract

First-order methods with favorable convergence rates have recently become a focal point of much research in the field of convex optimization. These methods have low per-iteration complexity and hence are applicable to very large-scale models, such as the ones arising in signal processing, statistics and machine learning. We will first show how these convergence properties extend to a certain class of alternating direction methods, which have also recently become popular for large-scale convex problems. All the methods in question employ a prox term parameter, which is often assumed to be fixed. We will discuss theoretical and practical implications of various strategies for choosing the prox parameter in prox gradient methods and related alternating direction methods. We will show extensions of existing convergence rates for both accelerated and classical first-order methods. A practical comparison based on a testing environment for L1 optimization will be presented.
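To illustrate the kind of method the abstract refers to, the following is a minimal sketch (not taken from the talk) of a proximal gradient method for an L1-regularised least-squares problem, with a simple backtracking rule that adapts the prox (step-size) parameter rather than keeping it fixed. All names and parameter values (A, b, lam, eta, L0) are illustrative assumptions, not part of the speaker's material.

```python
import numpy as np

def soft_threshold(v, t):
    """Prox operator of t*||.||_1 (soft thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_gradient_l1(A, b, lam, x0, max_iter=500, eta=2.0, L0=1.0, tol=1e-8):
    """Proximal gradient for 0.5*||Ax-b||^2 + lam*||x||_1,
    with backtracking on the prox parameter 1/L."""
    x = x0.copy()
    L = L0
    f = lambda z: 0.5 * np.sum((A @ z - b) ** 2)
    for _ in range(max_iter):
        grad = A.T @ (A @ x - b)
        # Backtracking: increase L until the quadratic upper bound holds.
        while True:
            x_new = soft_threshold(x - grad / L, lam / L)
            diff = x_new - x
            if f(x_new) <= f(x) + grad @ diff + 0.5 * L * (diff @ diff):
                break
            L *= eta
        if np.linalg.norm(diff) <= tol:
            x = x_new
            break
        x = x_new
    return x

# Example usage on a small random instance.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 100))
x_true = np.zeros(100); x_true[:5] = 1.0
b = A @ x_true + 0.01 * rng.standard_normal(50)
x_hat = prox_gradient_l1(A, b, lam=0.1, x0=np.zeros(100))
```

The backtracking step is one of the simpler strategies for choosing the prox parameter; the talk discusses the theoretical and practical implications of such strategies, including in accelerated and alternating direction variants.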
