We develop and study the iteration complexity of a class of randomized parallel lock-free (asynchronous) first-order methods applied to the problem of minimizing a partially separable convex function. Our methods are especially well suited for big data optimization applications.
In special cases our generic algorithm reduces to parallel gradient descent, parallel stochastic gradient descent, parallel randomized block coordinate descent and Hogwild! [1]. With the exception of Hogwild!, which was analyzed previously, our results are the first complexity estimates for lock-free variants of these methods; for Hogwild! we give vastly improved complexity bounds.
We contrast the efficiency of our approach with that of synchronous parallel coordinate descent methods [2] applied to the same problem.
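To illustrate the lock-free (Hogwild!-style) update pattern discussed above, here is a minimal sketch in Python: several threads repeatedly apply sparse stochastic gradient updates to a shared parameter vector without any locking. The toy objective, the function name `hogwild_sgd`, and all parameters below are illustrative assumptions, not the paper's actual algorithm or notation.

```python
import threading

def hogwild_sgd(data, x, stepsize=0.1, epochs=50, n_threads=4):
    """Hogwild!-style sketch: threads update shared x with no locks.

    `data` is a list of samples (i, a, b); each sample contributes the
    partially separable term (a * x[i] - b)**2 / 2, touching only
    coordinate i, so concurrent updates rarely (here: never) conflict.
    """
    def worker(samples):
        for _ in range(epochs):
            for (i, a, b) in samples:
                grad = a * (a * x[i] - b)   # gradient w.r.t. x[i] only
                x[i] -= stepsize * grad     # lock-free write to shared x

    # Split the samples across threads; no synchronization beyond join().
    chunks = [data[t::n_threads] for t in range(n_threads)]
    threads = [threading.Thread(target=worker, args=(c,)) for c in chunks]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return x

# Toy separable problem: minimize sum_i (x[i] - b_i)^2 / 2, solution x[i] = b_i.
data = [(i, 1.0, float(i)) for i in range(8)]
x = hogwild_sgd(data, [0.0] * 8)
```

Because each sample touches a single coordinate, the updates are sparse; this sparsity is what makes lock-free execution viable despite concurrent unsynchronized writes.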
References:
[1] F. Niu, B. Recht, C. Ré, and S. Wright, Hogwild!: A lock-free approach to parallelizing stochastic gradient descent, NIPS 2011.
[2] P. Richtárik and M. Takáč, Parallel coordinate descent methods for big data optimization, arXiv:1212.0873, 2012.