### Nathan Srebro (Technion / TTI Chicago / University of Chicago)

#### Distributed stochastic optimization

*Joint work with Ohad Shamir.*

*Wednesday 20 August 2014 at 15.00, JCMB 6206*

##### Abstract

We consider the problem of distributed stochastic optimization, where each of
several machines has access to samples from the same source distribution, and
the goal is to jointly optimize the expected objective w.r.t. the source
distribution, minimizing: (1) overall runtime; (2) communication costs; (3)
number of samples used. We study this problem systematically, highlighting
fundamental limitations, and differences versus distributed consensus problems
where each machine has a different, independent objective. We show that the
best-known guarantees are obtained by a mini-batched SGD approach, and
contrast its runtime and sample costs with those of other distributed
optimization algorithms, including distributed ADMM and Newton-like and
quasi-Newton approaches.
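To make the mini-batched SGD idea concrete, the following is a minimal simulation sketch (not the speakers' code): each of several machines draws its own mini-batch i.i.d. from the same source distribution, computes a local stochastic gradient, and a single communication round averages those gradients before one step on the shared iterate. The least-squares objective, machine count, batch size, and step size are all illustrative assumptions.

```python
import numpy as np

def distributed_minibatch_sgd(num_machines=4, batch_per_machine=32,
                              rounds=200, dim=5, lr=0.1, seed=0):
    """Simulate distributed mini-batch SGD on a shared source distribution.

    Every machine samples from the SAME distribution (unlike consensus
    problems, where each machine has its own objective). Per round there
    is one communication step: averaging the local gradients, which is
    equivalent to SGD with batch size num_machines * batch_per_machine.
    """
    rng = np.random.default_rng(seed)
    w_star = rng.normal(size=dim)   # unknown minimizer of the expected loss
    w = np.zeros(dim)               # shared iterate

    for _ in range(rounds):
        local_grads = []
        for _ in range(num_machines):
            # Each machine draws its own i.i.d. mini-batch (x, y) pairs:
            X = rng.normal(size=(batch_per_machine, dim))
            y = X @ w_star + 0.1 * rng.normal(size=batch_per_machine)
            # Local stochastic gradient of the squared loss:
            local_grads.append(X.T @ (X @ w - y) / batch_per_machine)
        # One communication round: average gradients, then one SGD step.
        w -= lr * np.mean(local_grads, axis=0)
    return w, w_star
```

Under this toy model the averaged-gradient step recovers the minimizer to within the noise level, while each machine only ever communicates one gradient vector per round.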
