Title: A Stochastic Gradient Method with an Exponential Convergence Rate for Finite Training Sets
Speaker: Ryan Kinnear
Affiliation: University of Waterloo
Room: MC 5479
Abstract:
We will be discussing the paper of the same title by Le Roux, Schmidt, and Bach. The authors propose a new stochastic gradient method for optimizing the sum of a finite set of smooth functions, where the sum is strongly convex. While standard stochastic gradient methods converge at sublinear rates for this problem, the proposed method incorporates a memory of previous gradient values in order to achieve a linear convergence rate. In a machine learning context, numerical experiments indicate that the new algorithm can dramatically outperform standard algorithms, both in optimizing the training error and in reducing the test error quickly.
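To give a flavor of the gradient-memory idea, below is a minimal Python sketch of the kind of update the paper proposes (the stochastic average gradient, SAG): each iteration refreshes the stored gradient of one randomly chosen component and steps along the average of all stored gradients. The names (sag, grad_i, step_size) and the demo's step-size choice are illustrative assumptions, not the paper's exact algorithm or constants.

import numpy as np

def sag(grad_i, x0, n, step_size, num_iters, seed=0):
    # Sketch of a stochastic-average-gradient update.
    # grad_i(x, i): gradient of the i-th smooth component f_i at x.
    # Each step refreshes one stored gradient, then moves along the
    # average of all n stored gradients (initialized here to zero).
    rng = np.random.default_rng(seed)
    x = x0.astype(float).copy()
    memory = np.zeros((n, x.size))   # last gradient seen for each f_i
    avg = np.zeros_like(x)           # running average of stored gradients
    for _ in range(num_iters):
        i = rng.integers(n)
        g = grad_i(x, i)
        avg += (g - memory[i]) / n   # O(dim) incremental update of the average
        memory[i] = g
        x -= step_size * avg
    return x

# Toy strongly convex demo: least squares, f_i(x) = 0.5 * (A[i] @ x - b[i])**2
rng = np.random.default_rng(1)
A, b = rng.normal(size=(50, 5)), rng.normal(size=50)
grad = lambda x, i: (A[i] @ x - b[i]) * A[i]
L_max = np.max(np.sum(A ** 2, axis=1))     # per-component Lipschitz constant
x_hat = sag(grad, np.zeros(5), n=50,       # conservative step of order 1/L_max
            step_size=1.0 / (16 * L_max), num_iters=20000)
print(np.allclose(x_hat, np.linalg.lstsq(A, b, rcond=None)[0], atol=1e-2))

Note that the stored gradients cost O(n * dim) memory in this sketch; for linear models the gradients are scalar multiples of the data points, so the memory can be reduced to O(n), one of the practical points discussed in the paper.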