
Variance Reduced Stochastic Gradient Descent with Sufficient Decrease

Abstract · Mar 20, 2017 15:43

cs-lg math-oc stat-ml

Arxiv Abstract

  • Fanhua Shang
  • Yuanyuan Liu
  • James Cheng
  • Kelvin Kai Wing Ng
  • Yuichi Yoshida

The sufficient decrease technique has been widely used in deterministic optimization, even for non-convex problems, for example in line-search methods. Motivated by these successes, we propose a novel sufficient decrease framework for a class of variance reduced stochastic gradient descent (VR-SGD) methods such as SVRG and SAGA. To enforce sufficient decrease in stochastic optimization, we design a new sufficient decrease criterion and introduce a coefficient \theta that satisfies it, determining whether to shrink, expand, or move in the opposite direction (i.e., scaling the variable x to \theta x), and we give two specific update rules for Lasso and ridge regression. Moreover, we analyze the convergence properties of our algorithms for strongly convex problems and show that both attain linear convergence rates. We also provide convergence guarantees for both algorithms on non-strongly convex problems. Our experimental results further verify that our algorithms outperform their counterparts.
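To make the idea concrete, here is a minimal Python sketch of SVRG for ridge regression with a sufficient-decrease-style scaling step. The candidate-set search over \theta, the step size, the epoch counts, and the function names are all illustrative assumptions, not the paper's method: the paper derives closed-form \theta updates for Lasso and ridge regression, which this sketch does not reproduce.

```python
import numpy as np

# Minimal SVRG sketch for ridge regression:
#   f(x) = (1/2n) * ||A x - b||^2 + (lam/2) * ||x||^2
# The theta step below is only a stand-in for the paper's sufficient-decrease
# coefficient: here we greedily pick theta from a small candidate set so the
# full objective does not increase. The paper instead gives closed-form
# theta update rules for Lasso and ridge regression.

def ridge_obj(A, b, lam, x):
    n = A.shape[0]
    r = A @ x - b
    return 0.5 * (r @ r) / n + 0.5 * lam * (x @ x)

def ridge_grad_i(A, b, lam, x, i):
    # Stochastic gradient of the i-th component plus the regularizer.
    return A[i] * (A[i] @ x - b[i]) + lam * x

def svrg_sufficient_decrease(A, b, lam=1e-2, eta=0.1, epochs=20, m=None, seed=0):
    rng = np.random.default_rng(seed)
    n, d = A.shape
    m = m or 2 * n                       # inner-loop length
    x_tilde = np.zeros(d)                # snapshot point
    for _ in range(epochs):
        # Full gradient at the snapshot (the SVRG anchor).
        mu = A.T @ (A @ x_tilde - b) / n + lam * x_tilde
        x = x_tilde.copy()
        for _ in range(m):
            i = rng.integers(n)
            v = ridge_grad_i(A, b, lam, x, i) - ridge_grad_i(A, b, lam, x_tilde, i) + mu
            x = x - eta * v
        # Hypothetical sufficient-decrease step: scale the iterate by theta,
        # allowing shrink (theta < 1), expand (theta > 1), or sign flip (theta < 0).
        best_theta, best_val = 1.0, ridge_obj(A, b, lam, x)
        for theta in (-0.5, 0.5, 0.9, 1.1):
            val = ridge_obj(A, b, lam, theta * x)
            if val < best_val:
                best_theta, best_val = theta, val
        x_tilde = best_theta * x
    return x_tilde

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((200, 10))
    x_true = rng.standard_normal(10)
    b = A @ x_true + 0.1 * rng.standard_normal(200)
    x_hat = svrg_sufficient_decrease(A, b)
    print("objective:", ridge_obj(A, b, 1e-2, x_hat))
```

The design choice to scale the whole iterate by a single scalar keeps the sufficient-decrease check cheap (one objective evaluation per candidate) while still letting the method shrink, expand, or reverse the update, which is the behavior the abstract attributes to \theta.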

Read the paper (pdf) »