Accelerated Incremental Gradient Descent using Momentum Acceleration with Scaling Factor
Yuanyuan Liu, Fanhua Shang, Licheng Jiao
Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence
Main track. Pages 3045-3051.
https://doi.org/10.24963/ijcai.2019/422
Recently, variance-reduced incremental gradient descent methods (e.g., SAGA) have made exciting progress (e.g., linear convergence for strongly convex (SC) problems). However, existing accelerated methods (e.g., point-SAGA) suffer from drawbacks such as inflexibility. In this paper, we design a novel and simple momentum to accelerate the classical SAGA algorithm, and propose a direct accelerated incremental gradient descent algorithm. In particular, our theoretical result shows that our algorithm attains the best-known oracle complexity for strongly convex problems and an improved convergence rate for the case of n ≥ L/μ. We also present experimental results that validate our theoretical analysis and demonstrate the effectiveness of our algorithm.
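The paper's exact momentum update and scaling factor are defined in the full text; as a rough illustration of the general idea, here is a minimal sketch of the classical SAGA gradient estimator combined with a generic heavy-ball momentum term. The function names, the momentum parameter beta, and the least-squares example below are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

def saga_momentum(grad_i, n, dim, eta=0.01, beta=0.3, epochs=50, seed=0):
    """Sketch of SAGA with a generic heavy-ball momentum term.

    NOTE: illustrative only; the momentum and scaling factor proposed
    by Liu, Shang, and Jiao (IJCAI 2019) differ from this sketch.
    grad_i(i, x) should return the gradient of the i-th component f_i at x.
    """
    rng = np.random.default_rng(seed)
    x = np.zeros(dim)
    x_prev = x.copy()
    table = np.array([grad_i(i, x) for i in range(n)])  # stored gradients
    g_avg = table.mean(axis=0)                          # running average
    for _ in range(epochs * n):
        i = rng.integers(n)
        g_new = grad_i(i, x)
        # SAGA variance-reduced gradient estimator
        v = g_new - table[i] + g_avg
        # heavy-ball momentum step, scaled by beta
        x_next = x - eta * v + beta * (x - x_prev)
        # update the gradient table and its running average
        g_avg += (g_new - table[i]) / n
        table[i] = g_new
        x_prev, x = x, x_next
    return x

# Usage on a toy least-squares problem, f_i(x) = 0.5 * (a_i @ x - b_i)^2
A = np.random.default_rng(1).standard_normal((100, 5))
b = A @ np.ones(5)
x_hat = saga_momentum(lambda i, x: (A[i] @ x - b[i]) * A[i], n=100, dim=5)
```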
Keywords:
Machine Learning: Classification
Machine Learning Applications: Other Applications