Efficient Inexact Proximal Gradient Algorithm for Nonconvex Problems
Quanming Yao, James T. Kwok, Fei Gao, Wei Chen, Tie-Yan Liu
Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence
Main track. Pages 3308-3314.
https://doi.org/10.24963/ijcai.2017/462
While the proximal gradient algorithm was originally designed for convex optimization, several variants have recently been proposed for nonconvex problems. Among them, nmAPG [Li and Lin, 2015] is the state of the art. However, it is inefficient when the proximal step has no closed-form solution, or when such a solution exists but is expensive, as it requires more than one proximal step to be solved exactly in each iteration. In this paper, we propose an efficient inexact accelerated proximal gradient (niAPG) algorithm for nonconvex problems. In each iteration, it requires only one inexact (less expensive) proximal step. Convergence to a critical point is still guaranteed, and an O(1/k) convergence rate is derived. Experiments on image inpainting and matrix completion problems demonstrate that the proposed algorithm has performance comparable to the state of the art, but is much faster.
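To illustrate the idea of an inexact proximal step, the following is a minimal sketch (not the paper's niAPG, which additionally uses nonmonotone acceleration). It minimizes a smooth least-squares term plus a Huber-smoothed L1 penalty, whose proximal subproblem has no closed form; the subproblem is instead solved approximately with a few inner gradient steps. All names and parameter choices here are illustrative assumptions.

```python
import numpy as np

def inexact_prox(v, lam, delta, inner_iters=5):
    """Approximately solve min_u 0.5*||u - v||^2 + lam * huber_delta(u).
    No closed form exists, so we take a few gradient steps (the 'inexact'
    proximal step); inner_iters controls the accuracy/cost trade-off."""
    def huber_grad(u):
        # Gradient of the Huber-smoothed absolute value.
        return np.where(np.abs(u) <= delta, u / delta, np.sign(u))
    step = 1.0 / (1.0 + lam / delta)  # 1 / smoothness constant of subproblem
    u = v.copy()
    for _ in range(inner_iters):
        u -= step * ((u - v) + lam * huber_grad(u))
    return u

def inexact_pg(A, b, lam=0.1, delta=0.01, outer_iters=200):
    """Proximal gradient descent with one inexact proximal step per
    iteration, for min_x 0.5*||Ax - b||^2 + lam * huber_delta(x)."""
    L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of the smooth gradient
    x = np.zeros(A.shape[1])
    for _ in range(outer_iters):
        grad = A.T @ (A @ x - b)
        x = inexact_prox(x - grad / L, lam / L, delta)
    return x
```

Each outer iteration performs one gradient step on the smooth part followed by a single inexact proximal step, mirroring the per-iteration cost structure the abstract contrasts with nmAPG's multiple exact proximal steps.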
Keywords:
Machine Learning: Machine Learning
Machine Learning: Structured Learning