Adaptive Learning Rate via Covariance Matrix Based Preconditioning for Deep Neural Networks

Yasutoshi Ida, Yasuhiro Fujiwara, Sotetsu Iwamura

Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence
Main track. Pages 1923-1929. https://doi.org/10.24963/ijcai.2017/267

Adaptive learning rate algorithms such as RMSProp are widely used for training deep neural networks. RMSProp offers efficient training because it uses first-order gradients to approximate Hessian-based preconditioning. However, since first-order gradients include noise introduced by stochastic optimization, the approximation can be inaccurate. In this paper, we propose a novel adaptive learning rate algorithm called SDProp. Its key idea is to handle the noise effectively through preconditioning based on the covariance matrix of the gradients. For various neural networks, our approach is more efficient and effective than RMSProp and its variant.
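To make the contrast concrete, the sketch below compares a standard RMSProp step, which preconditions by a running *uncentered* second moment of the gradients, with a hypothetical SDProp-style step that preconditions by a running *centered* second moment (the diagonal of a gradient covariance estimate). The exact update rule, hyperparameter names (`gamma`, `lr`, `eps`), and the particular centered-moment recursion used here are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def rmsprop_step(theta, g, v, lr=0.1, gamma=0.99, eps=1e-8):
    # RMSProp: accumulate the uncentered second moment of the gradient,
    # then scale the step by its square root.
    v = gamma * v + (1 - gamma) * g ** 2
    theta = theta - lr * g / (np.sqrt(v) + eps)
    return theta, v

def sdprop_step(theta, g, m, c, lr=0.1, gamma=0.99, eps=1e-8):
    # Hypothetical SDProp-style step (illustrative, not the paper's exact
    # rule): track a running mean m of the gradient and a running *centered*
    # second moment c, i.e. a diagonal covariance estimate. Centering
    # separates gradient noise (deviation from the mean) from the mean
    # gradient direction before preconditioning.
    c = gamma * c + gamma * (1 - gamma) * (g - m) ** 2
    m = gamma * m + (1 - gamma) * g
    theta = theta - lr * g / (np.sqrt(c) + eps)
    return theta, m, c
```

Under this reading, a noisy coordinate accumulates a large centered moment `c` and therefore takes smaller steps, whereas RMSProp's uncentered `v` conflates a consistently large gradient with a noisy one.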
Keywords:
Machine Learning: Deep Learning
Machine Learning: Machine Learning
Machine Learning: Neural Networks
Robotics and Vision: Vision and Perception