Private Stochastic Non-convex Optimization with Improved Utility Rates
Qiuchen Zhang, Jing Ma, Jian Lou, Li Xiong

Proceedings of the Thirtieth International Joint Conference on Artificial Intelligence
Main Track. Pages 3370-3376. https://doi.org/10.24963/ijcai.2021/464

We study differentially private (DP) stochastic nonconvex optimization, with a focus on its under-studied utility measures in terms of the expected excess empirical and population risks. While these excess risks are extensively studied for convex optimization, they are rarely studied for nonconvex optimization, especially the expected excess population risk. For the convex case, recent studies show that, under certain conditions, private optimization can achieve the same order of excess population risk as nonprivate optimization. Whether such an ideal excess population risk is achievable remains an open question for the nonconvex case. In this paper, we make progress toward an affirmative answer to this open problem: under certain conditions (i.e., well-conditioned nonconvexity), DP nonconvex optimization can indeed achieve the same excess population risk as a nonprivate algorithm in most common parameter regimes. We achieve these utility rates, improved over existing results, by designing and analyzing a stagewise DP-SGD with early momentum algorithm. We derive bounds on both the excess empirical risk and the excess population risk while guaranteeing differential privacy. Our algorithm also yields the first known excess empirical and population risk bounds for DP-SGD with momentum. Experimental results on shallow and deep neural networks, applied respectively to simple and complex real datasets, corroborate the theoretical results.
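The abstract's core mechanism combines three standard ingredients: per-example gradient clipping with Gaussian noise (DP-SGD), a momentum buffer used only during the early iterations of each stage ("early momentum"), and a geometrically decaying learning rate across stages ("stagewise"). The following is a minimal sketch of how those pieces fit together, not the paper's exact algorithm; all parameter names (`clip`, `noise_mult`, `beta`, `momentum_steps`) and the specific schedule are illustrative assumptions:

```python
import numpy as np

def stagewise_dp_sgd_momentum(grad_fn, w0, data, stages=3, steps_per_stage=100,
                              lr0=0.1, clip=1.0, noise_mult=1.0, beta=0.9,
                              momentum_steps=20, rng=None):
    """Illustrative sketch of stagewise DP-SGD with early momentum.

    - Each sampled gradient is clipped to norm `clip` and perturbed with
      Gaussian noise of scale `noise_mult * clip` (standard DP-SGD step).
    - Momentum (factor `beta`) is applied only for the first
      `momentum_steps` iterations of each stage ("early momentum").
    - The learning rate is halved between stages ("stagewise").
    All hyperparameters here are hypothetical defaults, not the paper's.
    """
    rng = np.random.default_rng(rng)
    w = np.array(w0, dtype=float)
    lr = lr0
    for s in range(stages):
        v = np.zeros_like(w)  # momentum buffer, reset at each stage
        for t in range(steps_per_stage):
            x = data[rng.integers(len(data))]                 # sample one example
            g = grad_fn(w, x)
            g = g / max(1.0, np.linalg.norm(g) / clip)        # clip gradient norm
            g = g + rng.normal(0.0, noise_mult * clip, g.shape)  # add DP noise
            if t < momentum_steps:                            # early momentum phase
                v = beta * v + g
                w = w - lr * v
            else:                                             # plain noisy SGD phase
                w = w - lr * g
        lr *= 0.5  # stagewise learning-rate decay
    return w
```

On a toy quadratic objective (e.g. `grad_fn = lambda w, x: w - x`, whose minimizer is the data mean), the iterate converges to a noisy neighborhood of the optimum whose radius shrinks with the stagewise decay; the privacy guarantee itself would follow from a separate accounting of the Gaussian mechanism across iterations.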
Keywords:
Machine Learning: Learning Theory
Data Mining: Privacy Preserving Data Mining