Fast Rates for Empirical Risk Minimization of Strict Saddle Problems
Conference on Learning Theory (2017)
Abstract
We derive bounds on the sample complexity of empirical risk minimization (ERM) in the context of minimizing non-convex risks that admit the strict saddle property. Recent progress in non-convex optimization has yielded efficient algorithms for minimizing such functions. Our results imply that these efficient algorithms are statistically stable and also generalize well. In particular, we derive fast rates which resemble the bounds that are often attained in the strongly convex setting. We specify our bounds to Principal Component Analysis and Independent Component Analysis. Our results and techniques may pave the way for statistical analyses of additional strict saddle problems.
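To make the central notion concrete: a function has the strict saddle property when every stationary point is either a local minimum or has a direction of strictly negative curvature (a Hessian eigenvalue bounded below zero), which is what lets first-order methods escape saddles. A minimal illustration, not taken from the paper, using the textbook example f(x, y) = x^2 - y^2:

```python
import numpy as np

# f(x, y) = x^2 - y^2 has a stationary point at the origin that is a
# strict saddle: the gradient vanishes there, but the Hessian has a
# strictly negative eigenvalue, i.e. an escape direction exists.

def grad(p):
    x, y = p
    return np.array([2.0 * x, -2.0 * y])

def hessian(p):
    # Constant Hessian for this quadratic example.
    return np.array([[2.0, 0.0],
                     [0.0, -2.0]])

origin = np.array([0.0, 0.0])
assert np.allclose(grad(origin), 0.0)           # origin is stationary
eigvals = np.linalg.eigvalsh(hessian(origin))
print(eigvals.min())                            # -2.0: strictly negative curvature
```

Objectives such as PCA (maximizing the Rayleigh quotient over the sphere) satisfy this property globally, which is why the efficient saddle-escaping algorithms the abstract refers to apply to them.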
Keywords
empirical risk minimization, fast rates