Near-Optimal SQ Lower Bounds for Agnostically Learning Halfspaces and ReLUs under Gaussian Marginals

NIPS 2020

Cited by 61 | Views 149
Abstract
We study the fundamental problems of agnostically learning halfspaces and ReLUs under Gaussian marginals. In the former problem, given labeled examples (𝐱, y) from an unknown distribution on ℝ^d × {±1}, whose marginal distribution on 𝐱 is the standard Gaussian and whose labels y can be arbitrary, the goal is to output a hypothesis with 0-1 loss at most OPT + ϵ, where OPT is the 0-1 loss of the best-fitting halfspace. In the latter problem, given labeled examples (𝐱, y) from an unknown distribution on ℝ^d × ℝ, whose marginal distribution on 𝐱 is the standard Gaussian and whose labels y can be arbitrary, the goal is to output a hypothesis with square loss at most OPT + ϵ, where OPT is the square loss of the best-fitting ReLU. We prove Statistical Query (SQ) lower bounds of d^poly(1/ϵ) for both of these problems. Our SQ lower bounds provide strong evidence that current upper bounds for these tasks are essentially best possible.
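The two loss notions in the abstract can be made concrete with a small numerical sketch. The snippet below (an illustrative assumption, not code from the paper) draws 𝐱 from the standard Gaussian, fixes a hypothetical target direction `w_true`, and computes the empirical 0-1 loss of a halfspace hypothesis sign(⟨w, x⟩) and the empirical square loss of a ReLU hypothesis max(0, ⟨w, x⟩):

```python
import numpy as np

# Illustrative sketch of the two losses defined in the abstract.
# Halfspace hypothesis: h_w(x) = sign(<w, x>), 0-1 loss = Pr[h_w(x) != y].
# ReLU hypothesis: r_w(x) = max(0, <w, x>), square loss = E[(r_w(x) - y)^2].

rng = np.random.default_rng(0)
d, n = 5, 10_000

# Standard Gaussian marginal on x, matching the paper's setting.
X = rng.standard_normal((n, d))

# A hypothetical target direction (in the agnostic model the labels
# may be arbitrary; here we generate clean labels for illustration).
w_true = np.zeros(d)
w_true[0] = 1.0

def zero_one_loss(w, X, y):
    """Empirical 0-1 loss of the halfspace sign(<w, x>)."""
    preds = np.sign(X @ w)
    preds[preds == 0] = 1.0  # break ties toward +1
    return float(np.mean(preds != y))

def square_loss(w, X, y):
    """Empirical square loss of the ReLU max(0, <w, x>)."""
    return float(np.mean((np.maximum(0.0, X @ w) - y) ** 2))

# Halfspace task: labels in {±1}.
y_cls = np.sign(X @ w_true)
y_cls[y_cls == 0] = 1.0

# ReLU task: real-valued labels.
y_reg = np.maximum(0.0, X @ w_true)

print(zero_one_loss(w_true, X, y_cls))  # 0.0 on these clean labels
print(square_loss(w_true, X, y_reg))    # 0.0 on these clean labels
```

With clean labels both losses vanish at the true direction; in the agnostic setting the labels are arbitrary, so OPT is the infimum of these quantities over all halfspaces (respectively, ReLUs), and the learner must get within ϵ of it.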