Time-Space Hardness Of Learning Sparse Parities

STOC '17: Symposium on Theory of Computing, Montreal, Canada, June 2017

Abstract
We define a concept class F to be time-space hard (or memory-samples hard) if any learning algorithm for F requires either a memory of size super-linear in n or a number of samples super-polynomial in n, where n is the length of one sample. A recent work shows that the class of all parity functions is time-space hard [Raz, FOCS '16]. Building on [Raz, FOCS '16], we show that the class of all sparse parities of Hamming weight l is time-space hard, as long as l >= omega(log n / log log n). Consequently, linear-size DNF formulas, linear-size decision trees, and logarithmic-size juntas are all time-space hard. Our result is more general and provides time-space lower bounds for learning any concept class of parity functions.

We give applications of our results in the field of bounded-storage cryptography. For example, for every omega(log n) <= k <= n, we obtain an encryption scheme that requires a private key of length k and time complexity of n per encryption/decryption of each bit, and is provably and unconditionally secure as long as the attacker uses at most o(nk) memory bits and the scheme is used at most 2^{o(k)} times. Previously, this was known only for k = n [Raz, FOCS '16].
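To make the objects in the abstract concrete, the following is a minimal illustrative sketch (not the paper's construction) of a sparse-parity sample generator and of a Raz-style one-bit encryption in which the label of a fresh sample masks the plaintext bit; the function names, the parameter n = 32, and the example key are all assumptions chosen for illustration.

```python
import secrets

def sample_sparse_parity(key_indices, n):
    # One labeled sample (x, <x, 1_S> mod 2), where S = key_indices is the
    # hidden sparse set of size l; a learner must recover S from such samples.
    x = [secrets.randbelow(2) for _ in range(n)]
    label = sum(x[i] for i in key_indices) % 2
    return x, label

def encrypt_bit(key_indices, n, b):
    # Hypothetical sketch of the bounded-storage idea: mask the plaintext
    # bit b with the parity label of a fresh random sample.
    x, mask = sample_sparse_parity(key_indices, n)
    return x, mask ^ b

def decrypt_bit(key_indices, ciphertext):
    # The key holder recomputes the parity in time O(n) and unmasks the bit.
    x, c = ciphertext
    return c ^ (sum(x[i] for i in key_indices) % 2)

n = 32
key = [1, 5, 9, 17]  # hidden sparse set S with Hamming weight l = 4 (example)
ct = encrypt_bit(key, n, 1)
assert decrypt_bit(key, ct) == 1
```

Under the paper's result, an attacker with o(nk) memory who sees at most 2^{o(k)} such ciphertexts cannot recover the key; the sketch above only shows the functional (correctness) side of the scheme.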
Keywords
lower bounds, bounded storage cryptography, branching program, Fourier analysis, PAC learning, time-space tradeoff