When Is Amplification Necessary for Composition in Randomized Query Complexity?

APPROX/RANDOM 2020

Abstract
Suppose we have randomized decision trees for an outer function $f$ and an inner function $g$. The natural approach for obtaining a randomized decision tree for the composed function $(f\circ g^n)(x^1,\ldots,x^n)=f(g(x^1),\ldots,g(x^n))$ involves amplifying the success probability of the decision tree for $g$, so that a union bound can be used to bound the error probability over all the coordinates. The amplification introduces a logarithmic factor cost overhead. We study the question: When is this log factor necessary? We show that when the outer function is parity or majority, the log factor can be necessary, even for models that are more powerful than plain randomized decision trees. Our results are related to, but qualitatively strengthen in various ways, known results about decision trees with noisy inputs.
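The amplification step described above can be illustrated with a small simulation. The sketch below is not from the paper; it assumes a hypothetical randomized procedure `noisy_eval` that errs with probability 1/3 (the standard error bound for randomized decision trees), and shows how a majority vote over k independent runs drives the error down exponentially in k. Choosing k = Θ(log n) pushes the per-coordinate error below 1/(3n), so a union bound over the n inner coordinates keeps the composed error below 1/3; this choice of k is exactly the logarithmic-factor overhead the paper studies.

```python
import random
from collections import Counter

def noisy_eval(truth, rng, p_err=1/3):
    # Stand-in for one randomized evaluation of the inner function g:
    # returns the correct bit except with probability p_err.
    return truth if rng.random() >= p_err else 1 - truth

def amplify(truth, rng, k):
    # Majority vote over k independent runs. By a Chernoff bound the
    # error probability drops to exp(-Omega(k)), so k = Theta(log n)
    # suffices to make it below 1/(3n) for a union bound over n calls.
    votes = [noisy_eval(truth, rng) for _ in range(k)]
    return Counter(votes).most_common(1)[0][0]

rng = random.Random(0)
trials = 2000
raw_errs = sum(noisy_eval(1, rng) != 1 for _ in range(trials))
amp_errs = sum(amplify(1, rng, k=25) != 1 for _ in range(trials))
print(f"error rate without amplification: {raw_errs / trials:.3f}")
print(f"error rate with 25-vote majority: {amp_errs / trials:.3f}")
```

Empirically the unamplified error rate sits near 1/3 while the 25-vote majority is far smaller; the paper's question is whether, for outer functions like parity or majority, this repetition cost can ever be avoided.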
Keywords
randomized query complexity, amplification, composition