The Entropy of Backwards Analysis

Symposium on Discrete Algorithms (2017)

Abstract
Backwards analysis, first popularized by Seidel, is often the simplest and most elegant way of analyzing a randomized algorithm. It applies to incremental algorithms where elements are added one by one following some random permutation, e.g., incremental Delaunay triangulation of a point set, where points are inserted one at a time and we always maintain the Delaunay triangulation of the points inserted so far. For backwards analysis, we think of the permutation as generated backwards, implying that the $i$th point in the permutation is picked uniformly at random from the $i$ points not yet picked in the backwards direction. Backwards analysis has also been applied elegantly by Chan to the randomized linear-time minimum spanning tree algorithm of Karger, Klein, and Tarjan. The question considered in this paper is how much randomness we need in order to trust the expected bounds obtained using backwards analysis, both exactly and approximately. For the exact case, it turns out that a random permutation works if and only if it is minwise, that is, for any given subset, each element has the same chance of being first. Minwise permutations are known to have $\Theta(n)$ entropy, and this is then also what we need for exact backwards analysis. However, when it comes to approximation, the two concepts diverge dramatically. To get backwards analysis to hold within a factor $\alpha$, the random permutation needs entropy $\Omega(n/\alpha)$. This contrasts with minwise permutations, where it is known that a $1+\varepsilon$ approximation only needs $\Theta(\log(n/\varepsilon))$ entropy. Our negative result for backwards analysis essentially shows that it is as abstract as any analysis based on full randomness.
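To make the backwards-analysis argument concrete, consider a classic toy instance not taken from the paper itself: counting how often a running minimum is updated while scanning a random permutation. Viewing the permutation backwards, the $i$th element is a uniform choice among the first $i$ elements, so it is the minimum of that prefix with probability exactly $1/i$, giving an expected update count of $H_n = \sum_{i=1}^{n} 1/i$. The following Python sketch (illustrative only; the function name and parameters are ours) empirically checks this prediction under a fully random permutation:

```python
import random

def expected_min_updates(n: int, trials: int = 100_000) -> float:
    """Empirically estimate the expected number of running-minimum
    updates while scanning a uniformly random permutation of n
    distinct values. Backwards analysis predicts the harmonic
    number H_n: the i-th element (seen backwards) is a uniform
    choice among the first i, hence the prefix minimum w.p. 1/i.
    """
    total = 0
    for _ in range(trials):
        perm = random.sample(range(n), n)  # fully random permutation
        current_min = float("inf")
        for x in perm:
            if x < current_min:  # running minimum is updated
                current_min = x
                total += 1
    return total / trials

if __name__ == "__main__":
    n = 20
    harmonic = sum(1.0 / i for i in range(1, n + 1))
    print(f"empirical estimate: {expected_min_updates(n):.3f}")
    print(f"H_n prediction:     {harmonic:.3f}")
```

The $1/i$ step is exactly where the paper's question bites: it holds as stated only if the permutation distribution gives every element of each prefix the same chance of being last, which is the minwise property the abstract identifies as necessary and sufficient for exact backwards analysis.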
Keywords
entropy, analysis