Coding sets with asymmetric information.

arXiv: Data Structures and Algorithms (2018)

Abstract
We study the following one-way asymmetric transmission problem, also a variant of model-based compressed sensing: a resource-limited encoder has to report a small set $S$ from a universe of $N$ items to a more powerful decoder (server). The distinguishing feature is asymmetric information: the subset $S$ is comprised of i.i.d. samples from a prior distribution $\mu$, and $\mu$ is only known to the decoder. The goal for the encoder is to encode $S$ obliviously, while achieving the information-theoretic bound of $|S| \cdot H(\mu)$, i.e., the Shannon entropy bound. We first show that any such compression scheme must be {\em randomized}, if it gains non-trivially from the prior $\mu$. This stands in contrast to the symmetric case (when both the encoder and decoder know $\mu$), where the Huffman code provides a near-optimal deterministic solution. On the other hand, a rather simple argument shows that, when $|S|=k$, a random linear code achieves near-optimal communication rate of about $k\cdot H(\mu)$ bits. Alas, the resulting scheme has prohibitive decoding time: about ${N\choose k} \approx (N/k)^k$. Our main result is a computationally efficient and linear coding scheme, which achieves an $O(\lg\lg N)$-competitive communication ratio compared to the optimal benchmark, and runs in $\text{poly}(N,k)$ time. Our "multi-level" coding scheme uses a combination of hashing and syndrome-decoding of Reed-Solomon codes, and relies on viewing the (unknown) prior $\mu$ as a rather small convex combination of uniform ("flat") distributions.
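As a rough illustration of the two quantities the abstract compares, here is a minimal Python sketch of the information-theoretic benchmark $k \cdot H(\mu)$ and of the brute-force random-linear-code route. This is not the paper's multi-level construction: the function names (`sketch`, `decode`), the toy prior, and the GF(2) random parity sketch standing in for a generic random linear code are all illustrative assumptions.

```python
# Toy sketch (not the paper's scheme): an mu-oblivious encoder sends random
# GF(2) parities of the set's indicator vector; the mu-aware decoder
# brute-forces candidate sets in decreasing prior probability.
import itertools
import math
import random

def entropy_bits(mu):
    """Shannon entropy H(mu) in bits; mu is a list of probabilities."""
    return -sum(p * math.log2(p) for p in mu if p > 0)

def random_parity_matrix(rows, n, rng):
    """Random binary matrix over GF(2): one row per transmitted parity bit."""
    return [[rng.randrange(2) for _ in range(n)] for _ in range(rows)]

def sketch(A, S, n):
    """Encoder side, oblivious to mu: sends A*x over GF(2), where x is the
    indicator vector of S inside the universe {0, ..., n-1}."""
    x = [1 if i in S else 0 for i in range(n)]
    return tuple(sum(a * b for a, b in zip(row, x)) % 2 for row in A)

def decode(A, y, n, k, mu):
    """Decoder side, knowing mu: scans candidate k-sets in decreasing prior
    probability and returns the first whose sketch matches y. This is the
    brute-force route the abstract calls prohibitive: ~ (n choose k) work."""
    cands = sorted(itertools.combinations(range(n), k),
                   key=lambda c: -math.prod(mu[i] for i in c))
    for cand in cands:
        if sketch(A, set(cand), n) == y:
            return set(cand)
    return None

# --- tiny demo (hypothetical parameters) ---
rng = random.Random(0)
N = 16
mu = [0.3, 0.2, 0.1, 0.1] + [0.3 / 12] * 12   # skewed prior, known only to the decoder
S = set(rng.choices(range(N), weights=mu, k=2))  # i.i.d. samples; duplicates collapse
k = len(S)

H = entropy_bits(mu)
rows = math.ceil(k * H) + 6   # ~ k*H(mu) parity bits, plus slack so a false match is unlikely
A = random_parity_matrix(rows, N, rng)
y = sketch(A, S, N)

print(f"entropy bound k*H(mu) = {k * H:.1f} bits vs naive k*lg N = {k * math.log2(N):.1f} bits")
print("decoded:", decode(A, y, N, k, mu), " original:", S)
```

At this toy scale the additive slack swamps the $k \cdot H(\mu)$ rate, and decoding enumerates all $\binom{N}{k}$ candidate sets, which is exactly the exponential cost that the paper's hashing and Reed-Solomon syndrome-decoding construction is designed to avoid.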