Information Spectrum Converse for Minimum Entropy Couplings and Functional Representations

CoRR (2023)

Abstract
Given two jointly distributed random variables $(X,Y)$, a functional representation of $X$ is a random variable $Z$ independent of $Y$, together with a deterministic function $g(\cdot, \cdot)$ such that $X=g(Y,Z)$. The problem of finding a minimum entropy functional representation is known to be equivalent to the problem of finding a minimum entropy coupling where, given a collection of probability distributions $P_1, \dots, P_m$, the goal is to find a coupling $X_1, \dots, X_m$ ($X_i \sim P_i$) with the smallest entropy $H_\alpha(X_1, \dots, X_m)$. This paper presents a new information spectrum converse and applies it to obtain direct lower bounds on minimum entropy in both problems. The new results improve on all known lower bounds, including previous lower bounds based on the concept of majorization. In particular, the presented proofs leverage both the information spectrum and the majorization perspectives on minimum entropy couplings and functional representations.
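To make the coupling problem concrete, here is a minimal, illustrative Python sketch (not taken from the paper): it couples two marginals with a common greedy heuristic that repeatedly pairs the largest remaining probability masses, then compares the coupling's joint entropy against the trivial lower bound max(H(P1), H(P2)). The helper names (`greedy_coupling`, `entropy`) and the example distributions are hypothetical; the paper's contribution is tighter lower bounds than this trivial one, via information spectrum and majorization arguments.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector (zero entries ignored)."""
    p = np.asarray([x for x in p if x > 0], dtype=float)
    return float(-(p * np.log2(p)).sum())

def greedy_coupling(p, q):
    """Greedy heuristic coupling of two distributions.

    Repeatedly pairs the largest remaining masses of p and q; the returned
    list of joint probabilities has row/column sums equal to p and q.
    This is a standard heuristic for low-entropy couplings, not the
    construction analyzed in the paper.
    """
    p, q = list(map(float, p)), list(map(float, q))
    joint = []
    while max(p) > 1e-12 and max(q) > 1e-12:
        i, j = int(np.argmax(p)), int(np.argmax(q))
        m = min(p[i], q[j])
        joint.append(m)
        p[i] -= m
        q[j] -= m
    return joint

if __name__ == "__main__":
    P1 = [0.5, 0.25, 0.25]   # hypothetical example marginals
    P2 = [0.6, 0.4]
    coupling = greedy_coupling(P1, P2)
    # Any coupling's joint entropy is at least max_i H(P_i); the paper's
    # converse bounds strengthen this kind of lower bound.
    print("H(greedy coupling)   =", entropy(coupling))
    print("max marginal entropy =", max(entropy(P1), entropy(P2)))
```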
Keywords
deterministic function,information spectrum converse,jointly distributed random variables,minimum entropy coupling,minimum entropy functional representation,probability distributions