Rate Entropy Function

Dazhuan Xu, Tian Liu, Desen Zhang, Manman Zhao

Journal of Data Acquisition & Processing (2023)

Abstract
We consider the rate entropy function, defined as the infimum of the average mutual information subject to a conditional entropy distortion constraint. Conditional entropy distortion is in fact the expected distortion under logarithmic loss. It can be regarded as a generalized distortion since it does not satisfy non-negativity. Like the rate distortion function, the rate entropy function is essentially a variational problem with no general solution, so we propose three methods to construct a particular solution. Accordingly, closed-form expressions of the rate entropy function are derived for the uniform source and several additive sources. In particular, neither squared error nor magnitude error distortion exists for the Cauchy source, owing to the absence of first-order and higher-order moments, but the entropy distortion does. Entropy distortion varies depending on the source. For Gaussian sources, entropy distortion is equivalent to squared error distortion. For vector Gaussian sources, entropy distortion is equivalent to the determinant of the error covariance matrix, which differs from both covariance matrix distortion and trace distortion.
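A minimal sketch of the definition suggested by the abstract (the notation below is assumed, not taken verbatim from the paper): under logarithmic loss the distortion between a source symbol and a reproduction is d(x, \hat{x}) = -\log q(x \mid \hat{x}), and its expectation, with q the conditional induced by the test channel, equals the conditional entropy H(X \mid \hat{X}) (differential entropy for continuous sources). The rate entropy function at distortion level D can then be written as

    R_E(D) = \inf_{p(\hat{x} \mid x) \,:\, H(X \mid \hat{X}) \le D} I(X; \hat{X}).

For a jointly Gaussian test channel, H(X \mid \hat{X}) = \tfrac{1}{2}\log\big(2\pi e \, \mathbb{E}[(X - \mathbb{E}[X \mid \hat{X}])^2]\big), a monotone function of the mean squared estimation error, which is consistent with the abstract's statement that entropy distortion is equivalent to squared error distortion for Gaussian sources.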
Keywords
Entropy, Distortion, Rate-distortion, Additives, Distortion measurement, Covariance matrices, Mutual information, Rate entropy function, logarithmic loss, entropy distortion measure, uniform source, additive source