Relevant Information Loss Rate

INFORMATION LOSS IN DETERMINISTIC SIGNAL PROCESSING SYSTEMS (2018)

Abstract
The examples in the previous chapter showed that relative information loss yields counter-intuitive results for many practical systems such as quantizers, center clippers, etc. PCA was also shown not to be useful in information-theoretic terms, at least if we do not know anything about the input data. All these results can be traced back to the fact that relative information loss treats every bit of information contained in the input RV X equally. In this chapter, we introduce the notion of relevant information loss: not all the information at the input of a system is important, but only the part that is statistically related to a relevant RV. After defining relevant information loss and discussing its properties in Sects. 5.1 and 5.2, we show that the problem of minimizing relevant information loss is related to the information bottleneck problem. With the help of relevant information loss we then justify PCA from an information-theoretic perspective in Sect. 5.3.
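The distinction drawn in the abstract can be illustrated with a small numerical sketch (this example and all names in it are illustrative, not taken from the chapter): for a relevant RV Y, the relevant information loss of a deterministic system g can be computed as I(Y;X) − I(Y;g(X)). Two systems that destroy the same total amount of information about X can differ completely in how much *relevant* information they lose.

```python
import math
from collections import Counter

def mutual_information(pairs):
    """I(A;B) in bits, estimated from a list of equally likely (a, b) pairs."""
    n = len(pairs)
    p_ab = Counter(pairs)
    p_a = Counter(a for a, _ in pairs)
    p_b = Counter(b for _, b in pairs)
    return sum((c / n) * math.log2((c / n) / ((p_a[a] / n) * (p_b[b] / n)))
               for (a, b), c in p_ab.items())

# X uniform on {0,1,2,3}; the relevant RV Y is the parity of X.
X = [0, 1, 2, 3]
Y = [x % 2 for x in X]

# Two deterministic systems that each destroy exactly one bit of H(X) = 2 bits:
g_high = [x // 2 for x in X]   # keeps the high bit, discards the parity
g_low  = [x % 2 for x in X]    # keeps the parity bit, discards the high bit

I_XY = mutual_information(list(zip(Y, X)))                    # I(Y;X) = 1 bit
loss_high = I_XY - mutual_information(list(zip(Y, g_high)))   # relevant loss of g_high
loss_low  = I_XY - mutual_information(list(zip(Y, g_low)))    # relevant loss of g_low
print(loss_high, loss_low)
```

Both systems halve the alphabet of X, yet g_high loses the full 1 bit of relevant information while g_low loses none — exactly the asymmetry that relative information loss, which weights every input bit equally, cannot capture.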
Keywords
information loss