Mismatched estimation and relative entropy in vector Gaussian channels

ISIT 2013

Cited by 8
Abstract
We derive a novel relation between mismatched estimation and relative entropy (KL divergence) in vector Gaussian channels under the mean squared estimation criterion. This relation includes as special cases several previous results connecting estimation theory and information theory. A direct proof is provided, together with a verification using Gaussian inputs. An interesting relationship between the KL divergence and Fisher divergence is derived as a direct consequence of our work. The relations established here are potentially useful for inference in graphical models and the design of information systems.
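One of the earlier scalar results that this work generalizes is Verdú's identity relating relative entropy to the integrated mismatched-MSE gap in a Gaussian channel: D(P||Q) = (1/2) ∫₀^∞ [mse_Q(snr) − mse_P(snr)] d snr, where mse_Q is the MSE incurred when the estimator assumes the prior Q while the true input follows P. The sketch below is not from the paper; it is a minimal numerical check of that scalar identity for Gaussian inputs, with the variances sp2 and sq2 chosen arbitrarily for illustration.

```python
import numpy as np

# Scalar Gaussian channel: Y = sqrt(snr) * X + N, with N ~ N(0, 1).
# True input prior P = N(0, sp2); mismatched prior Q = N(0, sq2).
sp2, sq2 = 1.0, 2.0  # illustrative variances (assumed values)

def mse_matched(snr):
    # MMSE of X given Y when the estimator uses the true prior P.
    return sp2 / (1.0 + snr * sp2)

def mse_mismatched(snr):
    # MSE under P when the conditional-mean estimator assumes prior Q.
    # E_Q[X|Y] = (sqrt(snr) * sq2 / (1 + snr * sq2)) * Y, so with
    # b = snr * sq2 / (1 + snr * sq2) the error is (1-b) X - (b/sqrt(snr)) N.
    b = snr * sq2 / (1.0 + snr * sq2)
    return (1.0 - b) ** 2 * sp2 + b ** 2 / snr

# Closed-form KL divergence between the two zero-mean Gaussians.
kl_closed = 0.5 * (sp2 / sq2 - 1.0 - np.log(sp2 / sq2))

# Trapezoidal quadrature of the MSE gap over snr on a log-spaced grid;
# the integrand decays like 1/snr^2, so truncating at snr = 1e6 is safe.
snr = np.logspace(-6, 6, 200001)
gap = mse_mismatched(snr) - mse_matched(snr)
kl_integral = 0.5 * np.sum(0.5 * (gap[1:] + gap[:-1]) * np.diff(snr))

print(kl_closed, kl_integral)  # the two values should agree closely
```

The two printed numbers should match to several decimal places, illustrating the scalar special case; the paper's contribution is a vector-channel generalization of relations of this kind.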
Keywords
kl divergence,gaussian channels,relative entropy,information system,mismatched estimation,graphical model,fisher divergence,estimation theory,least mean squares methods,information theory,vector gaussian channel,mean squared estimation,mathematical model,entropy,mutual information,vectors,estimation