Pathwise concentration bounds for Bayesian beliefs

Theoretical Economics (2023)

Abstract
We show that Bayesian posteriors concentrate on the outcome distributions that approximately minimize the Kullback-Leibler divergence from the empirical distribution, uniformly over sample paths, even when the prior does not have full support. This generalizes Diaconis and Freedman's (1990) uniform convergence result to, e.g., priors that have finite support, are constrained by independence assumptions, or have a parametric form that cannot match some probability distributions. The concentration result lets us provide a rate of convergence for Berk's (1966) result on the limiting behavior of posterior beliefs when the prior is misspecified. We provide a bound on approximation errors in "anticipated-utility" models, and extend our analysis to outcomes that are perceived to follow a Markov process.
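As a rough illustration of the kind of concentration described above (not taken from the paper), the following Python sketch simulates a misspecified, finite-support prior over Bernoulli models: the true success probability lies outside the prior's support, and the posterior piles onto the candidate that minimizes the Kullback-Leibler divergence from the true distribution, in the spirit of Berk's (1966) result. All parameter values and names are assumptions chosen for the demo.

```python
# Illustrative sketch (assumption-based, not the paper's construction):
# Bayesian updating over a finite set of Bernoulli models that excludes the truth.
import numpy as np

rng = np.random.default_rng(0)

true_p = 0.62                             # true Bernoulli success probability
candidates = np.array([0.3, 0.5, 0.7])    # prior support; note 0.62 is excluded
prior = np.full(len(candidates), 1.0 / len(candidates))

def kl_bernoulli(p, q):
    """KL(p || q) between Bernoulli(p) and Bernoulli(q)."""
    return p * np.log(p / q) + (1 - p) * np.log((1 - p) / (1 - q))

n = 5000
x = rng.binomial(1, true_p, size=n)

# Log-posterior over the candidates after observing x (up to an additive constant).
log_post = (np.log(prior)
            + x.sum() * np.log(candidates)
            + (n - x.sum()) * np.log(1 - candidates))
post = np.exp(log_post - log_post.max())
post /= post.sum()

kl = kl_bernoulli(true_p, candidates)
print("candidates:       ", candidates)
print("KL(true || cand): ", np.round(kl, 4))
print("posterior:        ", np.round(post, 4))
# The posterior mass concentrates on 0.7, the KL-minimizing candidate.
```

Running the sketch with a large sample shows essentially all posterior mass on the candidate 0.7, which attains the smallest KL divergence from the true distribution among the supported models.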
Keywords
Misspecified learning, Bayesian consistency, C11, D81