Abnormality detection based on the Kullback–Leibler divergence for generalized Gaussian data

Control Engineering Practice (2019)

Abstract
This paper is concerned with abnormality detection, where the observed data under the normal condition are assumed to be independent and identically distributed (i.i.d.) and to follow a generalized Gaussian distribution (GGD) with shape parameter greater than 1. The Kullback–Leibler divergence (KLD) between the GGD estimated from the observed data and the nominal GGD is used as the test statistic. An analytical expression of the KLD under the normal condition is derived for a large number of samples; based on this, two detection algorithms, one with a constant threshold and one with an adaptive threshold, are proposed. Extensive simulated and industrial case studies verify the analytical results and demonstrate the effectiveness of the proposed algorithms.
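To make the test statistic concrete, the following is a minimal sketch of KLD-based detection for GGD data. It assumes the standard closed-form KLD between two zero-mean GGDs (in the Do–Vetterli parameterization) and uses SciPy's `gennorm` maximum-likelihood fit for parameter estimation; the estimator, window handling, and the fixed threshold `threshold` are illustrative assumptions, not the paper's exact algorithms or threshold design.

```python
import numpy as np
from scipy.stats import gennorm
from scipy.special import gammaln


def ggd_kld(alpha1, beta1, alpha2, beta2):
    """Closed-form KLD between two zero-mean GGDs with pdf
    f(x) = beta / (2 * alpha * Gamma(1/beta)) * exp(-(|x|/alpha)**beta)."""
    return (np.log((beta1 * alpha2) / (beta2 * alpha1))
            + gammaln(1.0 / beta2) - gammaln(1.0 / beta1)
            + (alpha1 / alpha2) ** beta2
              * np.exp(gammaln((beta2 + 1.0) / beta1) - gammaln(1.0 / beta1))
            - 1.0 / beta1)


def detect_abnormality(window, alpha0, beta0, threshold):
    """Fit a GGD to the observation window and flag an abnormality when the
    KLD from the fitted GGD to the nominal GGD (alpha0, beta0) exceeds the
    (assumed, user-chosen) threshold."""
    # gennorm.fit returns (shape beta, location, scale alpha); location fixed at 0
    beta_hat, _, alpha_hat = gennorm.fit(np.asarray(window), floc=0.0)
    kld = ggd_kld(alpha_hat, beta_hat, alpha0, beta0)
    return kld > threshold, kld


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Nominal data: GGD with shape 2 (Gaussian) and unit scale
    normal_window = gennorm.rvs(2.0, scale=1.0, size=500, random_state=rng)
    # Abnormal data: inflated scale
    faulty_window = gennorm.rvs(2.0, scale=1.8, size=500, random_state=rng)
    print(detect_abnormality(normal_window, alpha0=1.0, beta0=2.0, threshold=0.05))
    print(detect_abnormality(faulty_window, alpha0=1.0, beta0=2.0, threshold=0.05))
```

In practice the threshold would be set from the distribution of the KLD statistic under the normal condition (the paper derives an analytical large-sample expression for this and also proposes an adaptive version); the constant `0.05` above is only a placeholder for demonstration.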
Keywords
Abnormality detection,Kullback–Leibler divergence,Generalized Gaussian distribution,Constant threshold,Adaptive threshold