Out-of-Distribution Detection for Deep Neural Networks with Isolation Forest and Local Outlier Factor

IEEE Access (2021)

Cited by 14 | Views 22
Abstract
Deep Neural Networks (DNNs) are extensively deployed in today's safety-critical autonomous systems thanks to their excellent performance. However, they are known to make mistakes unpredictably; for example, a DNN used for perception may misclassify an object, and one used for planning and control may issue unsafe control commands. One common cause of such unpredictable mistakes is Out-of-Distribution (OOD) input samples, i.e., samples that fall outside the distribution of the training dataset. We present a framework for OOD detection that applies outlier detection to the activations of one or more hidden layers of a DNN, using a runtime monitor based on either Isolation Forest (IF) or Local Outlier Factor (LOF). Performance evaluation indicates that LOF is a promising method in terms of both the machine-learning metrics of precision, recall, F1 score, and accuracy, and computational efficiency during testing.
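The paper's own pipeline is not reproduced here, but a minimal sketch of the general approach it describes, using scikit-learn's IsolationForest and LocalOutlierFactor as runtime monitors, might look as follows. The hidden-layer features are stubbed with synthetic data for illustration; in practice they would be activations extracted from a chosen layer of the trained DNN, and the layer choice, feature dimensionality, and hyperparameters shown are assumptions, not the authors' settings.

```python
# Sketch: monitor hidden-layer activations of a trained DNN with
# Isolation Forest or LOF to flag Out-of-Distribution (OOD) inputs.
# Feature extraction is stubbed with synthetic data for illustration.
import numpy as np
from sklearn.ensemble import IsolationForest
from sklearn.neighbors import LocalOutlierFactor

rng = np.random.default_rng(0)

# Stand-ins for hidden-layer activations (e.g., penultimate-layer outputs).
train_feats = rng.normal(0.0, 1.0, size=(1000, 64))  # in-distribution training set
test_in     = rng.normal(0.0, 1.0, size=(200, 64))   # in-distribution test samples
test_ood    = rng.normal(4.0, 1.0, size=(200, 64))   # shifted distribution -> OOD

# Fit one monitor per monitored hidden layer on the training activations.
iforest = IsolationForest(n_estimators=100, random_state=0).fit(train_feats)
lof = LocalOutlierFactor(n_neighbors=20, novelty=True).fit(train_feats)

for name, monitor in [("IsolationForest", iforest), ("LOF", lof)]:
    # predict() returns +1 for inliers and -1 for outliers (flagged as OOD).
    false_alarm = np.mean(monitor.predict(test_in) == -1)
    detected    = np.mean(monitor.predict(test_ood) == -1)
    print(f"{name}: false-alarm rate {false_alarm:.2f}, OOD detection rate {detected:.2f}")
```

Note that LocalOutlierFactor must be constructed with novelty=True to expose predict() for previously unseen samples, which matches the runtime-monitoring setting where test inputs arrive after the monitor is fitted.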
Keywords
Runtime, Monitoring, Uncertainty, Training, Safety, Feature extraction, Neurons, Out-of-distribution, deep neural networks, runtime monitoring, outlier detection, isolation forest, local outlier factor