OSIP: Tightened Bound Propagation for the Verification of ReLU Neural Networks

SOFTWARE ENGINEERING AND FORMAL METHODS (SEFM 2021)

Cited by 2 | Viewed 6
Abstract
Abstraction-based methods for the verification of ReLU-based neural networks suffer from a rapid degradation in effectiveness as the network's depth increases. We propose OSIP, an abstraction method based on symbolic interval propagation in which the choice of the ReLU relaxation at each node is determined via optimisation. We present an implementation of OSIP on top of Venus, a publicly available toolkit for the complete verification of neural networks. In the experiments reported, OSIP computed bounds that were tighter than the state-of-the-art on ReLU networks from the first competition for neural network verification. As a case study, we apply the method to the verification of local robustness properties of VGG16, a deep, high-dimensional model with approximately 300,000 nodes used for object classification in autonomous vehicles. We demonstrate that OSIP can verify the correctness of the model against perturbations larger than those that can be analysed with the present state-of-the-art.
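The abstract describes the method only at a high level. As a rough illustration of the underlying idea, the sketch below shows a DeepPoly/CROWN-style parametric ReLU relaxation propagated after an affine layer, where the per-node lower slope is the free parameter that a method like OSIP would select via optimisation. The function names, shapes, and the slope heuristic are illustrative assumptions and do not reflect the actual Venus/OSIP implementation.

```python
# Minimal sketch (assumed, not the Venus/OSIP code): parametric ReLU relaxation
# in the DeepPoly/CROWN style, where the lower slope per node is the degree of
# freedom that OSIP chooses via optimisation.
import numpy as np

def affine_bounds(W, b, lb, ub):
    """Concrete interval bounds of Wx + b for x in the box [lb, ub]."""
    W_pos, W_neg = np.clip(W, 0, None), np.clip(W, None, 0)
    return W_pos @ lb + W_neg @ ub + b, W_pos @ ub + W_neg @ lb + b

def relu_relaxation(lb, ub, lower_slope):
    """Sound linear bounds l_slope*x <= ReLU(x) <= u_slope*x + u_off.

    Stable nodes (ub <= 0 or lb >= 0) are exact.  For an unstable node
    (lb < 0 < ub) the upper bound is the chord through (lb, 0) and (ub, ub),
    and any lower slope in [0, 1] is sound; choosing it per node is the
    optimisation variable in an OSIP-like scheme.
    """
    u_slope = np.where(ub <= 0, 0.0, np.where(lb >= 0, 1.0, ub / (ub - lb)))
    u_off = np.where((lb < 0) & (ub > 0), -u_slope * lb, 0.0)
    l_slope = np.where(ub <= 0, 0.0, np.where(lb >= 0, 1.0, lower_slope))
    return l_slope, u_slope, u_off

# Example: one affine layer followed by ReLU, with a simple 0/1 slope heuristic
# (slope 1 when ub > |lb|); OSIP would instead optimise these slopes.
W = np.array([[1.0, -2.0], [0.5, 1.5]])
b = np.array([0.1, -0.2])
lb, ub = np.array([-1.0, -1.0]), np.array([1.0, 1.0])
pre_lb, pre_ub = affine_bounds(W, b, lb, ub)
slopes = (pre_ub > -pre_lb).astype(float)
print(relu_relaxation(pre_lb, pre_ub, slopes))
```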
Keywords
bound propagation,neural networks,verification