Scalable and Modular Robustness Analysis of Deep Neural Networks

Programming Languages and Systems, APLAS 2021 (2021)

Abstract
As neural networks grow deeper and larger, scalable neural network analyzers are urgently needed. The main technical insight of our method is to analyze neural networks modularly, by segmenting a network into blocks and conducting the analysis for each block. In particular, we propose a network block summarization technique that captures the behavior within a network block using a block summary, and we leverage the summary to speed up the analysis. We instantiate our method on the CPU version of the state-of-the-art analyzer DeepPoly and name our system Bounded-Block Poly (BBPoly). We evaluate BBPoly extensively under various experimental settings. The results indicate that our method yields precision comparable to DeepPoly while running faster and requiring fewer computational resources. Notably, BBPoly can analyze very large neural networks such as SkipNet or ResNet, which contain up to one million neurons, in about one hour per input image, whereas DeepPoly can take up to 40 hours to analyze a single image.
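To illustrate the modular, block-by-block style of analysis the abstract describes, the sketch below propagates simple interval bounds through a network one block at a time, where each block's output bounds serve as its "summary" for the next block. This is only a hedged illustration: the actual BBPoly/DeepPoly systems use a much more precise polyhedral abstract domain, and the function names here (`summarize_block`, `analyze`) are hypothetical, not the authors' API.

```python
import numpy as np

def affine_bounds(lo, hi, W, b):
    # Sound interval propagation through y = W x + b:
    # positive weights take the lower/upper bound respectively,
    # negative weights swap them.
    Wp, Wn = np.maximum(W, 0.0), np.minimum(W, 0.0)
    return Wp @ lo + Wn @ hi + b, Wp @ hi + Wn @ lo + b

def relu_bounds(lo, hi):
    # ReLU is monotone, so bounds pass through elementwise.
    return np.maximum(lo, 0.0), np.maximum(hi, 0.0)

def summarize_block(layers, lo, hi):
    # Analyze one block (a list of (W, b) affine layers, each
    # followed by ReLU) and return output bounds: the block summary.
    for W, b in layers:
        lo, hi = affine_bounds(lo, hi, W, b)
        lo, hi = relu_bounds(lo, hi)
    return lo, hi

def analyze(blocks, lo, hi):
    # Modular analysis: summarize each block independently and feed
    # its summary to the next block, instead of one monolithic pass.
    for block in blocks:
        lo, hi = summarize_block(block, lo, hi)
    return lo, hi
```

For example, a single-block network with one affine+ReLU layer over the input box [-1, 1]^2 yields output bounds that soundly enclose every concrete output; the speed-up in the paper comes from reusing such summaries rather than re-analyzing whole deep networks end to end.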
Keywords
Abstract interpretation, Formal verification, Neural nets