Universal Physical Camouflage Attacks on Object Detectors

arXiv (2020)

Cited by 174 | Views 255
Abstract
In this paper, we study physical adversarial attacks on object detectors in the wild. Previous works mostly craft instance-dependent perturbations for rigid or planar objects only. In contrast, we propose to learn an adversarial pattern that effectively attacks all instances belonging to the same object category, referred to as the Universal Physical Camouflage Attack (UPC). Concretely, UPC crafts camouflage by jointly fooling the region proposal network and misleading the classifier and the regressor into producing erroneous outputs. To make UPC effective for non-rigid or non-planar objects, we introduce a set of transformations that mimic deformable properties. We additionally impose an optimization constraint so that the generated patterns look natural to human observers. To fairly evaluate the effectiveness of different physical-world attacks, we present the first standardized virtual database, AttackScenes, which simulates the real 3D world in a controllable and reproducible environment. Extensive experiments demonstrate the superiority of our proposed UPC over existing physical adversarial attacks, not only in virtual environments (AttackScenes) but also in real-world physical environments.
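The abstract describes optimizing one camouflage pattern against several detector heads at once, averaging the attack loss over sampled deformations so the pattern transfers to non-rigid objects, with a naturalness constraint on the pattern itself. A minimal, purely illustrative sketch of that objective structure is below; the surrogate losses (`rpn_loss`, `cls_loss`, `reg_loss`), the shift-based deformation, and the total-variation penalty are stand-ins for illustration, not the paper's actual losses or transform set:

```python
import random

def total_variation(pattern):
    """Smoothness penalty: a common proxy for keeping a pattern natural-looking."""
    h, w = len(pattern), len(pattern[0])
    tv = 0.0
    for i in range(h):
        for j in range(w):
            if i + 1 < h:
                tv += abs(pattern[i + 1][j] - pattern[i][j])
            if j + 1 < w:
                tv += abs(pattern[i][j + 1] - pattern[i][j])
    return tv

def random_shift(pattern, rng):
    """Stand-in for the paper's deformation transforms: a random cyclic row shift."""
    k = rng.randrange(len(pattern))
    return pattern[k:] + pattern[:k]

def upc_objective(pattern, rng, n_transforms=4, tv_weight=0.1):
    """Average a joint surrogate loss over sampled transforms (expectation-over-
    transformation style), mirroring how UPC attacks RPN, classifier, and
    regressor simultaneously. All three loss terms here are placeholders."""
    total = 0.0
    for _ in range(n_transforms):
        t = random_shift(pattern, rng)
        flat = [v for row in t for v in row]
        n = len(flat)
        rpn_loss = sum(flat) / n                 # placeholder: suppress RPN objectness
        cls_loss = sum(v * v for v in flat) / n  # placeholder: mislead the classifier
        reg_loss = sum(abs(v) for v in flat) / n # placeholder: perturb box regression
        total += rpn_loss + cls_loss + reg_loss
    return total / n_transforms + tv_weight * total_variation(pattern)
```

In the real attack, such an objective would be minimized over the pattern pixels with gradients from the detector; the point of the sketch is only the structure: joint multi-head loss, averaged over deformations, plus a naturalness regularizer.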
Keywords
optimization constraint, classifier, AttackScenes, standardized virtual database, deformable properties, universal physical camouflage attacks, universal physical camouflage attack, UPC crafts, physical adversarial attackers, real-world physical environments, physical-world attacks, non-planar objects, region proposal network, object category, adversarial pattern, planar objects, rigid objects, instance-dependent perturbations, physical adversarial attacks, object detectors