State-of-the-art optical-based physical adversarial attacks for deep learning computer vision systems

Expert Systems with Applications (2024)

Abstract
Adversarial attacks can mislead deep learning models into making false predictions by implanting small perturbations, imperceptible to the human eye, into the original input, which poses a serious security threat to computer vision systems based on deep learning. Physical adversarial attacks are more realistic than digital adversarial attacks, because the perturbation is introduced to the input before it is captured and converted into an image inside the vision system. In this paper, we focus on physical adversarial attacks and further classify them into invasive and non-invasive attacks. Optical-based physical adversarial attack techniques (e.g., using light irradiation) belong to the non-invasive category. Their perturbations are easily overlooked by humans because they closely resemble effects produced by the natural environment in the real world. With high invisibility and executability, optical-based physical adversarial attacks can pose a significant or even lethal threat to real systems. This paper focuses on optical-based physical adversarial attack techniques for computer vision systems, introducing and discussing the existing techniques in detail.
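To make the two attack surfaces contrasted above concrete, the following is a minimal, hypothetical sketch in PyTorch: fgsm_perturb crafts a classic digital (pixel-space) perturbation via the fast gradient sign method, while light_spot_perturb simulates an optical-style, non-invasive perturbation by compositing a Gaussian light spot onto the image, roughly as a physical light source might. The function names and parameters are illustrative assumptions, not methods proposed in the paper.

import torch
import torch.nn.functional as F

def fgsm_perturb(model, image, label, epsilon=0.03):
    # Digital attack (illustrative): step each pixel in the direction
    # that increases the classification loss, bounded by epsilon so the
    # change stays near-imperceptible to the human eye.
    # image: (N, C, H, W) tensor in [0, 1]; label: (N,) class indices.
    image = image.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(image), label)
    loss.backward()
    adv = image + epsilon * image.grad.sign()
    return adv.clamp(0.0, 1.0).detach()

def light_spot_perturb(image, center, radius, intensity=0.5):
    # Optical-style perturbation (illustrative): additively composite a
    # Gaussian bright spot, mimicking light irradiation that hits the
    # scene before the camera captures and digitizes it.
    # image: (C, H, W) tensor in [0, 1]; center: (row, col) in pixels.
    _, h, w = image.shape
    ys, xs = torch.meshgrid(
        torch.arange(h, dtype=torch.float32),
        torch.arange(w, dtype=torch.float32),
        indexing="ij",
    )
    dist2 = (ys - center[0]) ** 2 + (xs - center[1]) ** 2
    spot = intensity * torch.exp(-dist2 / (2.0 * radius ** 2))
    return (image + spot).clamp(0.0, 1.0)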
Keywords
Adversarial attacks, Deep learning, Security threat, Saliency detection, Optical-based physical adversarial attack