Quantization Effects on Neural Networks Perception: How Would Quantization Change the Perceptual Field of Vision Models?

CoRR (2024)

Abstract
Neural network quantization is a critical technique for deploying models on resource-limited devices. Despite its widespread use, the impact of quantization on model perceptual fields, particularly in relation to class activation maps (CAMs), remains underexplored. This study investigates how quantization influences the spatial recognition abilities of vision models by examining the alignment between CAMs and visual salient object maps across various architectures. Utilizing a dataset of 10,000 images from ImageNet, we conduct a comprehensive evaluation of six diverse CNN architectures: VGG16, ResNet50, EfficientNet, MobileNet, SqueezeNet, and DenseNet. Through the systematic application of quantization techniques, we identify subtle changes in CAMs and their alignment with salient object maps. Our results demonstrate the differing sensitivities of these architectures to quantization and highlight its implications for model performance and interpretability in real-world applications. This work primarily contributes to a deeper understanding of neural network quantization, offering insights essential for deploying efficient and interpretable models in practical settings.
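The abstract does not specify the quantization scheme, the CAM method, or the alignment metric used in the study. The sketch below is therefore only an illustrative reading of the described pipeline, assuming a PyTorch/torchvision workflow: simulated 8-bit weight quantization of a ResNet50, Grad-CAM as the CAM method, and IoU of thresholded maps as the alignment score. The helper names (`fake_quantize_weights`, `grad_cam`, `iou`) and the placeholder image/saliency tensors are hypothetical, not taken from the paper.

```python
# Illustrative sketch (assumptions, not the paper's exact protocol):
# simulate 8-bit weight quantization on a torchvision ResNet50, compute
# Grad-CAM heatmaps before and after, and score alignment with a
# salient-object map via IoU of thresholded maps.
import copy
import torch
import torch.nn.functional as F
from torchvision import models


def fake_quantize_weights(model, num_bits=8):
    """Copy of `model` with weights rounded to `num_bits` symmetric
    per-tensor levels (quantize-dequantize), kept in float precision."""
    q = copy.deepcopy(model)
    qmax = 2 ** (num_bits - 1) - 1
    with torch.no_grad():
        for p in q.parameters():
            scale = p.abs().max() / qmax + 1e-12
            p.copy_(torch.round(p / scale).clamp(-qmax, qmax) * scale)
    return q


def grad_cam(model, x, target_layer):
    """Grad-CAM heatmap for the top-1 predicted class of input x."""
    feats, grads = [], []
    h1 = target_layer.register_forward_hook(lambda m, i, o: feats.append(o))
    h2 = target_layer.register_full_backward_hook(
        lambda m, gi, go: grads.append(go[0]))
    logits = model(x)
    logits[0, logits[0].argmax()].backward()
    h1.remove(); h2.remove()
    # Channel weights = spatially averaged gradients; ReLU, upsample, normalize.
    w = grads[0].mean(dim=(2, 3), keepdim=True)
    cam = F.relu((w * feats[0]).sum(dim=1, keepdim=True))
    cam = F.interpolate(cam, size=x.shape[-2:], mode="bilinear",
                        align_corners=False).squeeze()
    return (cam - cam.min()) / (cam.max() - cam.min() + 1e-8)


def iou(a, b, thr=0.5):
    """Binarize both maps at `thr` and compute intersection-over-union."""
    a, b = a > thr, b > thr
    return (a & b).float().sum() / ((a | b).float().sum() + 1e-8)


model_fp32 = models.resnet50(weights=models.ResNet50_Weights.DEFAULT).eval()
model_q = fake_quantize_weights(model_fp32).eval()

x = torch.rand(1, 3, 224, 224)    # placeholder for a preprocessed ImageNet image
saliency = torch.rand(224, 224)   # placeholder for a salient-object map

cam_fp = grad_cam(model_fp32, x, model_fp32.layer4[-1])
cam_q = grad_cam(model_q, x, model_q.layer4[-1])
print(f"CAM/saliency IoU  fp32: {iou(cam_fp, saliency):.3f}  "
      f"int8-sim: {iou(cam_q, saliency):.3f}")
```

Comparing the two IoU scores per image, aggregated over a dataset, is one plausible way to quantify how quantization shifts a model's perceptual field relative to salient objects; the paper's actual metric and quantization settings may differ.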
Keywords
Perceptual Learning, Visual Perception, Neural Processing, Neuronal Adaptation