Channel Pruning via Attention Module and Memory Curve

Hufei Li, Jian Cao, Xiangcheng Liu, Jue Chen, Jingjie Shang, Yu Qian, Yuan Wang

2023 IEEE International Conference on Image Processing (ICIP), 2023

Abstract
As an effective pruning method, dynamic pruning introduces gate modules that allow different input data to select different channels, showing that the choice of channels depends strongly on the data. However, the gate modules impose an additional computational burden. In this paper, we propose a simple, efficient, and transferable channel pruning method via attention module and memory curve, dubbed CPAM, which exploits the strong correlation between data and channels while imposing no additional computational burden on the model. Inspired by the memory curve, we use a progressive method without any sparse operation. Moreover, our method has been demonstrated effective for many advanced CNN architectures. Notably, on CIFAR-10, CPAM reduces FLOPs of ResNet-56 by 50% with a 0.31% relative accuracy improvement, advancing the state of the art.
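To make the idea concrete, the sketch below shows the general pattern the abstract describes: an attention module scores channels from the data, and a progressive schedule (loosely inspired by a memory/forgetting curve) decides how many low-scoring channels to drop over training. This is a minimal illustration under assumed choices (an SE-style attention block, an exponential schedule, and the helper names used here), not the authors' CPAM implementation.

```python
# Illustrative sketch only: SE-style channel attention plus a progressive
# pruning schedule. The module names, schedule, and thresholds are assumptions,
# not the paper's actual CPAM code.
import math
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """SE-style attention that yields a per-channel importance score."""

    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor):
        # Global average pooling -> (N, C) -> per-channel gate in [0, 1].
        scores = self.fc(x.mean(dim=(2, 3)))
        # Reweight the features; the batch-mean score serves as a
        # channel-importance statistic for pruning decisions.
        return x * scores.unsqueeze(-1).unsqueeze(-1), scores.mean(dim=0)


def progressive_prune_ratio(step: int, total_steps: int, target: float = 0.5) -> float:
    """Assumed memory-curve-like schedule: prune gradually, approaching the
    target ratio exponentially (illustrative only)."""
    progress = step / max(total_steps, 1)
    return target * (1.0 - math.exp(-5.0 * progress))


def channels_to_prune(importance: torch.Tensor, ratio: float) -> torch.Tensor:
    """Indices of the lowest-importance channels for the given prune ratio."""
    num_prune = int(ratio * importance.numel())
    if num_prune == 0:
        return torch.empty(0, dtype=torch.long)
    return torch.argsort(importance)[:num_prune]


if __name__ == "__main__":
    attn = ChannelAttention(channels=64)
    x = torch.randn(8, 64, 32, 32)
    _, importance = attn(x)
    for step in (0, 50, 100):
        ratio = progressive_prune_ratio(step, total_steps=100)
        idx = channels_to_prune(importance.detach(), ratio)
        print(f"step {step:3d}: ratio {ratio:.2f}, prune {idx.numel()} channels")
```

In this sketch the attention scores double as pruning criteria, so no extra gate network is needed at inference, which mirrors the abstract's claim of adding no computational burden; the exact schedule and scoring used by CPAM are described in the paper itself.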
Keywords
dynamic pruning, attention module, sparse operations, memory curve