Decoder Reduction Approximation Scheme for Booth Multipliers

Muhammad Hamis Haider, Hao Zhang, Seok-Bum Ko

IEEE Transactions on Computers (2024)

Abstract
Existing approximate Booth multipliers fail to keep up with modern approximate multipliers such as truncation-based approximate logarithmic multipliers. This paper introduces a new approximation scheme for Booth multipliers that can operate with negligible error rates using only $N/4$ Booth decoders, instead of the traditional $N/2$ Booth decoders. The proposed 16-bit BD16.4 approximate Booth multiplier reduces the Normalized Mean Error Deviation (NMED) by 96.5% and the Power-Area-Product (PAP) by 69.6%, when compared to a state-of-the-art approximate logarithmic multiplier. Additionally, the proposed BD16.4 approximate multiplier reduces the NMED by 94.4% and PAP by 74.8%, when compared to a state-of-the-art higher-radix approximate Booth multiplier. The proposed 8-bit approximate Booth multipliers reduce the NMED by up to 74% and PAP by up to 5% when compared to the existing state-of-the-art approximate logarithmic multipliers. We validated the results derived in this paper through a neural network inference experiment, where the proposed approximate multipliers showed a negligible drop in inference accuracy compared to the exact Booth multipliers and the state-of-the-art approximate logarithmic multipliers (ALM). The proposed approximate multipliers achieved a Power-Delay-Product reduction of 63% (vs. exact) and 21.22% (vs. ALM) in 16-bit experiments and a reduction of 67% (vs. exact) and 8.75% (vs. ALM) in 8-bit experiments.
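For context, the $N/2$ baseline the abstract refers to comes from radix-4 Booth recoding: an exact multiplier recodes the $N$-bit multiplier into $N/2$ signed digits in {-2, -1, 0, 1, 2}, one per decoder. The Python sketch below illustrates only this conventional baseline and the standard NMED metric from the approximate-arithmetic literature; it does not reproduce the paper's $N/4$ decoder-reduction scheme, and all function names are illustrative.

    def booth_radix4_digits(y, n):
        """Recode an n-bit two's-complement multiplier y (n even) into
        n/2 signed digits in {-2,-1,0,1,2}: one Booth decoder per digit."""
        y &= (1 << n) - 1                                # n-bit pattern of y
        bits = [0] + [(y >> i) & 1 for i in range(n)]    # prepend y_{-1} = 0
        # Each overlapping window (y_{2i-1}, y_{2i}, y_{2i+1}) gives one digit.
        return [bits[i] + bits[i + 1] - 2 * bits[i + 2] for i in range(0, n, 2)]

    def booth_multiply(x, y, n):
        """Exact signed product built from the n/2 Booth partial products."""
        digits = booth_radix4_digits(y, n)
        return sum((d * x) << (2 * i) for i, d in enumerate(digits))  # weight 4^i

    def nmed(exact, approx):
        """Normalized Mean Error Distance: mean |error| divided by the largest
        exact product magnitude (the usual definition in this literature)."""
        errors = [abs(e - a) for e, a in zip(exact, approx)]
        return (sum(errors) / len(errors)) / max(abs(e) for e in exact)

For example, booth_multiply(x, y, 16) reproduces x*y for any 16-bit signed y using 8 partial products, which is exactly the 8-decoder baseline that the proposed BD16.4 design halves to 4 decoders.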
Keywords
Booth multipliers, approximate computing, convolutional neural networks, logarithmic multipliers, leading one detection