Accurate yet Efficient Stochastic Computing Neural Acceleration with High Precision Residual Fusion

2023 Design, Automation & Test in Europe Conference & Exhibition (DATE 2023)

Abstract
Stochastic computing (SC) has emerged as a fault-tolerant and area-efficient computing paradigm for neural acceleration. However, existing SC accelerators suffer from an intrinsic trade-off between inference accuracy and efficiency: accurate SC requires high precision computation, which incurs an exponential increase in bitstream length and inference latency. In this paper, we identify the high precision residual as a key remedy and propose to combine a low precision datapath with a high precision residual to improve inference accuracy with minimal efficiency overhead. We also propose to fuse batch normalization with the activation function to further improve inference efficiency. The effectiveness of our proposed method is verified on a recently proposed SC accelerator. With extensive results, we show that our proposed SC-friendly network achieves a 9.43% accuracy improvement over the baseline low precision networks with only a 1.3% increase in area-delay product (ADP). We further show a 3.01x ADP reduction compared to the baseline SC accelerator at nearly iso-accuracy.
Keywords
Stochastic computing, neural acceleration, low precision datapath, high precision residual
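The abstract's core trade-off can be illustrated with a small sketch: in unipolar stochastic computing a value in [0, 1] is encoded as a random bitstream whose probability of 1 equals the value, so n-bit precision needs a 2^n-bit stream (the exponential latency cost), and a value can be split into a coarse low-precision part plus a residual that carries the lost precision. This is a minimal conceptual illustration only, assuming unipolar encoding and simple fixed-point quantization; it is not the paper's accelerator datapath, and the function names are hypothetical.

```python
import random

def to_bitstream(x, length, rng):
    # Unipolar SC encoding: each bit is 1 with probability x, for x in [0, 1].
    return [1 if rng.random() < x else 0 for _ in range(length)]

def from_bitstream(bits):
    # Decode by counting ones: the mean estimates the encoded value.
    return sum(bits) / len(bits)

def sc_multiply(a_bits, b_bits):
    # Bitwise AND of independent streams multiplies the encoded probabilities.
    return [a & b for a, b in zip(a_bits, b_bits)]

# n bits of precision require a 2**n-long stream: doubling precision
# doubles latency repeatedly, which is the exponential cost the paper targets.
rng = random.Random(0)
x, y = 0.75, 0.5
for n in (4, 8):
    length = 2 ** n
    est = from_bitstream(sc_multiply(to_bitstream(x, length, rng),
                                     to_bitstream(y, length, rng)))
    print(f"precision={n} bits, stream length={length}, x*y estimate={est:.4f}")

def split_low_residual(x, low_bits=4):
    # Hypothetical illustration of the low-precision + residual idea:
    # quantize x to low_bits fixed-point bits and keep the remainder.
    scale = 2 ** low_bits
    x_low = int(x * scale) / scale   # coarse, cheap-to-compute part
    return x_low, x - x_low          # residual recovers the lost precision

x_low, r = split_low_residual(0.6789)
print(f"low={x_low}, residual={r:.6f}, sum={x_low + r}")
```

The SC estimates are noisy but land near the true product 0.375 as the stream lengthens, while the split shows that the coarse part plus the residual reconstructs the original value exactly.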