Residue Number System-Based Solution for Reducing the Hardware Cost of a Convolutional Neural Network.
Neurocomputing (2020)
Abstract
Convolutional neural networks (CNNs) are deep learning architectures currently used in a wide range of applications, including computer vision, speech recognition, time-series analysis in finance, and many others. At the same time, CNNs are very demanding in terms of the hardware and time cost of a computing system, which considerably restricts their practical use, e.g., in embedded systems, real-time systems, and mobile devices. The goal of this paper is to reduce the resources required to build and operate CNNs. To achieve this goal, a CNN architecture based on the Residue Number System (RNS) and the new Chinese Remainder Theorem with fractions is proposed. The new architecture provides an efficient solution to the main problem of RNS, restoring a number from its residues, which is the dominant contributor to the hardware cost of the RNS-based CNN structure. According to the results of hardware simulation on a Kintex7 xc7k70tfbg484-2 FPGA, the use of RNS in the convolutional layer of a neural network reduces the hardware cost by 32.6% compared to the traditional approach based on the binary number system. In addition, the proposed hardware-software architecture reduces the average image recognition time by 37.06% compared to a software implementation.
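As a conceptual illustration of the forward and reverse RNS conversions mentioned in the abstract, the sketch below represents an integer by its residues and restores it with the classical Chinese Remainder Theorem. Note that the paper's "new Chinese Remainder Theorem with fractions" is a different, hardware-oriented reconstruction, and the moduli set used here is an arbitrary pairwise-coprime example rather than the set from the paper.

```python
# Minimal sketch of RNS forward/reverse conversion using the classical CRT.
# This is NOT the paper's "CRT with fractions" method; it only illustrates
# the general idea of residue representation and reconstruction.
from math import prod

MODULI = (251, 253, 255, 256)  # arbitrary pairwise-coprime moduli (assumption)

def to_rns(x, moduli=MODULI):
    """Forward conversion: represent x by its residues modulo each modulus."""
    return tuple(x % m for m in moduli)

def from_rns(residues, moduli=MODULI):
    """Reverse conversion (classical CRT): restore x from its residues."""
    M = prod(moduli)
    x = 0
    for r, m in zip(residues, moduli):
        Mi = M // m
        x += r * Mi * pow(Mi, -1, m)  # pow(Mi, -1, m) is the modular inverse
    return x % M

if __name__ == "__main__":
    a, b = 12345, 6789
    # Multiplication (and addition) can be performed independently on each
    # residue channel, which is what makes RNS attractive for parallel hardware
    # such as the convolutional layer discussed in the paper.
    prod_rns = tuple((ra * rb) % m for ra, rb, m in zip(to_rns(a), to_rns(b), MODULI))
    assert from_rns(prod_rns) == (a * b) % prod(MODULI)
    print(to_rns(a), "->", from_rns(to_rns(a)))
```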
Keywords
Image recognition, Convolutional neural networks, Residue number system, Quantization noise, FPGA