SYNTHNET: A High-throughput Yet Energy-efficient Combinational Logic Neural Network

2022 27th Asia and South Pacific Design Automation Conference (ASP-DAC)

Abstract
In combinational logic neural networks (CLNNs), neurons are realized as combinational logic circuits or look-up tables (LUTs). They make extremely low-latency inference possible by performing the computation entirely in hardware, without loading weights from memory. The high throughput, however, is delivered by massively parallel logic circuits or LUTs and therefore comes with high area occupancy and high energy consumption. We present SYNTHNET, a novel CLNN design method that identifies and keeps only the sublogics that play a critical role in accuracy and removes those that do not contribute to it. It captures the abundant redundancy in NNs that can be exploited only in CLNNs, and thereby dramatically reduces the energy consumption of CLNNs with minimal accuracy degradation. We prove the efficacy of SYNTHNET on the CIFAR-10 dataset, maintaining competitive accuracy while replacing layers of a VGG-style network, which traditionally use memory-based floating-point operations, with combinational logic. Experimental results suggest our design can reduce the energy consumption of CLNNs by more than 90% compared to the state-of-the-art design.
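The core idea of the abstract, keeping only the sublogics that matter for accuracy and removing the rest, can be illustrated with a minimal sketch. The code below is not the SYNTHNET algorithm from the paper; it is a hypothetical greedy pruning loop over toy Boolean sublogics, in which the names (evaluate_accuracy, prune_sublogics), the majority-vote decision rule, and the accuracy tolerance are all assumptions made for illustration only.

import random

def evaluate_accuracy(sublogics, dataset):
    # Placeholder accuracy estimate (assumed, not from the paper):
    # majority vote over the outputs of the active Boolean sublogics.
    correct = 0
    for bits, label in dataset:
        votes = sum(f(bits) for f in sublogics)
        pred = 1 if votes * 2 >= len(sublogics) else 0
        correct += int(pred == label)
    return correct / len(dataset)

def prune_sublogics(sublogics, dataset, tolerance=0.01):
    # Greedily drop sublogics whose removal keeps the estimated accuracy
    # within `tolerance` of the original baseline (an assumed criterion).
    kept = list(sublogics)
    baseline = evaluate_accuracy(kept, dataset)
    for f in list(kept):
        if len(kept) == 1:
            break
        trial = [g for g in kept if g is not f]
        if evaluate_accuracy(trial, dataset) >= baseline - tolerance:
            kept = trial  # this sublogic is redundant: remove it
    return kept

if __name__ == "__main__":
    random.seed(0)
    # Toy "sublogics": Boolean functions over a 4-bit input vector.
    sublogics = [
        lambda b: b[0] & b[1],
        lambda b: b[0] | b[2],
        lambda b: b[1] ^ b[3],
        lambda b: b[0],  # likely redundant with the others
    ]
    dataset = [([random.randint(0, 1) for _ in range(4)],
                random.randint(0, 1)) for _ in range(64)]
    kept = prune_sublogics(sublogics, dataset)
    print(f"kept {len(kept)} of {len(sublogics)} sublogics")

In an actual CLNN flow the evaluation would run on synthesized LUT netlists and the selection criterion would be the one the paper defines; the sketch only mirrors the keep/remove structure described in the abstract.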
Key words
Energy consumption, Neurons, Redundancy, Artificial neural networks, Throughput, Energy efficiency, Systolic arrays