Quantized Reservoir Computing for Spectrum Sensing With Knowledge Distillation

IEEE Transactions on Cognitive and Developmental Systems (2023)

Abstract
Quantization has been widely used to compress machine learning models for deployment on field-programmable gate arrays (FPGAs). However, quantization often degrades a model's accuracy. In this work, we introduce a quantization approach that reduces the computation and storage resource consumption of a model with minimal accuracy loss. Spectrum sensing is a technique for identifying idle and busy frequency bands in cognitive radio. The occupancy of each band is temporally correlated with previous and future time slots, so a recurrent neural network (RNN) is well suited for spectrum sensing. Reservoir computing (RC) is a computation framework derived from the theory of RNNs. It is a better choice than an RNN for spectrum sensing on FPGAs because it is easier to train and requires fewer computation resources. We apply our quantization approach to the RC to reduce its resource consumption on the FPGA. A knowledge distillation (KD) scheme called teacher-student mutual learning (TSML) is proposed for the quantized RC to minimize quantization errors. The TSML resolves the mismatched-capacity issue of conventional KD and enables KD on small data sets. On the spectrum-sensing data set, the quantized RC trained with the TSML achieves comparable accuracy while reducing the utilization of digital signal processing (DSP) blocks, flip-flops (FFs), and lookup tables (LUTs) by 53%, 40%, and 35%, respectively, compared to the RNN. The inference speed of the quantized RC is 2.4 times faster. The TSML improves the accuracy of the quantized RC by 2.39%, outperforming conventional KD.
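As a rough illustration (not the authors' implementation), the sketch below shows an echo state network, a standard form of reservoir computing, with uniformly quantized weights and a simple one-directional distillation step from a full-precision teacher readout to a quantized student readout. All names, sizes, the quantization scheme, and hyperparameters here are assumptions, and the mutual (bidirectional) update that distinguishes TSML from conventional KD is omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

def quantize(w, bits=8):
    # Uniform symmetric fake-quantization to `bits` bits (assumed scheme).
    scale = np.max(np.abs(w)) / (2 ** (bits - 1) - 1)
    return np.round(w / scale) * scale

# Reservoir: fixed random input and recurrent weights (echo state network).
n_in, n_res, n_out = 4, 100, 2
W_in = quantize(rng.uniform(-0.5, 0.5, (n_res, n_in)))
W_res = rng.uniform(-0.5, 0.5, (n_res, n_res))
W_res *= 0.9 / np.max(np.abs(np.linalg.eigvals(W_res)))  # echo state property
W_res = quantize(W_res)

def reservoir_states(U):
    # Run the reservoir over an input sequence U of shape (T, n_in).
    x = np.zeros(n_res)
    states = []
    for u in U:
        x = np.tanh(W_in @ u + W_res @ x)
        states.append(x.copy())
    return np.array(states)

# Toy stand-in for spectrum-sensing data: random inputs, idle/busy labels.
U = rng.uniform(-1, 1, (200, n_in))
Y = rng.integers(0, n_out, 200)
X = reservoir_states(U)
T = np.eye(n_out)[Y]  # one-hot targets

# Only the linear readout is trained (ridge regression), which is why
# RC is cheaper to train than a full RNN.
reg = 1e-2 * np.eye(n_res)
W_teacher = np.linalg.solve(X.T @ X + reg, X.T @ T)  # full-precision teacher

# One-directional distillation (assumed form, simpler than the paper's
# TSML): the quantized student readout fits a blend of hard labels and
# the teacher's soft outputs.
alpha = 0.5
T_kd = alpha * T + (1 - alpha) * (X @ W_teacher)
W_student = quantize(np.linalg.solve(X.T @ X + reg, X.T @ T_kd))

pred = np.argmax(X @ W_student, axis=1)
print("train accuracy of quantized student:", np.mean(pred == Y))
```

In this assumed setup, quantization touches only the fixed reservoir and the trained readout, so the expensive recurrent part maps to low-precision FPGA arithmetic while training still reduces to a single linear solve.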
Keywords
Cognitive radio, knowledge distillation (KD), model compression, quantization, reservoir computing (RC), spectrum sensing