Spike time displacement-based error backpropagation in convolutional spiking neural networks

arXiv (2023)

Abstract
In this paper, we introduce a supervised learning algorithm for training deep convolutional spiking neural networks (SNNs) with single-spike-based temporal coding that avoids backward recursive gradient computation. The algorithm employs a linear approximation to compute the derivative of the spike latency with respect to the membrane potential, and it uses spiking neurons with a piecewise linear postsynaptic potential to reduce the computational cost and the complexity of neural processing. To evaluate the performance of the proposed algorithm in deep architectures, we apply it to convolutional SNNs for image classification. On the two popular benchmarks MNIST and Fashion-MNIST, the network reaches accuracies of 99.2% and 92.8%, respectively. The trade-off of memory footprint and computational cost against accuracy is analyzed by using two sets of weights: real-valued weights, which are updated in the backward pass, and their signs, i.e., binary weights, which are employed in the feedforward process. We evaluate the binary CSNN on the MNIST and Fashion-MNIST datasets and obtain acceptable performance with a negligible accuracy drop relative to real-valued weights (about 0.6% and 0.8%, respectively).
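The linear approximation mentioned above can be illustrated with a toy neuron. The sketch below is not the paper's StiDi-BP implementation; it only assumes a ramp-shaped (piecewise linear) postsynaptic potential, so that between events the membrane potential rises linearly and the derivative of the first-spike latency with respect to the potential is simply the negative reciprocal of the local slope. All names and constants (`tau`, `theta`, the ramp kernel) are illustrative.

```python
import numpy as np

def spike_latency(weights, in_times, tau=1.0, theta=1.0, t_max=10.0, dt=0.01):
    """Return the first time the membrane potential crosses theta (or None).
    Each input spike at t_i contributes w_i * clip((t - t_i)/tau, 0, 1),
    a linear ramp that saturates after tau (piecewise linear PSP)."""
    ts = np.arange(0.0, t_max, dt)
    ramp = np.clip((ts[:, None] - in_times[None, :]) / tau, 0.0, 1.0)
    u = ramp @ weights                      # membrane potential over time
    above = np.nonzero(u >= theta)[0]
    return ts[above[0]] if above.size else None

def dlatency_du(weights, in_times, t_spike, tau=1.0):
    """Linear approximation: dt*/du ≈ -1 / (slope of u at the spike time).
    With ramp PSPs, the slope is the summed weight of inputs still rising."""
    rising = (in_times <= t_spike) & (t_spike < in_times + tau)
    slope = weights[rising].sum() / tau
    return -1.0 / slope
```

For two unit-weight inputs spiking at t = 0, the potential rises with slope 2, crosses a unit threshold at t = 0.5, and the approximated latency derivative is -0.5: a small upward displacement of the potential advances the spike proportionally.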
Keywords
StiDi-BP algorithm, Convolutional spiking neural networks, Real-valued weights, Binary weights
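The binary-weight scheme from the abstract can be sketched in a few lines: real-valued weights are stored and updated in the backward pass, while the forward pass sees only their signs. This is a minimal illustration of that pattern, not the paper's code; the array shapes, learning rate, and function names are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
W_real = rng.normal(0.0, 0.1, size=(4, 3))   # stored real-valued weights

def forward(x, W_real):
    # Feedforward uses only the signs of the weights (binary weights).
    return x @ np.sign(W_real)

def update(W_real, grad, lr=0.01):
    # The gradient step is applied to the real-valued weights, so small
    # updates accumulate even when they do not yet flip a sign.
    return W_real - lr * grad
```

Because binarization happens on the fly in `forward`, a weight's contribution changes only when its accumulated real value crosses zero, which is what keeps the accuracy drop small relative to the real-valued network.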