Learning smooth dendrite morphological neurons by stochastic gradient descent for pattern classification.

Neural Networks: The Official Journal of the International Neural Network Society (2023)

Abstract
This article presents a learning algorithm for dendrite morphological neurons (DMN) based on stochastic gradient descent (SGD). In particular, we focus on a DMN topology that comprises spherical dendrites, smooth maximum activation function nodes, and a softmax output layer, whose original learning algorithm proceeds in two independent stages: (1) the dendrites' centroids are learned by k-means, and (2) the softmax layer weights are adjusted by gradient descent. A drawback of this learning method is that the two stages are decoupled: once the dendrites' centroids are defined, they remain static during weight learning, so no feedback corrects the dendrites' positions to improve classification performance. To overcome this issue, we derive the delta rules for adjusting the dendrites' centroids and the output layer weights by minimizing the cross-entropy loss function under an SGD scheme. This gradient-descent-based learning is feasible because the smooth maximum activation function that interfaces the dendrite units with the output layer is differentiable. The proposed DMN is compared against eight morphological neuron models with distinct topologies and learning methods, as well as four well-established classifiers: support vector machine (SVM), multilayer perceptron (MLP), random forest (RF), and k-nearest neighbors (k-NN). In addition, classification performance is evaluated on 81 datasets. The experimental results show that the proposed method tends to outperform the other DMN methods and is competitive with, or even better than, SVM, MLP, RF, and k-NN. Thus, it is an effective alternative for pattern classification. Moreover, SGD-based learning standardizes the training of DMNs, bringing it in line with current artificial neural networks.
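The abstract describes the architecture (spherical dendrites, smooth maximum units, softmax output) and the joint SGD training, but no code accompanies it. The following is a minimal sketch, not the authors' implementation: it assumes a negative squared Euclidean distance as the spherical dendrite response, a log-sum-exp smooth maximum with a hypothetical sharpness parameter `beta`, and arbitrary layer sizes; the paper's exact formulation may differ (e.g., it may include a dendrite radius term).

```python
# Sketch of an end-to-end trainable DMN under the stated assumptions.
# Autograd supplies the same centroid/weight gradients that the paper
# derives as explicit delta rules.
import torch
import torch.nn as nn

class SmoothDMN(nn.Module):
    def __init__(self, in_dim, n_units, dendrites_per_unit, n_classes, beta=1.0):
        super().__init__()
        # Assumed: one group of learnable dendrite centroids per smooth-max unit.
        self.centroids = nn.Parameter(
            torch.randn(n_units, dendrites_per_unit, in_dim))
        self.beta = beta  # assumed smooth-max sharpness parameter
        # Softmax output layer over the smooth-max unit responses.
        self.out = nn.Linear(n_units, n_classes)

    def forward(self, x):
        # x: (batch, in_dim) -> diff: (batch, n_units, dendrites, in_dim)
        diff = x[:, None, None, :] - self.centroids[None]
        d = -(diff ** 2).sum(-1)                # spherical dendrite responses
        # Differentiable smooth maximum over each unit's dendrites.
        s = torch.logsumexp(self.beta * d, dim=-1) / self.beta
        return self.out(s)                      # logits for the softmax layer

# Joint SGD over centroids and output weights: both receive gradients,
# unlike the two-stage k-means + gradient-descent scheme.
model = SmoothDMN(in_dim=2, n_units=4, dendrites_per_unit=3, n_classes=2)
opt = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()                 # cross-entropy loss
x, y = torch.randn(32, 2), torch.randint(0, 2, (32,))  # toy data
for _ in range(100):
    opt.zero_grad()
    loss_fn(model(x), y).backward()
    opt.step()
```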