Stacked Broad Learning System: From Incremental Flatted Structure to Deep Model

IEEE Transactions on Systems, Man, and Cybernetics: Systems (2021)

Abstract
The broad learning system (BLS) has recently been shown to be effective and efficient. In this article, several deep variants of BLS are reviewed, and a new adaptive incremental structure, the Stacked BLS, is proposed. The proposed model is a novel incremental stacking of BLS blocks. This variant inherits the efficiency and effectiveness of BLS in that the structure and weights of the lower layers are fixed when new blocks are added. The incremental stacking algorithm computes not only the connection weights between the newly stacked blocks but also the connection weights of the enhancement nodes within each BLS block. The Stacked BLS can therefore be regarded as dynamically incrementing "layers" and "neurons" during the training of a multilayer neural network. The proposed architecture, together with training algorithms that exploit the residual characteristic, is far more versatile than traditional fixed architectures. Finally, experimental results on UCI datasets and the MNIST, NORB, CIFAR-10, SVHN, and CIFAR-100 datasets indicate that the proposed method outperforms selected state-of-the-art methods, such as deep residual networks, in both accuracy and training speed. The results also imply that the proposed structure can greatly reduce the number of nodes and the training time of the original BLS in classification tasks on some datasets.
Keywords
Broad learning system (BLS), deep learning, functional link neural networks, nonlinear function approximation, universal approximation
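
Illustrative sketch

The abstract describes an incremental stacking scheme in which the lower blocks are frozen and each newly added block is fit using the residual characteristic. The following is a minimal sketch of that idea, not the authors' implementation: the class names (BLSBlock, StackedBLS), the node counts, the ridge regularizer, and the assumption that each new block regresses on the residual of the frozen stack are all illustrative assumptions made here.

```python
# Hedged sketch of residual-style BLS stacking. Assumed structure: each block
# has random feature nodes and tanh enhancement nodes, and its output weights
# are solved in closed form by ridge regression against the current residual.
import numpy as np

class BLSBlock:
    """One broad-learning block: random feature nodes, random enhancement
    nodes, and output weights solved by ridge regression (illustrative)."""
    def __init__(self, n_feature=20, n_enhance=100, reg=1e-3, seed=0):
        self.n_feature, self.n_enhance, self.reg = n_feature, n_enhance, reg
        self.rng = np.random.default_rng(seed)

    def _hidden(self, X):
        # Draw random input and enhancement weights once, then reuse them.
        if not hasattr(self, "Wf"):
            d = X.shape[1]
            self.Wf = self.rng.standard_normal((d, self.n_feature))
            self.We = self.rng.standard_normal((self.n_feature, self.n_enhance))
        Z = X @ self.Wf                   # mapped feature nodes
        H = np.tanh(Z @ self.We)          # enhancement nodes
        return np.hstack([Z, H])

    def fit(self, X, R):
        """Solve this block's output weights against the residual target R."""
        A = self._hidden(X)
        self.Wo = np.linalg.solve(A.T @ A + self.reg * np.eye(A.shape[1]),
                                  A.T @ R)
        return A @ self.Wo

    def predict(self, X):
        return self._hidden(X) @ self.Wo

class StackedBLS:
    """Add blocks one at a time; earlier blocks stay frozen, and each new
    block is trained only on the residual Y minus the current prediction."""
    def __init__(self):
        self.blocks = []

    def add_block(self, X, Y, **block_kwargs):
        R = Y - self.predict(X) if self.blocks else Y
        block = BLSBlock(seed=len(self.blocks), **block_kwargs)
        block.fit(X, R)
        self.blocks.append(block)

    def predict(self, X):
        return sum(b.predict(X) for b in self.blocks)
```

Under this reading, blocks would be added one at a time until a validation criterion stops improving, which is how the "adaptive incremental structure" in the abstract is interpreted here; the paper should be consulted for the exact update of the enhancement-node connection weights within each block.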