
Fuzzy Neural Networks Stability in Terms of the Number of Hidden Layers

Computational Intelligence and Informatics (2011)

Cited by 5
Abstract
This paper introduces an approach for studying the stability and generalization capability of one- and two-hidden-layer Fuzzy Flip-Flop based Neural Networks (FNNs) with various fuzzy operators. By employing fuzzy flip-flop neurons as sigmoid function generators, novel function approximators are established that also avoid overfitting when the test data contain noisy items in the form of outliers. It is shown, by comparison with existing standard tansig-function-based approaches, that networks of reduced complexity with comparable stability are obtained. Finally, examples are given to illustrate the effect of the number of hidden layers on neural networks.
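To illustrate the idea of a fuzzy flip-flop neuron acting as a sigmoid function generator, the following is a minimal sketch, not the paper's implementation: it assumes an algebraic-norm (product t-norm, probabilistic-sum s-norm) J-K fuzzy flip-flop iterated with K = 1 - J, whose steady-state output Q(J) traces a sigmoid-like curve that could stand in for a tansig transfer function. The iteration count and initial state are illustrative choices.

```python
import numpy as np

def s_norm(a, b):
    # Probabilistic sum (algebraic s-norm): a + b - a*b
    return a + b - a * b

def t_norm(a, b):
    # Algebraic product t-norm
    return a * b

def fuzzy_jk_step(j, k, q):
    # One update of a J-K fuzzy flip-flop: Q' = (J s ~Q) t (~K s Q)
    return t_norm(s_norm(j, 1.0 - q), s_norm(1.0 - k, q))

def fnn_activation(j, iterations=4, q0=0.5):
    # Iterate the flip-flop with K = 1 - J (an assumed configuration);
    # the resulting Q(J) characteristic is sigmoid-like in [0, 1]
    # and can serve as a neuron transfer function.
    q = np.full_like(np.asarray(j, dtype=float), q0)
    for _ in range(iterations):
        q = fuzzy_jk_step(j, 1.0 - j, q)
    return q
```

For example, `fnn_activation(np.linspace(0.0, 1.0, 11))` yields values rising from near 0 toward 1, qualitatively resembling a tansig curve rescaled to the unit interval.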
Keywords
computational complexity,function approximation,function generators,fuzzy neural nets,stability,function approximators,fuzzy flip-flop neurons,fuzzy neural networks stability,fuzzy operators,generalization capability,hidden layer fuzzy flip-flop based neural networks,network complexity reduction,sigmoid function generators,tansig function