Sparsity-depth Tradeoff in Infinitely Wide Deep Neural Networks

Chanwoo Chun, Daniel D. Lee

CoRR (2023)

Abstract
We investigate how sparse neural activity affects the generalization performance of a deep Bayesian neural network in the large-width limit. To this end, we derive a neural network Gaussian process (NNGP) kernel with rectified linear unit (ReLU) activation and a predetermined fraction of active neurons. Using the NNGP kernel, we observe that sparser networks outperform non-sparse networks at shallow depths on a variety of datasets. We validate this observation by extending the existing theory on the generalization error of kernel ridge regression.
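For reference, the standard (non-sparse) ReLU NNGP kernel is computed by a layer-wise covariance recursion and then plugged into kernel ridge regression. The sketch below, assuming numpy and illustrative parameter names (`depth`, `sigma_w2`, `sigma_b2`, `lam`), implements the usual dense ReLU recursion (the arc-cosine kernel of degree 1); it does not reproduce the paper's sparsity-modified kernel with a fixed fraction of active neurons.

```python
# Minimal sketch: dense ReLU NNGP kernel recursion + kernel ridge regression.
# The paper's sparse-activity kernel is NOT reproduced here; depth, weight/bias
# variances, and the ridge parameter are illustrative assumptions.
import numpy as np

def relu_nngp_kernel(X1, X2, depth=3, sigma_w2=2.0, sigma_b2=0.0):
    """Depth-`depth` ReLU NNGP kernel between rows of X1 (n1 x d) and X2 (n2 x d)."""
    d = X1.shape[1]
    # Layer-0 (input) covariances.
    K12 = sigma_b2 + sigma_w2 * (X1 @ X2.T) / d
    K11 = sigma_b2 + sigma_w2 * np.sum(X1**2, axis=1) / d  # self-covariances of X1 rows
    K22 = sigma_b2 + sigma_w2 * np.sum(X2**2, axis=1) / d  # self-covariances of X2 rows
    for _ in range(depth):
        norm = np.sqrt(np.outer(K11, K22))
        cos_t = np.clip(K12 / norm, -1.0, 1.0)
        theta = np.arccos(cos_t)
        # E[relu(u) relu(v)] for a bivariate Gaussian (arc-cosine kernel, degree 1).
        K12 = sigma_b2 + sigma_w2 * norm * (np.sin(theta) + (np.pi - theta) * cos_t) / (2 * np.pi)
        K11 = sigma_b2 + sigma_w2 * K11 / 2.0  # E[relu(u)^2] = k / 2
        K22 = sigma_b2 + sigma_w2 * K22 / 2.0
    return K12

def krr_predict(X_train, y_train, X_test, depth=3, lam=1e-3):
    """Kernel ridge regression with the NNGP kernel: f(x) = k(x, X) (K + lam I)^{-1} y."""
    K_train = relu_nngp_kernel(X_train, X_train, depth)
    K_test = relu_nngp_kernel(X_test, X_train, depth)
    alpha = np.linalg.solve(K_train + lam * np.eye(len(X_train)), y_train)
    return K_test @ alpha
```

Varying `depth` in this recursion is what a sparsity-depth study sweeps over; the paper's contribution is the modified recursion for a fixed fraction of active ReLU units, which changes the per-layer covariance map used above.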
Keywords
infinitely, networks, sparsity-depth