Feedback alignment with weight normalization can provide a biologically plausible mechanism for learning

Alireza RahmanSetayesh, Ali Ghazizadeh, Farokh Marvasti

bioRxiv (2021)

Abstract
The mechanism by which plasticity in millions of synapses in the brain is orchestrated to achieve behavioral and cognitive goals is a fundamental question in neuroscience. In this regard, insights from learning methods in artificial neural networks (ANNs), and in particular the idea of backpropagation (BP), seem inspiring. However, the implementation of BP requires exact matching of forward and backward weights, which is unrealistic given the known connectivity pattern in the brain (the "weight transport problem"). Notably, it has recently been shown that under certain conditions, error BackPropagation Through Random backward Weights (BP-TRW) can lead to partial alignment of forward and backward weights over time (feedback alignment, or FA) and result in surprisingly good accuracies in simple classification tasks using shallow ANNs. In this work, we took a closer look at FA to find out why it occurs when using BP-TRW and explored ways to boost it for deep ANNs. We first show that the gradual alignment of forward and backward weights arises from the successive application of the BP-TRW update rule on forward weights regardless of learning or loss function, provided the error signals and outputs of neurons satisfy certain conditions, such as being autocorrelated. Moreover, we show that FA in deeper networks can be improved significantly by applying a biologically inspired weight normalization (WN) to the input weights of each neuron. In addition, WN can improve the performance of both BP and BP-TRW when class labels are changed across time, an under-explored phenomenon in ANNs which is crucial for flexible learning in the brain in everyday life. With WN, BP-TRW test accuracy can almost match that of BP following class label changes. Altogether, our results portray a clearer picture of the FA mechanism and provide evidence for how learning can occur using BP-like mechanisms while abiding by biological limits on synaptic weights.
Competing Interest Statement: The authors have declared no competing interest.
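As a rough illustration of the BP-TRW rule summarized in the abstract, the sketch below trains a tiny two-layer linear network on a random linear regression task, routing the error backward through a fixed random matrix B instead of the transpose of the output weights, and measures how the forward weights align with B over training. The network sizes, learning rate, and task are illustrative assumptions, not the authors' experimental setup:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two-layer linear network y = W2 @ (W1 @ x), trained with BP-TRW:
# the backward pass uses a fixed random matrix B in place of W2.T.
# (Illustrative sketch only; dimensions and learning rate are assumptions.)
n_in, n_hid, n_out = 20, 30, 10
W1 = rng.normal(0, 0.1, (n_hid, n_in))
W2 = rng.normal(0, 0.1, (n_out, n_hid))
B = rng.normal(0, 0.1, (n_hid, n_out))   # fixed random feedback weights

T = rng.normal(0, 1, (n_out, n_in))      # target linear mapping to learn
lr = 0.01

def alignment(W2, B):
    """Cosine similarity between the flattened W2.T and B."""
    a, b = W2.T.ravel(), B.ravel()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

align_before = alignment(W2, B)
for _ in range(2000):
    x = rng.normal(0, 1, (n_in, 1))
    h = W1 @ x
    y = W2 @ h
    e = y - T @ x                        # error signal
    W2 -= lr * e @ h.T                   # usual delta rule at the output
    W1 -= lr * (B @ e) @ x.T             # error routed through B, not W2.T
align_after = alignment(W2, B)

print(f"alignment before: {align_before:.3f}, after: {align_after:.3f}")
```

In this setting, the forward weights W2 gradually pick up components of B's transpose, so the cosine similarity grows from near zero, which is the feedback alignment effect the paper analyzes. The weight normalization the authors propose could be layered onto such a sketch by rescaling each row of W1 (each neuron's input weight vector) to a fixed norm after every update.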