Activation Learning by Local Competitions

Hongchao Zhou
Sep 2022
Abstract
The backpropagation that drives the success of deep learning is most likely different from the learning mechanism of the brain. In this paper, we develop a biology-inspired learning rule that discovers features through local competitions among neurons, following Hebb's famous proposal. We demonstrate that the unsupervised features learned by this local learning rule can serve as a pre-training model to improve the performance of some supervised learning tasks. More importantly, this local learning rule enables us to build a new learning paradigm, very different from backpropagation, named activation learning, in which the output activation of the neural network roughly measures how probable the input patterns are. Activation learning is capable of learning plentiful local features from few shots of input patterns, and demonstrates significantly better performance than backpropagation when the number of training samples is relatively small. This learning paradigm unifies unsupervised learning, supervised learning, and generative models, and is also more secure against adversarial attacks, paving a road toward creating general-task neural networks.
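The abstract's core idea, feature discovery by local competition among neurons rather than by backpropagated gradients, can be illustrated with a minimal winner-take-all Hebbian sketch. All names, sizes, and the exact update below are illustrative assumptions, not the paper's actual rule:

```python
import numpy as np

rng = np.random.default_rng(0)

n_inputs, n_neurons = 16, 4          # hypothetical layer sizes
lr = 0.05                            # assumed learning rate
W = rng.normal(scale=0.1, size=(n_neurons, n_inputs))

def local_competitive_step(W, x):
    """One winner-take-all Hebbian update: the most strongly
    activated neuron pulls its weights toward the input.
    No error signal is propagated from any other layer."""
    activations = W @ x
    winner = int(np.argmax(activations))           # local competition
    W[winner] += lr * (x - W[winner])              # Hebbian-style update
    W[winner] /= np.linalg.norm(W[winner]) + 1e-8  # keep weights bounded
    return winner

# Train on random inputs; each sample updates only the winning neuron.
for _ in range(200):
    x = rng.normal(size=n_inputs)
    local_competitive_step(W, x)
```

In a sketch like this, each weight update uses only the neuron's own input and activation, which is what makes the rule "local" in contrast to backpropagation's global error signal.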