Modeling Hebb Learning Rule for Unsupervised Learning
Jia Liu, Maoguo Gong, Qiguang Miao
Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence
Main track. Pages 2315-2321.
https://doi.org/10.24963/ijcai.2017/322
This paper models the Hebb learning rule and proposes a neuron learning machine (NLM). The Hebb learning rule describes the plasticity of the connection between presynaptic and postsynaptic neurons and is itself unsupervised; in artificial neural networks it formulates the update gradient of the connecting weights. In this paper, we construct an objective function by modeling the Hebb rule. We make a hypothesis to simplify the model and, based on this hypothesis and the stability of solutions, introduce a correlation-based constraint. Analyzing the model from the perspectives of retaining abstract information and increasing the energy-based probability of the observed data, we find that this biologically inspired model is capable of learning useful features. NLM can also be stacked to learn hierarchical features and reformulated into a convolutional version to extract features from 2-dimensional data. Experiments on single-layer and deep networks demonstrate the effectiveness of NLM in unsupervised feature learning.
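The classic Hebb rule underlying the paper can be sketched as a weight update proportional to the product of presynaptic and postsynaptic activity. The following minimal NumPy sketch illustrates that update for a linear neuron layer; the network shape, learning rate, and linear activation are illustrative assumptions, not the paper's NLM formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_out = 4, 3
W = rng.standard_normal((n_out, n_in)) * 0.1  # connection weights
eta = 0.01                                    # learning rate (assumed)

x = rng.standard_normal(n_in)  # presynaptic activity
y = W @ x                      # postsynaptic activity (linear neuron, assumed)

# Hebbian update: dW[i, j] is proportional to y[i] * x[j]
# ("neurons that fire together wire together")
W += eta * np.outer(y, x)
```

Note that this plain Hebbian update is unstable (weights grow without bound), which is why the paper introduces a correlation-based constraint on top of the modeled rule.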
Keywords:
Machine Learning: Machine Learning
Machine Learning: Neural Networks
Machine Learning: Unsupervised Learning
Machine Learning: Deep Learning