A mathematical analysis of the effects of Hebbian learning rules on the dynamics and structure of discrete-time random recurrent neural networks

Neural Comput. 2008 Dec;20(12):2937-66. doi: 10.1162/neco.2008.05-07-530.

Abstract

We present a mathematical analysis of the effects of Hebbian learning in random recurrent neural networks, with a generic Hebbian learning rule that includes passive forgetting and different timescales for neuronal activity and learning dynamics. Previous numerical work has reported that Hebbian learning drives the system from chaos to a steady state through a sequence of bifurcations. Here, we interpret these results mathematically and show that these effects, involving a complex coupling between neuronal dynamics and synaptic graph structure, can be analyzed using Jacobian matrices, which introduce both a structural and a dynamical point of view on neural network evolution. Furthermore, we show that sensitivity to a learned pattern is maximal when the largest Lyapunov exponent is close to 0. We discuss how neural networks may take advantage of this regime of high functional interest.
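The following minimal sketch (not the authors' code) illustrates the class of system described in the abstract: a discrete-time random recurrent network whose synapses evolve under a Hebbian rule with passive forgetting, on a slower timescale than the neuronal dynamics, together with a Jacobian-based estimate of the largest Lyapunov exponent. All names and parameter values (N, g, lam, alpha, T_fast, the tanh transfer function, the specific outer-product form of the Hebbian term) are illustrative assumptions, chosen as one generic instance of the family of rules the paper analyzes.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100        # number of neurons (illustrative)
g = 3.0        # synaptic gain; large g puts the untrained network in chaos
lam = 0.01     # passive-forgetting rate (lambda)
alpha = 0.05   # Hebbian learning rate (slow relative to neuronal dynamics)
T_fast = 200   # neuronal-activity steps between synaptic updates

W = rng.normal(0.0, g / np.sqrt(N), size=(N, N))  # random synaptic matrix
pattern = rng.normal(0.0, 1.0, size=N)            # static input pattern

def f(u):
    return np.tanh(u)            # sigmoidal transfer function (assumed)

def df(u):
    return 1.0 - np.tanh(u)**2   # its derivative, entering the Jacobian

def largest_lyapunov(W, x0, steps=2000):
    """Estimate the largest Lyapunov exponent at fixed synapses by
    propagating a tangent vector through the Jacobians
    J(t) = diag(f'(u(t))) @ W and averaging the log growth rates,
    renormalizing at each step."""
    x, v = x0.copy(), rng.normal(size=N)
    v /= np.linalg.norm(v)
    acc = 0.0
    for _ in range(steps):
        u = W @ x + pattern
        x = f(u)
        v = (df(u)[:, None] * W) @ v   # tangent (linearized) dynamics
        nv = np.linalg.norm(v)
        acc += np.log(nv)
        v /= nv
    return acc / steps

x = rng.uniform(-1.0, 1.0, size=N)
for epoch in range(100):
    # Fast dynamics: iterate the network at fixed synapses.
    for _ in range(T_fast):
        x = f(W @ x + pattern)
    # Slow dynamics: passive forgetting (decay by lambda) plus a Hebbian
    # correlation term -- one generic instance of such a rule, not the
    # paper's exact formulation.
    W = (1.0 - lam) * W + (alpha / N) * np.outer(x, x)

print("largest Lyapunov exponent after learning:", largest_lyapunov(W, x))
```

Tracking largest_lyapunov(W, x) across learning epochs in a sketch like this would show the exponent drifting downward through 0, the route from chaos to a steady state that the abstract describes, with sensitivity to the learned pattern greatest when the exponent is near 0.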

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Action Potentials / physiology
  • Animals
  • Entropy
  • Feedback
  • Humans
  • Learning / physiology*
  • Mathematics*
  • Nerve Net / physiology*
  • Neural Networks, Computer*
  • Neurons / physiology*
  • Nonlinear Dynamics*
  • Synapses / physiology
  • Time Factors