The study of plasticity has always been about gradients

J Physiol. 2023 Aug;601(15):3141-3149. doi: 10.1113/JP282747. Epub 2023 May 1.

Abstract

The experimental study of learning and plasticity has always been driven by an implicit question: how can physiological changes be adaptive and improve performance? For example, in Hebbian plasticity only synapses from presynaptic neurons that were active are changed, avoiding useless changes. Similarly, in dopamine-gated learning, synaptic changes depend on reward or its absence, and no changes occur when everything is predictable. Within machine learning we can make the question of which changes are adaptive concrete: performance improves when changes correlate with the gradient of an objective function that quantifies performance. This holds for any system that improves through small changes. As such, physiology has always implicitly been seeking mechanisms that allow the brain to approximate gradients. From this perspective, we review the existing literature on plasticity-related mechanisms and show how these mechanisms relate to gradient estimation. We argue that gradients are a unifying idea that explains the many facets of neuronal plasticity.
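The claim that performance improves when changes correlate with the gradient follows from a first-order Taylor argument. The sketch below makes that step explicit; the objective L, parameters theta, step size eta, and gradient estimate g-hat are notation introduced here for illustration and are not taken from the paper itself.

\documentclass{article}
\usepackage{amsmath}
\begin{document}
% First-order argument: any small parameter change that is positively
% correlated with the negative gradient of an objective decreases that
% objective, regardless of how the change was produced.
Let $L(\theta)$ quantify performance (lower is better). For a small update
$\Delta\theta$, a first-order Taylor expansion gives
\begin{equation}
  L(\theta + \Delta\theta) \approx L(\theta) + \nabla L(\theta)^{\top} \Delta\theta .
\end{equation}
If the update is any estimate $\hat{g}$ of the gradient, scaled by a small
step size $\eta > 0$ and positively correlated with the true gradient, then
\begin{equation}
  \Delta\theta = -\eta\, \hat{g}, \qquad
  \hat{g}^{\top} \nabla L(\theta) > 0
  \;\Longrightarrow\;
  L(\theta + \Delta\theta) - L(\theta) \approx -\eta\, \hat{g}^{\top} \nabla L(\theta) < 0 ,
\end{equation}
so performance improves to first order. On this reading, a plasticity rule is
adaptive to the extent that the changes it produces correlate with the gradient
of some objective.
\end{document}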

Keywords: feedback connections; gradients; intrinsic plasticity; learning; machine learning; neuromodulators; synaptic plasticity.

Publication types

  • Review
  • Research Support, Non-U.S. Gov't

MeSH terms

  • Brain
  • Dopamine
  • Neuronal Plasticity* / physiology
  • Neurons* / physiology
  • Synapses / physiology

Substances

  • Dopamine