Following papers:
- Learning to learn with backpropagation of Hebbian plasticity [2016]
- Differentiable plasticity: training plastic neural networks with backpropagation [2018]
- Backpropamine: training self-modifying neural networks with differentiable neuromodulated plasticity [2019]

Some definitions from Artificial Evolution of Plastic Neural Networks: a Few Key Concepts and Non-Synaptic Plasticity:
- Hebb's rule: neurons that fire together, wire together (if a neuron repeatedly takes part in making another neuron fire, the connection between them is strengthened).
- Structural plasticity: the mechanism generating new connections, thereby redefining the topology of the network.
- Synaptic plasticity: the mechanism changing the strength of existing connections.
- Non-synaptic plasticity: the modification of the intrinsic excitability of the neuron itself.
- Excitatory postsynaptic potentials (EPSPs) and inhibitory postsynaptic potentials (IPSPs).
About the papers:
- Fact: usual neural networks trained with backpropagation have fixed connection weights that do not change once training is complete.
- Argument: in nature, the primary basis for long-term learning and memory in the brain is synaptic plasticity.
- Challenge: optimize through gradient descent not only the base weights, but also the amount of plasticity of each connection.
- Result: the obtained parameters are fixed, but they describe how each connection changes over time.
In the [2016] paper, they propose a time-dependent quantity for each connection in the network, called the Hebbian trace:
Hebbk(t) = (1 - γ) * Hebbk(t-1) + γ * xk(t) * y(t)
where y(t) is the activity of the post-synaptic cell, xk(t) is the
activity of the pre-synaptic cell, and γ is a time constant.
So the response of a given cell can be written with a fixed component (classic weights) and a plastic one:
y(t) = tanh(Σk [wk + αk Hebbk(t)] xk(t) + b)
here the plasticity coefficient αk is constant over time (it is one of the parameters optimized by gradient descent).
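The per-step computation above can be sketched as follows (a minimal NumPy sketch; the function name and shapes are my own, not from the paper):

```python
import numpy as np

def plastic_step(x, hebb, w, alpha, b, gamma):
    """One step of a plastic neuron, [2016]-style.

    x     : pre-synaptic activities x_k(t), shape (k,)
    hebb  : Hebbian traces Hebb_k(t-1), shape (k,)
    w     : fixed weights; alpha: plasticity coefficients; b: bias
    gamma : time constant of the trace, in [0, 1]
    """
    # response = fixed component plus plastic component, squashed by tanh
    y = np.tanh(np.dot(w + alpha * hebb, x) + b)
    # exponential-average update of the Hebbian trace
    hebb = (1.0 - gamma) * hebb + gamma * x * y
    return y, hebb
```

At training time, w, alpha, and b are the quantities optimized by gradient descent, while hebb evolves during the lifetime of the network.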
In the [2018] paper, they rename γ as η (calling it the learning rate of plasticity) and change the Hebbian trace definition using Oja's rule.
Oja's rule is a modification of the standard Hebb's rule that allows, among other things, maintaining stable weight values indefinitely in the absence of stimulation (thus allowing stable long-term memories) while still preventing runaway divergence.
The Hebbian trace definition becomes:
Hebbk(t+1) = Hebbk(t) + ηy(t)(xk(t-1) - y(t)Hebbk(t))
In fact, they change the notation as follows:
Hebbi,j(t+1) = Hebbi,j(t) + ηxj(t)(xi(t-1) - xj(t)Hebbi,j(t))
since, in the context of a recurrent neural network, the plastic component connects the hidden state passed from the previous step t-1 (xi(t-1)) to the current activation (xj(t)). And instead of indexing the connection of some neuron k to the neuron currently considered, they use indices i and j to represent the connection between neurons i and j.
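With the i, j notation, the Oja-rule trace update can be written as one vectorized step (a NumPy sketch; function name and shapes are assumptions):

```python
import numpy as np

def oja_update(hebb, x_prev, x_cur, eta):
    """Oja-style Hebbian trace update, [2018] notation.

    hebb   : (n, n) trace matrix, hebb[i, j] = Hebb_{i,j}(t)
    x_prev : hidden state from step t-1 (pre-synaptic side), shape (n,)
    x_cur  : current activations (post-synaptic side), shape (n,)
    eta    : learning rate of plasticity
    """
    # Hebb[i,j](t+1) = Hebb[i,j](t) + eta * x_cur[j] * (x_prev[i] - x_cur[j] * Hebb[i,j](t))
    return hebb + eta * x_cur[None, :] * (x_prev[:, None] - x_cur[None, :] * hebb)
```

The subtractive x_cur[j] * Hebb[i,j] term is what Oja's rule adds over plain Hebb: it pulls the trace back toward zero when activity is high, preventing runaway growth.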
The differentiable plasticity framework from [2016] lets the weights change automatically as a function of pre- and post-synaptic activity. In the [2019] paper, they propose to extend the framework with neuromodulated plasticity (i.e. allowing the network to actively modulate the plasticity of its connections).
First, the Hebbian trace definition becomes:
Hebbi,j(t+1) = Clip(Hebbi,j(t) + ηxi(t-1)xj(t))
where the Clip function can be:
- a simple decay term (as in the [2016] paper)
- a normalization implementing Oja's rule (as in the [2018] paper)
- a simple hard clip (keeping the value in [-1, 1], as proposed in this new paper)
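Two of these options are simple element-wise functions and can be sketched directly; the Oja variant instead reshapes the update term itself, as shown earlier (NumPy sketch, names assumed):

```python
import numpy as np

def clip_decay(h, lam=0.9):
    """Simple decay (2016-style): shrink the trace toward zero each step."""
    return lam * h

def clip_hard(h):
    """Hard clip (2019-style): keep each trace value in [-1, 1]."""
    return np.clip(h, -1.0, 1.0)
```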
Two types of neuromodulation are proposed:
- a simple one that replaces the time-fixed parameter η by a time-varying scalar M(t) computed by the network. The Hebbian trace definition becomes:
Hebbi,j(t+1) = Clip(Hebbi,j(t) + M(t)xi(t-1)xj(t))
- a more sophisticated one that tries to mimic the effect of dopamine on plasticity. Here, an eligibility trace Ei,j is defined as a simple exponential average of the Hebbian product of pre- and post-synaptic activity, with a trainable decay factor η. Moreover, the Hebbian trace is gated by the current dopamine signal M(t). That gives us the following equations:
Hebbi,j(t+1) = Clip(Hebbi,j(t) + M(t)Ei,j(t))
Ei,j(t+1) = ηxi(t-1)xj(t) + (1 - η)Ei,j(t)
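The retroactive, dopamine-gated variant above can be sketched as one update step (a NumPy sketch; function name and shapes are my own):

```python
import numpy as np

def backpropamine_update(hebb, elig, x_prev, x_cur, m, eta):
    """Neuromodulated trace update, [2019]-style (retroactive variant).

    hebb   : (n, n) plastic trace, hard-clipped to [-1, 1]
    elig   : (n, n) eligibility trace E_{i,j}(t)
    x_prev : pre-synaptic activities from step t-1, shape (n,)
    x_cur  : post-synaptic activities at step t, shape (n,)
    m      : dopamine-like signal M(t), a scalar computed by the network
    eta    : trainable decay factor of the eligibility trace
    """
    # Hebbian trace gated by the dopamine signal, then hard-clipped
    hebb = np.clip(hebb + m * elig, -1.0, 1.0)
    # eligibility trace: exponential average of the Hebbian product
    elig = eta * np.outer(x_prev, x_cur) + (1.0 - eta) * elig
    return hebb, elig
```

The simpler variant corresponds to adding m * np.outer(x_prev, x_cur) to the trace directly, with no eligibility trace in between.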