@Manas_Gupta: Hi all,
Can anyone point me to papers on plasticity, specifically on using Hebbian plasticity instead of backpropagation for training neural networks? So far I have read some good papers from Uber AI, including:
- Born to Learn: the Inspiration, Progress, and Future of Evolved Plastic Artificial Neural Networks
- Differentiable plasticity: training plastic neural networks with backpropagation
- Backpropamine: training self-modifying neural networks with differentiable neuromodulated plasticity
Any suggestions would be welcome! Thanks!
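(For context, the differentiable-plasticity line of work listed above combines fixed weights with a fast Hebbian trace. The sketch below, in plain NumPy with illustrative names and sizes not taken from the papers, shows the general shape of that rule; in the papers themselves the baseline weights and plasticity coefficients are what gets trained with backpropagation through the trace.)
```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes and parameters (assumptions, not from the papers).
n_in, n_out = 8, 4
w = rng.normal(scale=0.1, size=(n_in, n_out))      # fixed (slow) weights
alpha = rng.normal(scale=0.1, size=(n_in, n_out))  # per-connection plasticity coefficients
hebb = np.zeros((n_in, n_out))                     # fast Hebbian trace
eta = 0.05                                         # Hebbian trace learning rate

def step(x, hebb):
    # Effective weight = fixed part + plastic part gated by alpha.
    y = np.tanh(x @ (w + alpha * hebb))
    # Hebbian update: the trace decays toward the pre/post outer product.
    hebb = (1 - eta) * hebb + eta * np.outer(x, y)
    return y, hebb

x = rng.normal(size=n_in)
for _ in range(5):
    y, hebb = step(x, hebb)
print(y.round(3))
```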
@vlomonaco: @German_I._Parisi and @Qi_(Roger)_She are experts on this! I’ll let them answer this 
In the meanwhile I suggest this paper from @Rahaf with very interesting discussions about it: https://arxiv.org/abs/1711.09601
@Qi_(Roger)_She: @vlomonaco thanks for the intro. Hi @Manas_Gupta, I would like to recommend a recent work from Garrick Orchard, SLAYER. It introduces a new, general backpropagation mechanism for learning synaptic weights and axonal delays that overcomes the non-differentiability of the spike function, which in my mind is really cool for scaling plasticity algorithms (although it is simple synaptic plasticity). Not sure whether your point 2 is the same work from Garrick.
https://papers.nips.cc/paper/7415-slayer-spike-layer-error-reassignment-in-time
SLAYER: Spike Layer Error Reassignment in Time
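(As a rough illustration of the non-differentiability point: the spike function is a hard threshold, so its exact gradient is zero almost everywhere, and a smooth surrogate derivative is substituted on the backward pass. The snippet below is a generic surrogate-gradient sketch, not SLAYER's actual error-reassignment kernel; the function names and the fast-sigmoid shape are assumptions for illustration.)
```python
import numpy as np

def spike(v, threshold=1.0):
    # Forward pass: hard threshold on the membrane potential, 0/1 output.
    return (v >= threshold).astype(float)

def surrogate_grad(v, threshold=1.0, scale=5.0):
    # Backward-pass stand-in for d(spike)/dv: a smooth, peaked pseudo-derivative
    # (fast-sigmoid shape). SLAYER's own temporal credit-assignment kernel differs.
    return scale / (1.0 + scale * np.abs(v - threshold)) ** 2

v = np.linspace(0.0, 2.0, 5)
print(spike(v))           # hard 0/1 spikes
print(surrogate_grad(v))  # smooth values usable in backprop
```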
@German_I._Parisi: https://www.frontiersin.org/articles/10.3389/fnbot.2018.00078/full - Growing Hebbian networks with memory replay
Frontiers: Lifelong Learning of Spatiotemporal Representations With Dual-Memory Recurrent Self-Organization
@Manas_Gupta: Thanks all! The papers look very interesting. Will go through them all! 