[ContinualAI Meetup] Neuroscience-Inspired Continual Learning

Are you interested in learning more about Continual Learning and its relationship with Neuroscience? This is the right online meetup for you! :point_down:

:date: On the 26th of March at 6 PM (CET), ContinualAI will host its monthly online meetup on “Neuroscience-Inspired Continual Learning”! In this meetup we will discuss key biological insights into how the brain is believed to learn continuously, and how we can improve our artificial learning systems based on these observations.

Main speakers: Eric Laukien, Keiland W Cooper, Jeremy Forest and Subutai Ahmad!

:pushpin: To join the meeting: https://lnkd.in/dhukhri
:pushpin: To watch the live stream: https://lnkd.in/d25ZY_D
:pushpin: Meetup recording: https://www.youtube.com/watch?v=LPYBy4VHRuk

Thank you everyone for joining, particularly in these peculiar times!

Link to my slides: https://www.kwcooper.xyz/slides (today’s is #9)

Thanks to all the speakers for their interesting talks!

I have a follow-up question for the computational neuroscientists out there.

In Machine Learning we have standard models which we use all the time as building blocks of more complex architectures.
Based on your experience, do standard models also exist in computational neuroscience, or do you usually start from scratch and build a custom model each and every time?

Thanks in advance!

Yeah, I’d say there are a few. For instance, for spiking networks it’s good to know rate-coded neurons, Izhikevich neurons, or even Hodgkin–Huxley neurons as some of the “basic building blocks”.
Hebbian networks and Hopfield networks are also commonly used.
Once you dig into the math of these, you’ll see that they aren’t far off from many deep learning architectures in some respects (many stem from the same early connectionist roots), but on top of them you add the biological properties under study. E.g., in the model in my talk today I took a Hebbian network and added oscillations and neuromodulation.
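To make the connectionist-roots point concrete, here is a minimal sketch (not from the talk, just an illustration) of one of those standard building blocks: a Hopfield network whose weights are set by the Hebbian rule, recovering a stored pattern from a corrupted cue.

```python
import numpy as np

def hebbian_train(patterns):
    # Hebbian rule: accumulate outer products of the stored +/-1 patterns
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)  # no self-connections
    return W / n

def recall(W, state, steps=10):
    # Synchronous update: repeatedly threshold each unit's weighted input
    for _ in range(steps):
        state = np.sign(W @ state)
        state[state == 0] = 1  # break ties toward +1
    return state

# Store a single +/-1 pattern, then recover it from a one-bit-flipped cue
pattern = np.array([1, -1, 1, 1, -1, -1, 1, -1])
W = hebbian_train(pattern[None, :])
noisy = pattern.copy()
noisy[0] *= -1  # corrupt one bit
print(recall(W, noisy))  # converges back to the stored pattern
```

The recall loop is just a thresholded matrix-vector product, which is exactly why these models feel familiar coming from deep learning; the biology enters when you start modulating those updates (oscillations, neuromodulators, etc.).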

The recording of the meetup is now available here: https://www.youtube.com/watch?v=LPYBy4VHRuk
I updated the description above! :slight_smile:

This is the chat transcript for future reference:

00:21:13.777,00:21:16.777
Ayşin Taşdelen: Thank you for the presentation Keiland!

00:21:34.974,00:21:37.974
Subutai Ahmad: Thanks for the talk Keiland!

00:24:10.827,00:24:13.827
Admin ContinualAI: If you want to share your ideas about the topics of today’s meetup you can join the discussion here on discourse: [ContinualAI Online Meetup] Neuroscience-Inspired Continual Learning

00:24:57.819,00:25:00.819
Admin ContinualAI: For the speakers, please share your slides or links to external content there! :slight_smile:

00:41:34.320,00:41:37.320
Keiland Cooper: Thanks all. Link to my slides if it helps: https://www.kwcooper.xyz/slides today’s is #9. Great talk Subutai!

00:43:55.017,00:43:58.017
Ayşin Taşdelen: Thanks for the talk Subutai!

00:44:29.303,00:44:32.303
Subutai Ahmad: Thanks Keiland and Aysin! I’ll put my slides up as well soon.

00:45:59.817,00:46:02.817
Admin ContinualAI: thanks Subutai! :slight_smile:

00:47:45.415,00:47:48.415
Subutai Ahmad: Here’s a link to my slides: https://drive.google.com/file/d/1wXpN67ocFKv1SCz1ppLozWCYJPR8vwuT/view?usp=sharing

01:00:24.602,01:00:27.602
Ayşin Taşdelen: Thank you for the talk Jeremy!

01:01:37.260,01:01:40.260
Jory Schossau: Thanks everyone!
Q Subutai (or anyone): With current technology and instrumentation, is there observable variation in sparsity between individual organisms within a population?

01:02:08.438,01:02:11.438
Keiland Cooper: Great talk Jeremy. Do share your slides if you can!

01:07:07.585,01:07:10.585
Lucas Souza: Thanks for the great talk, Jeremy. Can you please share the slides? Also, do you have a review paper or document that summarizes the different types of learning you discussed about, but in a little bit more detail?

01:07:25.035,01:07:28.035
Admin ContinualAI: Jory we will ask your question at the end of this talk! thanks!

01:09:32.223,01:09:35.223
Jeremy Forest: @Keiland @Lucas I’ll share the slide there at the end of the talk https://jeremyforest.github.io/papers/#talks

01:11:35.636,01:11:38.636
Jeremy Forest: @Lucas Hard to only give you 1 review paper on the multitude of domains I briefly talked about, but I can compile a short bibliography of the most important neurons and share that on slack

01:12:32.758,01:12:35.758
Michele Vitali: Unfortunately I have to leave now but I want to thank all the speakers for the amazing high level of the talks. They were all very interesting. Compliments. And a big thank you to Vincenzo for making this all possible!

01:12:55.204,01:12:58.204
Jeremy Forest: *of the most important paper

01:12:56.312,01:12:59.312
Admin ContinualAI: thank you Michele for joining! :slight_smile:

01:16:09.613,01:16:12.613
Ayşin Taşdelen: Thank you for the talk Eric!

01:16:23.833,01:16:26.833
Subutai Ahmad: Thanks for the talk Eric!

01:16:25.943,01:16:28.943
Matthew Taylor: Nice to hear from you and Ogma Eric!

01:16:27.203,01:16:30.203
Keiland Cooper: Nice talk Eric

01:16:30.926,01:16:33.926
Ayşin Taşdelen: Vincenzo could we compile all the presentations on slack?

01:16:37.004,01:16:40.004
Matthew Taylor: Tell Fergal hi

01:16:48.888,01:16:51.888
Lucas Souza: That would be awesome and really helpful, Jeremy.

01:17:06.053,01:17:09.053
Eric Laukien: Thanks!

01:17:11.882,01:17:14.882
Mojtaba Faramarzi: Can you share the youtube channel link here?

01:17:45.211,01:17:48.211
Ayşin Taşdelen: youtube channel : https://www.youtube.com/channel/UCD9_bqN3gX-TLxcr47vvMmA

01:17:50.972,01:17:53.972
Admin ContinualAI: ContinualAI youtube account: https://www.youtube.com/channel/UCD9_bqN3gX-TLxcr47vvMmA

01:17:51.104,01:17:54.104
Mojtaba Faramarzi: Thanks!

01:18:44.430,01:18:47.430
Ayşin Taşdelen: Thanks for the event. I have to leave now.

01:18:57.938,01:19:00.938
Admin ContinualAI: thanks Aysin for joining! :slight_smile:

01:19:23.865,01:19:26.865
David Murphy: Have to drop…grateful for the meetup. Thank you!

01:19:37.968,01:19:40.968
Admin ContinualAI: thanks David for joining as well!

01:20:06.886,01:20:09.886
Davide Maltoni: Thank you all for the very interesting presentations!

01:20:10.240,01:20:13.240
Jory Schossau: Thanks so much for that excellent response, Subutai.

01:20:19.505,01:20:22.505
Admin ContinualAI: thanks Davide, see you soon!

01:21:29.065,01:21:32.065
Krishna Priya S: Thank you all for the very interesting presentations!

01:30:04.824,01:30:07.824
Jory Schossau: next hurdle: short-term memory

01:33:16.076,01:33:19.076
Lucas Souza: Thanks for organizing, Vincenzo, amazing meetup!

01:33:18.788,01:33:21.788
Eric Laukien: Thank you everyone, was fun and informative!

01:33:19.470,01:33:22.470
Jory Schossau: Thanks for organizing!

01:33:20.512,01:33:23.512
Marc-André Piché: Thank you

01:33:35.529,01:33:38.529
Davide Maltoni: Thank you Vincenzo!

01:33:42.149,01:33:45.149
Lorenzo Pellegrini: Thanks everyone!