
   BRAIN SCIENCE
Omri Barak, PhD
Associate Professor of Theoretical Neuroscience
PhD, 2009 – Weizmann Institute of Science, Israel
Learning from systems that learn
The word “learning” often conjures images of school or other human endeavors. In recent years, the rise of artificial intelligence has led us to also associate this word with technical terms from computer science. What do we gain from using the same name for both?
In my research, I explore the hypothesis that systems that learn can be useful models of one another. This is because certain general principles appear to transcend specific instances, such as the multiplicity of solutions and the role of low-rank perturbations.
My main object of study is the brain. I use tools from mathematics and physics, together with analysis of experimental data from collaborating laboratories, to understand how the dynamics of large neural networks support cognitive operations such as learning and memory. Beyond the brain, I view learning as a much more general phenomenon, and perhaps a key to understanding biology. To this end, I analyze how artificial systems learn, and I also consider processes such as cancer as learning phenomena.
Selected Publications
• Schuessler F, Mastrogiuseppe F, Dubreuil A, Ostojic S, Barak O: The interplay between randomness and structure during learning in RNNs. NeurIPS 2020.
• Sussillo D*, Barak O*: Opening the black box: Low-dimensional dynamics in high-dimensional recurrent neural networks. Neural Comput. 2013.
• Rivkind A, Barak O: Local dynamics in trained recurrent neural networks. PRL 2017.
• Barak O: Recurrent neural networks as versatile tools of neuroscience research. Curr. Opin. Neurobiol. 2017.
Grants and Awards
2016 – 2020 Israel Science Foundation
2017 – Yanai commendation for excellence in teaching
2017 – Sir Bernard Katz Prize
2021 – 2024 Human Frontier Science Program (with Nathan Keim & Mathew Diamond)
2021 – 2025 Israel Science Foundation
Collaborators
David Sussillo, Stanford
Srdjan Ostojic, ENS
Jakob Macke, Tübingen
omrib@technion.ac.il
Omri Barak Lab
Reverse engineering a trained recurrent neural network performing a 3-bit flip-flop task.
(Left) The eight memory states are discovered to be stable fixed points (black), with saddle points (green) between them mediating transitions. (Right) Demonstration that a saddle point mediates the transition between attractors. The input for a transition (circled region in the left panel) was varied until it crossed the separatrix and converged to the second memory state.
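The fixed-point picture in the figure can be sketched numerically. The following is a minimal illustration (not the authors' code) of the approach popularized in Sussillo & Barak, Neural Comput. 2013: treat the network's speed as a cost function, minimize it from many candidate states, and classify the resulting fixed points by the eigenvalues of the Jacobian. The network size, random weights, and tolerance below are placeholder assumptions; in practice the weights come from a trained RNN and the candidate states are taken from trajectories recorded during the task.

```python
# Illustrative sketch: numerical fixed-point analysis of a rate RNN.
# Weights and candidate states are random placeholders, not from the source.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
N = 100                                          # number of units (assumed)
W = rng.normal(0.0, 1.5 / np.sqrt(N), (N, N))    # recurrent weights (placeholder)
b = np.zeros(N)                                  # constant input / bias (placeholder)

def dynamics(x):
    """Continuous-time rate dynamics: dx/dt = -x + W tanh(x) + b."""
    return -x + W @ np.tanh(x) + b

def speed(x):
    """Scalar 'kinetic energy' q(x) = 0.5 * ||dx/dt||^2; its zeros are fixed points."""
    v = dynamics(x)
    return 0.5 * v @ v

def jacobian(x):
    """Jacobian of the dynamics at x, used to classify fixed points."""
    return -np.eye(N) + W * (1.0 - np.tanh(x) ** 2)   # column j scaled by tanh'(x_j)

def find_fixed_point(x0):
    """Minimize q(x) starting from a candidate state (e.g. a state visited during the task)."""
    res = minimize(speed, x0, method="L-BFGS-B")
    return res.x, res.fun

# Search from several candidate states; with a trained 3-bit flip-flop network these
# would be states sampled while the RNN performs the task.
for i in range(5):
    x_star, q = find_fixed_point(rng.normal(0.0, 1.0, N))
    if q < 1e-10:                                # accept only numerically exact fixed points
        eigvals = np.linalg.eigvals(jacobian(x_star))
        n_unstable = int(np.sum(eigvals.real > 0))
        kind = "stable attractor" if n_unstable == 0 else f"saddle ({n_unstable} unstable direction(s))"
        print(f"fixed point {i}: q = {q:.2e}, {kind}")
```

In this picture, memory states correspond to fixed points with no unstable directions, while saddles with a single unstable direction sit between them and channel the input-driven transitions across the separatrix.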