Computer Science Graduate Seminar
Thursday, March 11, 2021, 10:00am
Structural Plasticity as a Connectivity Generation and Optimization Algorithm in Neural Networks
- Sandra Diaz Pier, M.Sc. Eng.
- Zoom: https://rwth.zoom.us/j/92520089251?pwd=c1RwbUF4dzF4WWdDVWlBMDlHOFlZdz09
Meeting ID: 925 2008 9251
In many fields of science, models are based on sets of differential equations which need to be fitted to experimental data. To do this, parameter spaces are searched for specific values that make these models useful for answering relevant scientific questions. In computational neuroscience, models of spiking neural networks play an important role in understanding how the brain encodes information and achieves high-level cognitive functions. These models are of interest not only to neuroscience but also to many related fields, including artificial intelligence, robotics and control. However, they are highly underconstrained and degenerate, and they exhibit chaotic dynamics, which makes it challenging to find suitable and robust solutions.
In this presentation I propose structural plasticity as a neurobiologically inspired optimization algorithm able to generate, modify and tune connectivity parameters for neural network models. Structural plasticity refers to the ability of neurons to change their structure by creating and deleting connections to other neurons in a network in order to maintain specific metabolic levels. First, I will introduce the characteristics of structural plasticity as an optimization algorithm, together with details of its implementation in NEST, a neural network simulator well known within the computational neuroscience community. This implementation can efficiently leverage computational resources and is applicable to large-scale neural networks. I will also briefly present a tool which I have co-developed to visualize, analyze and interact with simulations that use structural plasticity, focusing on how it can be used to better understand the relationships between structure and function that emerge in neural networks.
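The homeostatic idea behind structural plasticity can be illustrated with a minimal sketch. The function below is a simplification of the growth curves used in structural plasticity models: a neuron grows synaptic elements while its activity is below a homeostatic set point and retracts them when activity is above it. The parameter names, the linear form of the rule, and the numeric values are illustrative assumptions, not the specific rules discussed in the talk or the exact NEST interface.

```python
def update_elements(z, activity, target=5.0, nu=0.1):
    """Toy linear homeostatic growth rule (an illustrative simplification).

    z        -- current number of synaptic elements (may be fractional)
    activity -- the neuron's measured activity, e.g. a firing rate
    target   -- hypothetical homeostatic set point for the activity
    nu       -- hypothetical growth speed

    Elements grow when activity < target and retract when
    activity > target; the count never drops below zero.
    """
    return max(0.0, z + nu * (target - activity))


# A neuron firing below its target grows elements ...
z_grown = update_elements(1.0, activity=2.0)    # > 1.0
# ... while one firing above its target retracts them.
z_shrunk = update_elements(1.0, activity=8.0)   # < 1.0
```

In full structural-plasticity algorithms, free axonal and dendritic elements produced by such rules are then paired across neurons to create new connections, and connections are deleted when their elements retract; this pairing step is what actually rewires the network.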
The rules under which structural plasticity operates in the brain have been tuned over the course of natural evolution. In the second part of my talk I present how meta-optimization can be used to artificially explore the general rules that allow structural plasticity to work with a variety of network configurations and to reach different functional regimes in different portions of the network. Finally, I will present applications for structural plasticity within and outside neuroscience, as well as future research directions.
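Meta-optimization of a plasticity rule can be sketched as an outer search loop over the rule's parameters, scoring each candidate by the behavior of the network it produces. The sketch below uses plain random search over the two parameters of a toy linear growth rule, with a toy feedback from element count to activity; the fitness function, the feedback model, and all numeric ranges are illustrative assumptions, not the meta-optimization methods presented in the talk.

```python
import random


def fitness(params):
    """Hypothetical fitness: run a toy growth rule to (near) steady
    state and score how close the resulting activity is to a desired
    operating point."""
    target, nu = params
    z, activity = 0.0, 3.0
    for _ in range(200):
        # Toy linear homeostatic rule, as an illustrative stand-in.
        z = max(0.0, z + nu * (target - activity))
        # Toy feedback: more synaptic elements -> more activity.
        activity = 3.0 + 0.5 * z
    desired = 6.0
    return -abs(activity - desired)


def meta_optimize(rng, generations=50):
    """Random-search meta-optimization over the rule parameters
    (target set point and growth speed)."""
    best, best_fit = None, float("-inf")
    for _ in range(generations):
        candidate = (rng.uniform(1.0, 10.0), rng.uniform(0.01, 1.0))
        f = fitness(candidate)
        if f > best_fit:
            best, best_fit = candidate, f
    return best, best_fit


best_params, best_score = meta_optimize(random.Random(0))
```

In practice the outer loop would be an evolutionary or Bayesian optimizer and the inner evaluation a full network simulation, but the structure is the same: simulate, score, and keep the rule parameters that best achieve the desired regime.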
The computer science lecturers invite all interested persons to join.