Shlizerman's Research Group on
Data-driven Dynamical Systems
   Research




Classification and Recognition from High-Dimensional Network Dynamics

We develop methods at the interface of dynamical systems theory and data analysis to classify the dynamics that networks produce. In neuronal networks, these dynamics are collections of experimentally observed time series recorded from multiple neurons as they respond to stimuli. We also develop optimal strategies for sampling a network so that classification is efficient. Examples of classifications from our research include the olfactory decision space in insects, obtained from supervised recordings, and the functional connectome of the C. elegans worm.
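
As a minimal, hypothetical illustration of such a classification pipeline (not the group's specific methods), the sketch below reduces synthetic multi-neuron recordings with PCA and classifies the stimulus condition with a linear SVM; the synthetic data, array shapes, and the PCA + SVM choice are all assumptions made for the example.

# Minimal sketch: classify stimulus conditions from multi-neuron time series.
# Synthetic data and the PCA + linear-SVM pipeline are illustrative assumptions,
# not the group's published method.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.decomposition import PCA
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_neurons, n_timepoints = 200, 50, 100
labels = rng.integers(0, 2, size=n_trials)          # two stimulus classes

# Synthetic recordings: class 1 shifts the mean activity of a subset of neurons.
recordings = rng.normal(size=(n_trials, n_neurons, n_timepoints))
recordings[labels == 1, :10, :] += 0.5

# Flatten each trial into one feature vector (neurons x time).
X = recordings.reshape(n_trials, -1)

# Reduce dimensionality, then classify; score with cross-validation.
clf = make_pipeline(PCA(n_components=20), LinearSVC(max_iter=5000))
scores = cross_val_score(clf, X, labels, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")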


Predictive Computational Modeling of Neuronal Networks

Neuronal networks are capable of fusing sensory information into neural activities that encode behaviors. Some of these behaviors are unique and robust, e.g., locomotion or directional flight. We therefore study how neural circuits are designed by modeling their sensory networks from building blocks (connectomics and neural dynamics) and investigate the robustness, optimality, and controllability of these systems. Example project: we have developed a neuronal model that integrates the solar azimuthal position with signals that encode time of day and thereby facilitates directional flight in the migrating Monarch butterfly.
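
As a deliberately simplified toy of time-compensated sun-compass steering (arithmetic only, not the published neuronal model), the sketch below converts local time of day into an expected solar azimuth and uses the deviation of the sun's observed body-relative bearing from its expected value as a steering error; the linear azimuth approximation and all parameter values are assumptions made for illustration.

# Toy sketch of a time-compensated sun compass (not the published neuronal model).
# Assumption: solar azimuth moves linearly from east (90 deg) at 6:00 to west
# (270 deg) at 18:00; real ephemeris and neural dynamics are omitted.

def expected_sun_azimuth(hour):
    """Crude linear approximation of solar azimuth (deg) vs. local hour."""
    return 90.0 + (hour - 6.0) * 15.0

def steering_error(sun_relative_bearing, hour, desired_bearing=225.0):
    """Signed heading error (deg) for holding a fixed migratory bearing (SW = 225).
    sun_relative_bearing: where the sun appears relative to the body axis (clockwise deg)."""
    # If the animal were flying the desired bearing, the sun should appear at this
    # body-relative angle for the current time of day.
    target_relative = (expected_sun_azimuth(hour) - desired_bearing) % 360.0
    # Wrap the deviation into (-180, 180]; positive means "turn clockwise".
    return (sun_relative_bearing - target_relative + 180.0) % 360.0 - 180.0

# At noon the sun should sit due south (180 deg), i.e., 45 deg left of a southwest
# heading; seeing it dead ahead means the heading has drifted to due south.
print(steering_error(sun_relative_bearing=0.0, hour=12.0))   # -> 45.0 (turn clockwise)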




Network Architecture from Data

We introduce methods for inferring the wiring of black-box networks, i.e., networks with an unknown connectivity map. Our tools link reduction of time series with reduction of models to produce optimization routines for connectivity calibration. We have recently inferred a prototype of the antennal lobe, the primary olfactory processing unit in insects, from multi-neuron recordings, and we are currently working on recovering the complete connectome of the antennal lobe and implementing it on a microchip. We also collaborate on recovering the connectomes of other neural systems.
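
As a minimal sketch of connectivity calibration posed as an optimization problem (a stand-in for the actual inference tools), the code below fits a connectivity matrix W to simulated activity by least squares under the assumed surrogate dynamics x(t+1) ≈ W x(t); the synthetic network, noise level, and linear model form are illustrative assumptions.

# Minimal sketch: recover a connectivity matrix from time series by least squares,
# assuming the surrogate linear dynamics x(t+1) ~= W x(t). The synthetic network
# and model form are illustrative assumptions, not the group's inference method.
import numpy as np

rng = np.random.default_rng(1)
n_neurons, n_steps = 20, 2000

# Ground-truth sparse connectivity, scaled for stable linear dynamics.
W_true = rng.normal(scale=0.3, size=(n_neurons, n_neurons))
W_true *= rng.random((n_neurons, n_neurons)) < 0.2
W_true *= 0.9 / abs(np.linalg.eigvals(W_true)).max()

# Simulate noisy activity.
x = np.zeros((n_steps, n_neurons))
x[0] = rng.normal(size=n_neurons)
for t in range(n_steps - 1):
    x[t + 1] = W_true @ x[t] + 0.05 * rng.normal(size=n_neurons)

# Least-squares calibration: solve X_next ~= X_prev @ W^T for W.
X_prev, X_next = x[:-1], x[1:]
W_hat = np.linalg.lstsq(X_prev, X_next, rcond=None)[0].T

print("relative error:", np.linalg.norm(W_hat - W_true) / np.linalg.norm(W_true))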




Neuromorphic Computing

Neuronal networks are capable of processing specific data and tasks optimally and in 'real time'. Many of these problems are computationally expensive to solve with current computing systems. We therefore develop algorithms and architectures inspired by the design principles of neurobiological networks to solve these problems more efficiently. Example projects: parallel implementation of convolutional neural networks using light, and continuous training of recurrent networks.
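
As a minimal sketch of online ("continuous") training of a recurrent network (one common approach, not necessarily the group's implementation), the code below keeps a fixed random recurrent reservoir and updates its linear readout sample by sample with recursive least squares; the task, network sizes, and RLS readout are assumptions made for illustration.

# Minimal sketch of online ("continuous") training of a recurrent network:
# a fixed random reservoir whose linear readout is updated sample-by-sample
# with recursive least squares. Sizes, the sine-wave task, and the RLS readout
# are illustrative assumptions, not the group's implementation.
import numpy as np

rng = np.random.default_rng(2)
n_res, n_steps = 200, 3000

# Fixed random recurrent weights, scaled to a modest spectral radius.
W = rng.normal(size=(n_res, n_res)) / np.sqrt(n_res)
W *= 0.9 / abs(np.linalg.eigvals(W)).max()
w_in = rng.normal(scale=0.5, size=n_res)
w_out = np.zeros(n_res)

# RLS state: running estimate of the inverse correlation matrix of the reservoir.
P = np.eye(n_res) / 0.1
r = np.zeros(n_res)

for t in range(n_steps):
    target = np.sin(0.05 * t)                 # signal the readout should produce
    u = np.sin(0.05 * (t - 1))                # previous value fed back as input
    r = np.tanh(W @ r + w_in * u)             # reservoir update
    y = w_out @ r                             # current prediction
    # Online RLS update of the readout weights toward the target.
    Pr = P @ r
    k = Pr / (1.0 + r @ Pr)
    P -= np.outer(k, Pr)
    w_out += (target - y) * k

print("final error:", abs(target - w_out @ r))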