Deep-Decode

Project Description

Closed-loop devices that read out the activity of a given brain area to steer a prosthetic limb, and that feed sensor information back via stimulation, require an understanding of the coding mechanisms in the brain areas involved. Several technical limitations currently hamper the implementation of such devices. The first concerns the recording of neural signals, which is either relatively unspecific (e.g. EEG) or captures the signals of only very few neurons (electrophysiology).

The same problem applies to the techniques available for electrical or optogenetic stimulation, which constitutes the second limitation. To overcome the challenge of recording signals from only one or two brain areas at a time, we will develop and customize a multisite recording and stimulation device for rats. The third limitation, which this project addresses, is our limited understanding of how the neural code in the brain (e.g. in motor cortex and other areas involved in motor control) relates to the corresponding action at the limb (decoding), and of how sensory inputs give rise to neural activity (encoding), e.g. in somatosensory cortex and other brain areas relevant for tactile perception.

In this project we will collect well-controlled data, i.e., electrophysiological recordings synchronized with detailed position information of body parts, acquired with a newly developed behavioral setup. Furthermore, we will develop methods to predict body motion from neural recordings (decoding) and vice versa (encoding) using the latest deep learning techniques. These will give us tools for analyzing the codes the brain uses to trigger certain actions, for indicating from which brain areas it is best to record and where it is best to stimulate, and, potentially, for controlling a neuroprosthetic device.
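To make the decoding idea concrete, the sketch below fits a deliberately simple linear (ridge-regression) decoder that maps binned firing rates to a 2-D limb position on synthetic data. All data and variable names here are hypothetical illustrations, not the project's actual pipeline; the project itself would replace the linear map with deep learning models trained on real recordings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in data: 500 time bins, 30 neurons' binned firing
# rates, and a 2-D limb position that depends linearly on the rates
# plus observation noise.
n_bins, n_neurons = 500, 30
rates = rng.poisson(lam=5.0, size=(n_bins, n_neurons)).astype(float)
true_w = rng.normal(size=(n_neurons, 2))
position = rates @ true_w + rng.normal(scale=0.5, size=(n_bins, 2))

# Ridge-regression decoder: closed-form solution of
#   min_W ||rates @ W - position||^2 + alpha * ||W||^2
alpha = 1.0
gram = rates.T @ rates + alpha * np.eye(n_neurons)
w_hat = np.linalg.solve(gram, rates.T @ position)

# Decode limb position from the firing rates and report the fit.
pred = rates @ w_hat
ss_res = ((position - pred) ** 2).sum()
ss_tot = ((position - position.mean(axis=0)) ** 2).sum()
r2 = 1.0 - ss_res / ss_tot
print(f"decoding R^2 on training data: {r2:.3f}")
```

The same linear machinery run in the opposite direction (position to rates) gives a minimal encoding model; in practice, nonlinear deep networks are used precisely because such linear maps capture only part of the neural code.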

Collaboration
