Brain Signals and the Possible Future of Neural Computers
Brain-computer interfaces (BCIs) are emerging tools that could one day help people with brain or spinal cord injuries move or communicate. BCI systems rely on implantable sensors that record electrical signals in the brain and use these signals to control external devices such as computers or robotic prostheses.
Most current BCI systems use one or two sensors to monitor signals from a few hundred neurons, but neuroscientists are interested in systems that can collect data from much larger groups of brain cells.
Now, a team of researchers has taken a key step toward a new concept for a future BCI system – one that employs a coordinated network of independent, wireless, microscale neural sensors, each about the size of a grain of salt, to record and stimulate brain activity. The sensors, dubbed “neurograins,” independently record the electrical pulses made by firing neurons and send the signals wirelessly to a central hub, which coordinates and processes them.
A network of tiny neural sensors
In a study published Aug. 12 in Nature Electronics, the team demonstrated the use of nearly 50 such autonomous neurograins to record neural activity in a rodent.
The results, the researchers say, are a step toward a system that could one day record brain signals in unprecedented detail, leading to new insights into how the brain works and new therapies for people with brain or spinal cord injuries.
“One of the big challenges in the field of brain-computer interfaces is engineering ways of probing as many points in the brain as possible,” says Arto Nurmikko, a professor in Brown’s School of Engineering and the study’s senior author. “Up to now, most BCIs have been monolithic devices – a bit like little beds of needles. Our team’s idea was to break up that monolith into tiny sensors that could be distributed across the cerebral cortex. That’s what we’ve been able to demonstrate here.”
The team, which includes experts from Brown, Baylor University, the University of California, San Diego and Qualcomm, began developing the system about four years ago. “It was a challenge on two fronts,” said Nurmikko, a member of Brown’s Carney Institute for Brain Science. The first was shrinking the complex electronics needed to detect, amplify, and transmit neural signals onto the tiny neurograin silicon chips. The team first designed and simulated the electronics on a computer and went through several fabrication iterations to develop operational chips.
The second challenge was developing the body-external communications hub that receives signals from those tiny chips. The device is a thin patch, about the size of a thumbprint, that attaches to the scalp outside the skull. It works like a miniature cellular phone tower, employing a network protocol to coordinate the signals from the neurograins, each of which has its own network address. The patch also supplies power wirelessly to the neurograins, which are designed to operate using a minimal amount of electricity.
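One way to picture the hub's coordination role is a toy time-slotted schedule in which each addressed sensor reports in its own slot and the hub merges one frame at a time. Everything below – the function names, addresses, slot counts, and readings – is a hypothetical illustration, not a detail from the study:

```python
# Toy model of a hub coordinating many wireless sensors, each with its
# own network address, using fixed per-sensor time slots.

def make_schedule(addresses, slots_per_frame):
    """Assign each sensor address a repeating time slot."""
    return {addr: i % slots_per_frame for i, addr in enumerate(addresses)}

def collect_frame(schedule, readings):
    """Order one frame of readings by slot so the hub can merge them."""
    ordered = sorted(readings.items(), key=lambda kv: schedule[kv[0]])
    return [(addr, value) for addr, value in ordered]

addresses = [f"ng-{i:02d}" for i in range(4)]   # 4 hypothetical sensors
schedule = make_schedule(addresses, slots_per_frame=4)
readings = {"ng-02": 0.7, "ng-00": 0.1, "ng-03": 0.4, "ng-01": 0.9}
print(collect_frame(schedule, readings))
```

In a real system the schedule, addressing, and synchronization live in the radio protocol rather than in software like this; the sketch only shows why per-sensor addresses make a shared wireless channel workable.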
How can this improve brain functionality?
People with serious injuries or nervous system disorders sometimes lose the ability to perform routine tasks such as walking, typing, or driving. They may be able to imagine performing an action, but the injury prevents it from being carried out.
Brain-computer interface systems exist that can translate brain signals into a desired action to restore some function, but using them can be burdensome: they do not always work smoothly, and they need to be recalibrated to complete even simple tasks.
Researchers at the University of Pittsburgh and Carnegie Mellon University are using brain-computer interface technology to probe how the brain works as it learns tasks. In a series of papers, the second of which was published today in Nature Biomedical Engineering, the team describes advances in brain-computer interface technology designed to improve the lives of paralyzed patients who use neural prostheses.
“Let’s say you’re planning your evening trip to the grocery store during your work day,” says Aaron Batista, an assistant professor of bioengineering at Pitt’s Swanson School of Engineering. “That plan is held somewhere in your brain throughout the day, but it probably doesn’t reach your motor cortex until you actually get to the store. We’re developing brain-computer interface technologies that we hope will one day be attuned to these day-to-day goals.”
Batista, Pitt postdoctoral fellow Emily Oby, and Carnegie Mellon researchers have been collaborating to develop direct pathways from the brain to external devices. They use electrodes thinner than a hair that record neural activity and make it available to control algorithms.
In the team’s first study, published last June in the Proceedings of the National Academy of Sciences, the team examined how the brain changes as subjects learn new brain-computer interface skills.
“When the subjects intend to move, they generate patterns of activity across those electrodes, and we render these as movements on a computer screen. The subjects then adjust their neural activity patterns so as to drive the movements they want,” says project co-leader Steven Chase, a professor of biomedical engineering at the Carnegie Mellon Neuroscience Institute.
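A minimal way to picture the decoding step the quote describes is a linear map from per-electrode firing rates to a 2-D cursor velocity. The weights and rates below are invented for illustration; real decoders are fit to recorded neural data, for example by regression or a Kalman filter:

```python
# Sketch of a linear BCI decoder: a vector of per-electrode firing
# rates is mapped to a 2-D cursor velocity by a weight matrix.
import numpy as np

rng = np.random.default_rng(0)
n_electrodes = 8

W = rng.normal(size=(2, n_electrodes)) * 0.1  # decoder weights (vx, vy)
b = np.zeros(2)                               # baseline offset

def decode_velocity(firing_rates):
    """Map firing rates (spikes/s per electrode) to cursor velocity."""
    return W @ firing_rates + b

rates = rng.poisson(lam=10, size=n_electrodes).astype(float)
vx, vy = decode_velocity(rates)
print(f"cursor velocity: ({vx:.2f}, {vy:.2f})")
```

Changing the weight matrix `W` changes how neural activity maps to movement, which is exactly the kind of perturbation the team used to study learning.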
In the new study, the team developed technology that continuously recalibrates the brain-computer interface in the background, ensuring the system is always tuned and ready to use.
Can this manipulate the brain?
“We change how neural activity affects the movement of the cursor, and that drives learning,” said Oby, the study’s lead author. “If we change that relationship in a particular way, our animal subjects have to produce new patterns of neural activity in order to regain control of the cursor. Doing that takes them weeks, and we can watch how the brain changes as they learn.”
In a sense, the algorithm “learns” to adapt to the noise and inherent instability of neural recording interfaces. The results suggest that the process of mastering a new skill involves generating new patterns of neural activity. The team hopes the technology can eventually be used in a clinical setting for stroke rehabilitation.
Such self-recalibration techniques have been a long-standing goal in the neuroprosthetics field, and the method presented in the team’s study can recover automatically from instabilities without requiring the user to pause and recalibrate the system themselves.
“Suppose the instability was so large that the subject could no longer control the brain-computer interface,” Yu said. “Existing methods would likely struggle to recalibrate in that scenario, whereas our method has been shown in many cases to recover from even the most dramatic instabilities.”
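As a hedged sketch of what recalibrating “in the background” could look like, the toy decoder below tracks a running per-channel baseline and decodes the residual, so a sudden offset on a channel is gradually absorbed without any explicit recalibration session. This is a stand-in illustration under simplified assumptions, not the stabilization algorithm from the paper:

```python
# Toy self-recalibrating decoder: a slow running baseline per channel
# absorbs drift, and decoding acts on the residual above baseline.
import numpy as np

class SelfRecalibratingDecoder:
    def __init__(self, weights, alpha=0.05):
        self.W = weights                          # fixed decoder weights
        self.baseline = np.zeros(weights.shape[1])
        self.alpha = alpha                        # baseline tracking speed

    def decode(self, rates):
        # Update the baseline in the background, then decode the residual.
        self.baseline = (1 - self.alpha) * self.baseline + self.alpha * rates
        return self.W @ (rates - self.baseline)

rng = np.random.default_rng(1)
W = rng.normal(size=(2, 4)) * 0.1
dec = SelfRecalibratingDecoder(W)

steady = np.full(4, 10.0)          # a constant offset, e.g. after a shift
for _ in range(200):
    v = dec.decode(steady)
print(np.linalg.norm(v))           # output shrinks as the offset is absorbed
```

The design point this illustrates is that adaptation runs continuously alongside decoding, so the user never has to stop using the interface while it re-tunes.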
Both research projects were conducted through the Center for the Neural Basis of Cognition, a cross-institutional research and training program that combines Pitt’s strengths in basic and clinical neuroscience and bioengineering with Carnegie Mellon’s strengths in cognitive and computational neuroscience.
Carnegie Mellon collaborators on the projects include Byron Yu, a professor of electrical and computer engineering and biomedical engineering, and researchers Alan Degenhart and William Bishop.
“It was a real multidisciplinary challenge,” said Jihun Lee, a Brown postdoctoral researcher and the study’s lead author. “We had to bring together expertise in electromagnetics, radio-frequency communication, circuit design, fabrication and neuroscience to design and operate the neurograin system.”
The goal of this new study was to demonstrate that the system can record neural signals from a living brain – in this case, the brain of a rodent. The team placed 48 neurograins on the animal’s cerebral cortex, the brain’s outer layer, and successfully recorded the neural signals characteristic of spontaneous brain activity.
The team also tested the devices’ ability to stimulate the brain as well as record from it. Stimulation is done with tiny electrical pulses that can activate neural activity. The researchers hope that stimulation, driven by the same hub that coordinates neural recording, could one day restore brain function lost to illness or injury.
How was the study carried out?
The size of the animal’s brain limited the team to 48 neurograins for this study, but the data show that the system’s current configuration could support up to 770. The researchers ultimately hope to scale up to many more, which would capture brain activity at currently unattainable levels of detail.
“It was challenging because the system demands simultaneous wireless power transfer and networking at the megabit-per-second rate, and this has to be accomplished under very tight silicon area and power constraints,” said Vincent Leung, an associate professor in the Department of Electrical and Computer Engineering at Baylor. “Our team pushed the envelope for distributed neural implants.”
Much work remains to realize the complete system, but the researchers say this study is a key step in that direction.
“Our hope is that we can ultimately develop a system that provides new scientific insights into the brain and new therapies that can help people who have suffered devastating injuries,” says Nurmikko.
Adapted from ScienceDaily.