New technology could help decode what paralyzed people want to say
09-27-2018

Scientists are developing new technology to help people with paralyzed muscles communicate intuitively. A study from Northwestern University has uncovered new details about how the brain encodes speech, bringing the technology one step closer to reality.

A brain-machine interface (BMI) is a device that decodes the neural signals the brain sends to the tongue, lips, and larynx. The BMI will ultimately be used to translate an individual’s words as they attempt to speak.

The scientists discovered that the brain controls speech production in much the same way that it controls arm and hand movements.

The researchers recorded signals from two parts of the brain and decoded what these signals represented. They found that the brain represents what we are trying to say as well as the individual movements that we use to talk, such as lip and tongue movements. These aspects of speech are represented in two different parts of the brain.

Study lead author Dr. Marc Slutzky is an associate professor of neurology and physiology at the Northwestern University Feinberg School of Medicine.

“This can help us build better speech decoders for BMIs, which will move us closer to our goal of helping people that are locked-in speak again,” said Dr. Slutzky.

The discovery could also potentially help people with other speech disorders, such as apraxia of speech, which is often seen in children and stroke patients. Individuals with this condition have difficulty translating messages from the brain into spoken language.

“We studied two parts of the brain that help to produce speech,” said Dr. Slutzky. “The precentral cortex represented gestures to a greater extent than phonemes. The inferior frontal cortex, which is a higher level speech area, represented both phonemes and gestures.”

The next step for the researchers is to develop an algorithm for brain-machine interfaces that not only translates gestures but also combines them to form words.

The study is published in the Journal of Neuroscience.

By Chrissy Sexton, Earth.com Staff Writer
