As a critical care neurologist, Dr. Leigh Hochberg often sees patients who have acute, ongoing and even life-threatening illnesses or injuries to their nervous system.
Hochberg works to treat those conditions, but some patients nonetheless lose the ability to move or speak. That is the case for many people with ALS, an incurable neurological disease that damages nerve cells in the brain and spinal cord.
“I would like nothing more than to be able to assure anybody with ALS [amyotrophic lateral sclerosis] that they will never lose the ability to communicate,” said Hochberg, who is also director of clinical trials for the BrainGate research collaborative, a professor at Brown University’s School of Engineering and an affiliate of the school’s Carney Institute for Brain Science.
Researchers got one step closer to accomplishing that goal earlier this year with a study that marked a breakthrough in helping people who have lost the ability to speak to communicate.
Through the study, which was conducted at Stanford University and published Aug. 23, Hochberg and fellow researchers affiliated with Brown, BrainGate and other institutions across the U.S. showed that a participant’s neural activity can be used to decode attempted speech faster, and with a far larger vocabulary, than ever before.
To accomplish this, researchers placed small arrays, each about the size of a baby aspirin and containing about 100 electrodes, in the motor cortex of a participant’s brain, said David Rosler, director of operations for the BrainGate consortium.
Even if a person is unable to move, neurons in the motor cortex still modulate their activity when the person attempts to speak, Rosler said. Researchers record that activity and use it to train an algorithm that decodes what the participant is trying to say and displays it on a computer screen.
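The loop Rosler describes, recording firing rates while a participant attempts known words and then matching new activity against what was learned, can be sketched in miniature. This is a toy illustration, not BrainGate’s actual decoder (the study used far more sophisticated machine-learning models over phoneme sequences); the “firing rate” vectors here are invented, and a simple nearest-centroid match stands in for the real algorithm:

```python
import random

# Toy stand-in for the decoding step: each attempted word produces a
# vector of per-electrode firing rates. We "calibrate" by averaging
# vectors recorded while the participant attempts known words, then
# decode new activity by finding the closest average (nearest centroid).

random.seed(0)

WORDS = ["hello", "water", "yes"]
N_ELECTRODES = 8

# Hypothetical ground-truth firing patterns (unknown to the decoder).
TRUE_PATTERN = {w: [random.uniform(5, 50) for _ in range(N_ELECTRODES)]
                for w in WORDS}

def record(word):
    """Simulate one noisy recording of an attempted word."""
    return [rate + random.gauss(0, 2) for rate in TRUE_PATTERN[word]]

def calibrate(n_trials=20):
    """Average repeated recordings of known words into centroids."""
    centroids = {}
    for w in WORDS:
        trials = [record(w) for _ in range(n_trials)]
        centroids[w] = [sum(col) / n_trials for col in zip(*trials)]
    return centroids

def decode(activity, centroids):
    """Return the word whose centroid is nearest to the activity."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda w: sq_dist(activity, centroids[w]))

centroids = calibrate()
print(decode(record("water"), centroids))  # decodes back to "water"
```

The calibration step mirrors what the article describes next: the decoder cannot know in advance what any given electrode’s signal means, so it must first observe activity paired with known, intended words.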
Dr. Daniel Rubin, a member of the BrainGate research team who teaches neurology at Harvard Medical School, added that the device is calibrated to each participant. The human brain contains roughly 86 billion neurons, and BrainGate’s device records just hundreds of them at a time. And although researchers know they are recording neurons related to the muscles used for speech, they won’t know what each individual neuron is responsible for until they start.
To ensure participants’ intended speech is understood correctly, they are asked to read specific phrases early in the process, Rubin said. Then, when the participant attempts to speak, they confirm in another way whether the device correctly decoded what they were trying to say, usually with eye-tracking technology commonly used by people with ALS to select words on a screen.
While the study’s results aren’t perfect, they were certainly promising.
Researchers reported that a participant who was no longer able to speak could generate 62 words per minute on a computer screen just by trying to speak. That is more than three times faster than the previous record for assisted communication with implanted brain-computer interfaces and closer to the roughly 160 words per minute of natural conversational speech among English speakers, according to the study.
When assessing the system’s accuracy, researchers found the error rate was 9.1% when the sentences and word model were limited to a 50-word vocabulary. But the error rate rose to 23.8% when the vocabulary expanded to 125,000 words, large enough to say almost anything.
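Error rates like these are conventionally reported as word error rate: the minimum number of word substitutions, insertions and deletions needed to turn the decoded sentence into the reference sentence, divided by the reference’s length. The study’s formula isn’t reproduced here, but the metric itself is standard; a minimal sketch, with invented example sentences:

```python
def word_error_rate(reference, hypothesis):
    """WER: word-level edit distance divided by reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # Classic dynamic-programming (Levenshtein) distance over words.
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        curr = [i]
        for j, h in enumerate(hyp, 1):
            cost = 0 if r == h else 1
            curr.append(min(prev[j] + 1,          # deletion
                            curr[j - 1] + 1,      # insertion
                            prev[j - 1] + cost))  # substitution
        prev = curr
    return prev[-1] / len(ref)

# One wrong word out of ten gives a 10% word error rate.
print(word_error_rate("please bring me a glass of water from the kitchen",
                      "please bring me a glass of water from the garden"))
```

By this measure, a 23.8% error rate at a 125,000-word vocabulary means roughly one word in four needs correction, which is why the researchers describe the results as promising rather than finished.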
“It’s quite incredible,” said Rosler, who is also a researcher with Brown and director of technology development and innovation at the Providence Veterans Affairs Medical Center’s Center for Neurorestoration and Neurotechnology. “It’s very meaningful to help provide people with the ability to communicate and engage with their loved ones more easily.”
The study marks a new milestone in BrainGate’s decades of research into brain-computer interfaces. Rubin said previous work focused on studying a participant’s neural activity to help them regain movement in their hands, wrists and arms with the help of robotic devices. But more recently the focus has shifted to understanding facial movements and decoding speech.
This work involved having participants think about the movements needed to move a computer mouse and click letters on a screen, or even decoding their intended handwriting onto a screen, Rosler said.
The next step toward making this kind of technology widely available is securing approval from the Food and Drug Administration and partnering with a company that can bring the product to market, Rubin said. He wouldn’t be surprised to see it available in the next five years.
“Every year I look back at the past and I think, ‘Wow, I can’t believe how slow we were in getting certain things done,’ ” said Rubin, who is also a neurologist at Massachusetts General Hospital. “It’s just so much easier and faster now.”
As researchers continue to investigate, they say they are grateful to those who participate in their studies.
“The people who join us for these clinical trials are truly amazing,” said Hochberg, who is also director of the VA Center for Neurorestoration and Neurotechnology, a close collaborator in BrainGate’s research. “They join us not because they’re hoping to gain any personal benefit but because they want to help us to develop and test technologies that will help other people with ALS or other forms of paralysis.”