A brain-computer interface developed at UC Davis Health has enabled a man with ALS to communicate with up to 97% accuracy, marking a significant advance for people with severe speech impairments.
UC Davis Health has unveiled a groundbreaking brain-computer interface (BCI) that can translate brain signals into speech with a record accuracy rate of up to 97%. This pioneering technology aims to restore communication for individuals with severe speech impairments, such as those caused by amyotrophic lateral sclerosis (ALS).
“Our BCI technology helped a man with paralysis to communicate with friends, families and caregivers,” David Brandman, an assistant professor in the UC Davis Department of Neurological Surgery and co-director of the UC Davis Neuroprosthetics Lab who co-led the study, said in a news release. “Our paper demonstrates the most accurate speech neuroprosthesis ever reported.”
ALS, commonly known as Lou Gehrig’s disease, degrades the nerve cells that control muscle movement, eventually robbing those affected of the ability to move and speak. The new BCI technology decodes brain signals and converts them into text, which a computer then “speaks” aloud.
In a remarkable case, 45-year-old Casey Harrell, who suffers from ALS, regained his ability to communicate after the implantation of the BCI device. Harrell, who had trouble moving and speaking, was enrolled in the BrainGate clinical trial to test the device.
In July 2023, Brandman implanted the BCI by placing four microelectrode arrays into Harrell’s brain, specifically targeting the region responsible for speech coordination.
“We’re really detecting their attempt to move their muscles and talk,” Sergey Stavisky, an assistant professor in the UC Davis Department of Neurological Surgery and co-director of the UC Davis Neuroprosthetics Lab who co-led the study, said in the news release. “We are recording from the part of the brain that’s trying to send these commands to the muscles. And we are … translating those patterns of brain activity into a phoneme — like a syllable or the unit of speech — and then the words they’re trying to say.”
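In broad strokes, a decoder of this kind estimates, for each slice of neural activity, which phoneme the user is trying to produce, and then assembles those phonemes into words. The toy sketch below illustrates only that general idea; the phoneme set, lexicon, and greedy decoding scheme are invented for this example and are not the study’s actual model.

```python
# Conceptual sketch only: a toy illustration of turning neural feature
# frames into phonemes and then a word. The phoneme set, lexicon, and
# decoding scheme are hypothetical, not the study's method.
import numpy as np

PHONEMES = ["HH", "EH", "L", "OW", "_"]       # "_" = silence/blank (made up)
LEXICON = {("HH", "EH", "L", "OW"): "hello"}  # tiny invented phoneme-to-word lexicon

def decode_frames(frame_probs: np.ndarray) -> str:
    """Greedy decode: pick the most likely phoneme per frame,
    collapse repeats, drop blanks, then look up the word."""
    best = [PHONEMES[i] for i in frame_probs.argmax(axis=1)]
    collapsed = [p for i, p in enumerate(best) if i == 0 or p != best[i - 1]]
    phones = tuple(p for p in collapsed if p != "_")
    return LEXICON.get(phones, "<unknown>")

if __name__ == "__main__":
    # Simulated per-frame phoneme probabilities for the intended word "hello";
    # in a real system these would come from a model applied to recordings
    # from the implanted microelectrode arrays.
    target = ["HH", "HH", "EH", "L", "L", "OW", "_"]
    probs = np.full((len(target), len(PHONEMES)), 0.05)
    for t, ph in enumerate(target):
        probs[t, PHONEMES.index(ph)] = 0.8
    print(decode_frames(probs))  # -> "hello"
```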
Harrell used the system in both prompted and spontaneous conversational settings, and it consistently translated his brain activity into readable text. Remarkably, the system also used pre-ALS audio samples to replicate Harrell’s natural voice. In its initial training session, the system achieved 99.6% word accuracy with a limited vocabulary; after the vocabulary was expanded to 125,000 words, it reached 90.2% accuracy with just over an hour of additional training.
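For context on figures like 99.6% and 90.2%: word accuracy for speech decoders is commonly reported as the complement of the word error rate, the fraction of words that match the intended sentence after accounting for substitutions, insertions, and deletions. The snippet below is a generic illustration of that calculation, not the study’s evaluation code, and the example sentences are made up.

```python
# Illustrative only: one common way a word-accuracy percentage can be
# computed, via word-level edit distance (word error rate). Generic
# sketch with invented sentences, not the study's evaluation code.
def word_error_rate(reference: str, hypothesis: str) -> float:
    ref, hyp = reference.split(), hypothesis.split()
    # Dynamic-programming edit distance over words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[len(ref)][len(hyp)] / max(len(ref), 1)

ref = "i would like a cup of coffee please"
hyp = "i would like a cup of tea please"
wer = word_error_rate(ref, hyp)
print(f"word accuracy: {(1 - wer) * 100:.1f}%")  # one error in eight words -> 87.5%
```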
“The first time we tried the system, he cried with joy as the words he was trying to say correctly appeared on-screen. We all did,” Stavisky added.
After further training, the BCI maintained a 97.5% accuracy rate and has proven more effective than many commercial voice-recognition applications. Harrell has used the BCI system to engage in self-paced conversations for over 248 hours, both in person and via video chat.
“Not being able to communicate is so frustrating and demoralizing. It is like you are trapped,” Harrell said. “Something like this technology will help people back into life and society.”
Nicholas Card, a postdoctoral fellow in the UC Davis Neuroprosthetics Lab and the study’s lead author, praised the transformative impact of the technology.
“It has been immensely rewarding to see Casey regain his ability to speak with his family and friends through this technology,” he said.
The study, published in the New England Journal of Medicine, reports on 84 data-collection sessions conducted over 32 weeks. Funded and supported by several institutions, the research is part of the ongoing BrainGate2 clinical trial, which continues to recruit participants.