
Johns Hopkins Research Suggests New Promise for Patients with ALS


A BCI decodes neural signals acquired from electrodes placed on the surface of brain areas responsible for speech and upper limb function. The participant then navigates options on a communication board using a set of six navigational commands to control devices like room lights. (Courtesy of Crone Lab)

Researching compelling new directions for implantable brain-computer interfaces, a team at Johns Hopkins is working with a study participant — a man gradually losing speech due to amyotrophic lateral sclerosis (ALS) — whose efforts may provide a wealth of knowledge to the field.

“I’m really in awe of our participant — that he’s so committed to doing this work,” says Johns Hopkins neurologist Nathan Earl Crone, who helps guide the patient through studies on brain-computer interfaces (BCIs), which allow people with impairments to communicate directly with external devices and computers using their brain activity. “He knows it will help future patients. So that’s very exciting.”


Crone, director of the Johns Hopkins University Cognitive Neurophysiology Lab and professor of neurology at the Johns Hopkins University School of Medicine, is conducting the research as part of the CortiCom clinical trial (NCT03567213), funded by the National Institutes of Health under its BRAIN Initiative. Crone’s work focuses on electrocorticographic, or ECoG, implants — electrodes surgically placed over the brain surface to record neural activity.

While BCI technology has evolved significantly in recent years — providing meaningful quality-of-life improvements for some people with severe disabilities or neurological conditions that limit speech and movement — many outstanding questions remain. One is whether decoding neural activity during speech commands can be used to help individuals directly control smart devices to perform activities, such as turning on lights in the home. Another is whether researchers can drastically cut time spent retraining and recalibrating the speech decoders, usually a necessary process for maintaining BCI stability.

In a new study published in Advanced Science, Crone and colleagues have landed on a tentative “yes” to both questions. 

Over three months, the study participant operated computer applications with six intuitive speech commands via an ECoG implant placed over the sensorimotor cortex areas responsible for speech. Researchers found that he was able to use the system at his own pace with a high degree of accuracy, and without the need to recalibrate or retrain the model.

“Overall,” they write, “our work demonstrates the potential of safely using implanted BCIs for intuitive control of external devices for a prolonged time.” 

This study is part of the ongoing CortiCom trial, which is now recruiting two more participants at Johns Hopkins and two at partner institution Utrecht University in the Netherlands.

With the current Johns Hopkins participant, Crone and his lab have also conducted research translating the man’s brain activity into synthesized speech that sounds like his real voice. In another study, the man spelled words via “brain clicks” generated by attempted grasping movements to select letters on an electronic switch scanner. 

“It’s baby steps,” Crone says. “These are the first.”

Johns Hopkins is working at the forefront of the implantable BCI field, one of a handful of institutions around the world conducting this research. Crone notes that the foundational science comes from past functional mapping research conducted at Johns Hopkins with patients undergoing epilepsy surgery.

In today’s world of implantable BCIs, which Crone and colleague Nick Ramsey at Utrecht University review in a recent Nature article, it’s not yet clear whether ECoG or a complementary technology, microelectrode arrays (MEAs), will best serve the needs of patients.

While MEAs — inserted directly into brain tissue — capture more precise information, the signals tend to be more unstable. In the future, Crone will act as principal investigator in another NIH-funded trial on a new type of MEA device, collaborating with the Johns Hopkins University Applied Physics Laboratory.

“Ultimately, we hope that our research will make BCIs more mainstream in clinical practice,” Crone says, “which will help many more patients.”

For more information, visit hopkinsmedicine.org/neurology-neurosurgery/clinical-trials/brain-computer-interface.
