Researchers from the University of California San Diego have found a way to distinguish among hand gestures that people are making by examining only data from noninvasive brain imaging, without information from the hands themselves. The results are an early step toward developing a noninvasive brain-computer interface that may one day allow patients with paralysis, amputated limbs or other physical challenges to use their minds to control a device that assists with everyday tasks.
The research, recently published online ahead of print in the journal Cerebral Cortex, represents some of the best results to date in distinguishing single-hand gestures using a completely noninvasive technique, in this case, magnetoencephalography (MEG).
“Our goal was to bypass invasive components,” said the paper’s senior author Mingxiong Huang, PhD, co-director of the MEG Center at the Qualcomm Institute at UC San Diego. Huang is also affiliated with the Department of Electrical and Computer Engineering at the UC San Diego Jacobs School of Engineering and the Department of Radiology at the UC San Diego School of Medicine, as well as the Veterans Affairs (VA) San Diego Healthcare System. “MEG provides a safe and accurate option for developing a brain-computer interface that could ultimately help patients.”
The researchers underscored the advantages of MEG, which uses a helmet with an embedded 306-sensor array to detect the magnetic fields produced by electric currents moving between neurons in the brain. Alternative brain-computer interface techniques include electrocorticography (ECoG), which requires surgical implantation of electrodes on the brain surface, and scalp electroencephalography (EEG), which locates brain activity less precisely.
“With MEG, I can see the brain thinking without taking off the skull and putting electrodes on the brain itself. I just need to put the MEG helmet on their head. There are no electrodes that could break while implanted inside the head; no expensive, delicate brain surgery; no possible brain infections.”
Roland Lee, MD, study co-author, director of the MEG Center at the UC San Diego Qualcomm Institute, emeritus professor of radiology at the UC San Diego School of Medicine, and physician with the VA San Diego Healthcare System
Lee likens the safety of MEG to taking a patient’s temperature. “MEG measures the magnetic energy your brain is putting out, like a thermometer measures the heat your body puts out. That makes it completely noninvasive and safe.”
Rock paper scissors
The present study evaluated the ability to use MEG to distinguish between hand gestures made by 12 volunteer subjects. The volunteers were equipped with the MEG helmet and randomly instructed to make one of the gestures used in the game Rock Paper Scissors (as in previous studies of this type). MEG functional information was superimposed on MRI images, which provided structural information about the brain.
To interpret the data generated, Yifeng (“Troy”) Bu, an electrical and computer engineering PhD student in the UC San Diego Jacobs School of Engineering and first author of the paper, wrote a high-performing deep learning model called MEG-RPSnet.
“The special feature of this network is that it combines spatial and temporal features simultaneously,” said Bu. “That is the main reason it works better than previous models.”
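The idea of combining spatial information (which of the 306 sensors are active) with temporal information (how each sensor's signal evolves over a trial) can be illustrated with a toy feature extractor. The sketch below is a simplified stand-in, not the authors' MEG-RPSnet; the dimensions, filter counts, and random "learned" weights are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: 306 MEG sensors, 200 time samples per trial,
# three gesture classes (rock / paper / scissors).
N_SENSORS, N_TIMES = 306, 200

def extract_features(epoch, n_spatial=8, kernel=25):
    """Toy spatiotemporal feature extractor (illustrative only).

    1. Spatial stage: project the 306-channel signal onto a few
       spatial filters (here random, standing in for learned weights).
    2. Temporal stage: smooth each virtual channel with a moving-average
       kernel, then summarize it by its mean signal power.
    """
    spatial_filters = rng.standard_normal((n_spatial, epoch.shape[0]))
    virtual = spatial_filters @ epoch                 # (n_spatial, n_times)
    smoother = np.ones(kernel) / kernel
    smoothed = np.apply_along_axis(
        lambda v: np.convolve(v, smoother, mode="valid"), 1, virtual)
    return (smoothed ** 2).mean(axis=1)               # per-filter power

epoch = rng.standard_normal((N_SENSORS, N_TIMES))     # one simulated trial
features = extract_features(epoch)
print(features.shape)  # (8,)
```

In a real decoder, the spatial filters and temporal kernels would be learned jointly from labeled trials, and the resulting feature vector would feed a classifier over the three gesture classes.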
When the results of the study were in, the researchers found that their techniques could be used to distinguish among hand gestures with more than 85% accuracy. These results were comparable to those of previous studies with a much smaller sample size using the invasive ECoG brain-computer interface.
The team also found that MEG measurements from only half of the brain regions sampled could generate results with only a small (2–3%) loss of accuracy, indicating that future MEG helmets might require fewer sensors.
Looking ahead, Bu noted, “This work builds a foundation for future MEG-based brain-computer interface development.”
Source:
University of California – San Diego
Journal reference:
Bu, Y., et al. (2023) Magnetoencephalogram-based brain-computer interface for hand-gesture decoding using deep learning. Cerebral Cortex. doi.org/10.1093/cercor/bhad173.