Brain-computer interfaces (BCIs), also known as brain-machine interfaces (BMIs), convert brain activity into outputs that restore lost or impaired functions such as movement or speech. BCIs have been used to help people left severely paralyzed or unable to communicate by stroke, spinal cord injury, amyotrophic lateral sclerosis (ALS, also known as Lou Gehrig’s disease), and other conditions. A new study published recently in Nature, led by researchers from Stanford University, demonstrates a novel paradigm for brain-computer interfaces that uses artificial intelligence (AI) to decode brain activity from attempted handwriting movements into text in real time.
The lead author of the study was Frank Willett, PhD, a research scientist at the Howard Hughes Medical Institute (HHMI) at Stanford University; co-authors include Krishna Shenoy, PhD, an HHMI Investigator and Professor at Stanford University, Leigh R. Hochberg, MD, PhD, Senior Lecturer on Neurology at Harvard Medical School, and Stanford researcher Donald T. Avansino. The National Institutes of Health’s Brain Research Through Advancing Innovative Neurotechnologies® (BRAIN) Initiative, the National Institute of Neurological Disorders and Stroke (NINDS), and the National Institute on Deafness and Other Communication Disorders (NIDCD) provided funding for the study.
Mental-Handwriting-to-Text: A New Paradigm for BCIs
“So far, a major focus of BCI research has been on restoring gross motor skills, such as reaching and grasping or point-and-click typing with a computer cursor,” the researchers wrote. “However, rapid sequences of highly dexterous behaviors, such as handwriting or touch typing, might enable faster rates of communication. Here we developed an intracortical BCI that decodes attempted handwriting movements from neural activity in the motor cortex and translates it to text in real time, using a recurrent neural network decoding approach.”
For this study, a right-handed 65-year-old man was implanted with two 96-microelectrode intracortical arrays to record neural signals. Specifically, Blackrock Microsystems’ NeuroPort™ arrays with 1.5 mm electrodes were placed in the precentral gyrus of the left hemisphere of the study participant’s brain. The participant had sustained a spinal cord injury nine years prior to study enrollment, leaving him with extremely limited voluntary movement of his limbs.
The researchers used software developed with MATLAB and Simulink for recording the neural data and performing real-time decoding. Brain activity data were collected as the participant attempted to handwrite sentences over the course of multiple sessions, and the decoder was trained on these session data.
AI Deep Learning: Recurrent Neural Network (RNN)
The study used a two-layer gated recurrent unit (GRU) recurrent neural network (RNN) to convert the participant’s brain activity into a time series of character probabilities, and used forced-alignment labeling to train the decoder. The RNN was trained to predict the character probabilities with a one-second delay to allow for system processing time.
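As a rough illustration of this decoding approach, the sketch below shows how a two-layer GRU could map binned neural features to per-time-step character probabilities. The hidden size, bin width, and character count are illustrative assumptions rather than the study’s actual configuration, and the sketch is written in PyTorch rather than the MATLAB/Simulink environment the researchers used.

```python
# Illustrative sketch only: a two-layer GRU decoder that maps binned neural
# features to per-time-step character probabilities. Layer sizes, bin width,
# and character count are assumptions, not the authors' configuration.
import torch
import torch.nn as nn

class HandwritingDecoder(nn.Module):
    def __init__(self, n_features=192, n_chars=31, hidden_size=512):
        super().__init__()
        # Two stacked GRU layers process the sequence of neural features.
        self.gru = nn.GRU(n_features, hidden_size, num_layers=2, batch_first=True)
        # A linear readout converts each hidden state to character logits.
        self.readout = nn.Linear(hidden_size, n_chars)

    def forward(self, neural_features):
        # neural_features: (batch, time_bins, n_features)
        hidden_states, _ = self.gru(neural_features)
        logits = self.readout(hidden_states)
        # Softmax yields a probability over characters at each time bin.
        # (In the study, targets were delayed ~1 s to allow processing time.)
        return torch.softmax(logits, dim=-1)

# Example: 10 seconds of activity in 20 ms bins from 192 recording channels.
decoder = HandwritingDecoder()
fake_activity = torch.randn(1, 500, 192)
char_probs = decoder(fake_activity)   # shape: (1, 500, 31)
```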
In artificial intelligence, recurrent neural networks are a class of artificial neural networks often used in natural language processing, speech recognition, and other tasks where context in the input data is needed to produce a result. Artificial neural networks are loosely inspired by the biological brain, with an architecture consisting of layers of interconnected nodes called artificial neurons. RNNs process sequences of inputs using an internal state that acts as memory: relationships among the inputs are “remembered” as the network learns. This makes RNNs useful whenever non-linear temporal or sequential relationships need to be modeled.
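A minimal sketch of that core idea, using arbitrary placeholder weights, shows how the hidden state carries information forward from one time step to the next; the dimensions and weights here are purely illustrative assumptions.

```python
# Minimal sketch of the RNN idea: a hidden state carried across time steps
# acts as memory, so each update depends on the whole input history so far.
# Weights are arbitrary placeholders, not a trained model.
import numpy as np

def rnn_step(x_t, h_prev, W_x, W_h, b):
    # The new hidden state mixes the current input with the previous state.
    return np.tanh(W_x @ x_t + W_h @ h_prev + b)

rng = np.random.default_rng(0)
W_x, W_h, b = rng.normal(size=(4, 3)), rng.normal(size=(4, 4)), np.zeros(4)

h = np.zeros(4)                      # hidden state starts empty
sequence = rng.normal(size=(5, 3))   # five time steps of 3-dimensional inputs
for x_t in sequence:
    h = rnn_step(x_t, h, W_x, W_h, b)  # h now summarizes all inputs seen so far
```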
“With this BCI, our study participant, whose hand was paralyzed from spinal cord injury, achieved typing speeds of 90 characters per minute with 94.1% raw accuracy online, and greater than 99% accuracy offline with a general-purpose autocorrect,” the researchers reported. “To our knowledge, these typing speeds exceed those reported for any other BCI and are comparable to typical smartphone typing speeds of individuals in the age group of our participant (115 characters per minute).”
Copyright © 2021 Cami Rosso All rights reserved.
"interface" - Google News
July 06, 2021 at 08:38PM
https://ift.tt/3xhr81U
AI Brain-Computer Interface Turns Mental Handwriting To Text - Psychology Today
"interface" - Google News
https://ift.tt/2z6joXy
https://ift.tt/2KUD1V2
Bagikan Berita Ini
0 Response to "AI Brain-Computer Interface Turns Mental Handwriting To Text - Psychology Today"
Post a Comment