The evolution of brain-computer interface devices - Marin Independent Journal

Dr. Sal Iaquinta

The idea of controlling something outside of your body using just your mind sounds like science fiction, but it’s slowly becoming science reality through the brain-computer interface (BCI). This isn’t exactly new. Some of you may remember way back in 2005 hearing about researchers surgically implanting a brain interface in a quadriplegic patient so that his thoughts could move a robotic hand in basic motions.

The discovery of the brain’s electrical signals is about a century old. About 50 years later, researchers demonstrated that a person could move a cursor on a computer screen using just their monitored brainwaves, in other words, their thoughts. It was difficult to take the next logical step, typing via thoughts, because the technology was lacking: there was no sensor that could read the brain’s thoughts of single letters. In the meantime, eye-tracking software advanced to the point that people with limited mobility, such as the late Stephen Hawking, could control devices using just the movement of their eyes. Some users can type 40 to 50 characters a minute by moving their eyes across a computer screen and choosing letters.

A research group at Stanford recently took a new approach to using brainwaves to write. Rather than having the user think of letters or of pointing to options on a screen, the researchers in some ways went back to the 2005 study and had the user think about hand movements, specifically, handwriting. The signals from the implanted BCI were analyzed by a machine-learning program. The user imagined writing each letter, and the computer learned the brainwave patterns associated with writing it. Each letter has its own unique series of hand movements, but our brains do not produce exactly the same measurable signals every time we think about writing a given letter. In some ways it is a resolution problem, like sitting at the eye doctor’s office trying to distinguish the fuzzy letters on the lowest line of the acuity chart. You can often narrow it down to one or two choices, and the computer was able to do the same and get it right the vast majority of the time. Coupled with an autocorrect program, the user in the study was able to type 90 characters a minute with 99% accuracy. As a point of reference, the average person types at about 190 characters per minute, a rate that declines with age.
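For readers who want a feel for what “learning the brainwave patterns for each letter” means, here is a minimal sketch in Python. It is purely illustrative: the “neural feature” vectors are made up, and the off-the-shelf classifier stands in for the Stanford team’s actual decoder and recordings. It only shows the idea of training a model on noisy per-letter signals and checking how often it picks the right letter.

# Toy sketch, not the Stanford pipeline: learn a "signature" for each letter
# from noisy synthetic feature vectors, then measure held-out accuracy.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
letters = list("abcdefghijklmnopqrstuvwxyz")
n_trials, n_features = 40, 64          # trials per letter, features per trial

# Each letter gets its own mean "signature"; each trial is a noisy version of it,
# mimicking the fact that the brain never produces exactly the same signal twice.
signatures = rng.normal(size=(len(letters), n_features))
X = np.vstack([sig + 0.8 * rng.normal(size=(n_trials, n_features))
               for sig in signatures])
y = np.repeat(letters, n_trials)

# Train on most of the trials, then see how often the model names the right letter.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2%}")

In the real study the remaining errors were cleaned up by an autocorrect step, much like a phone keyboard fixing the occasional wrong key press.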

One of the lessons learned from this study was that more complex movements are easier to distinguish from one another. In hindsight, this seems obvious: the researchers are asking a series of brainwaves to “code” for a single output, and the more distinct the movements, the less likely the signals are to be confused, the same way we wouldn’t mistake a “Q” for a “Z.”

External sensors to read brainwaves through the scalp and skull have evolved to make the technology wireless. This makes the technology scalable and new applications are developing now that patients don’t have to have a neurosurgeon surgically open up their head to implant the interface (something most of us, Elon Musk excepted, are not excited about). Musk did make the news for founding Neuralink, a company with the goal of helping paralyzed people control devices via an implanted computer chip, with the ultimate goal of augmenting everyone.

A Spanish startup company called Bitbrain has made a wearable headband device that can “read” drivers’ intentions and process a response faster than people can act. Its site says that its device will activate a car’s brakes 0.4 to 1 second sooner than a person can execute the thought of braking and actually press the pedal. That fraction of a second makes a real difference at 100 kilometers an hour (62 mph): it can shorten the stopping distance by up to 27 meters (88 feet). That difference can be a life saved.
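To put a number on it: 100 kilometers an hour works out to roughly 28 meters per second, so hitting the brakes 0.4 to 1 second sooner means the car travels about 11 to 28 fewer meters before it starts slowing, which lines up with the 27-meter figure.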

Lastly, entertainment often follows the leading edge of technology. NextMind is a company that created a device that allows you to control your computer with your mind. The user wears a headband with a sensor resting against the back of the head, where it picks up brainwaves from the visual cortex. Objects on the computer screen have a faint visual overlay, and when the user focuses on one, the object activates. The interface can also work with VR headsets, and one of the company’s early demos was a car-racing video game in which the car was controlled by the driver’s thoughts.

Brain interface devices will continue to offer severely handicapped people ways to interact with the world, and the technology spillover from that noble goal will benefit all of us. Saving people from dozing off at the wheel will save lives; your boss knowing you feel like dozing off during a meeting probably won’t. Both are coming, though, at a price a lot higher than a penny for your thoughts.

Dr. Salvatore Iaquinta is a head and neck surgeon and the author of “The Year They Tried To Kill Me.” He takes you on the Highway to Health every fourth Monday.
