Brain-Computer Interface Enables Mind Control of Robot Dog - Psychology Today

A peer-reviewed study published in ACS Applied Nano Materials demonstrates a new type of AI-enabled brain-machine interface (BMI) that pairs noninvasive biosensor nanotechnology with augmented reality, enabling humans to control robots by thought with a high degree of accuracy. The University of Technology Sydney (UTS) researchers who authored the study wrote,

Brain-machine interfaces (BMIs) are hands-free and voice-command-free communication systems that allow an individual to operate external devices through brain waves, with vast potential for future robotics, bionic prosthetics, neurogaming, electronics, and autonomous vehicles.

The artificial intelligence (AI) renaissance, driven by the improved pattern-recognition capabilities of deep neural networks, is accelerating advances in brain-machine interfaces, also known as brain-computer interfaces (BCIs). Deep learning helps find the relevant signals in noisy brain-activity data.
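
As a rough illustration of what such a decoder looks like, here is a minimal sketch of a small one-dimensional convolutional network that classifies multichannel EEG epochs. Every shape and hyperparameter below (channel count, epoch length, number of classes) is an assumption for demonstration, not the architecture used in the study.

```python
# Illustrative sketch only: a tiny 1-D convolutional network of the kind
# commonly used to classify EEG epochs. Shapes and hyperparameters are
# assumed for demonstration; this is not the study's actual model.
import torch
import torch.nn as nn

N_CHANNELS = 8    # channels in the EEG montage (assumed)
N_SAMPLES = 256   # samples per epoch (assumed)
N_CLASSES = 4     # e.g., four candidate commands (assumed)

class TinyEEGNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(N_CHANNELS, 16, kernel_size=7, padding=3),  # temporal filters
            nn.BatchNorm1d(16),
            nn.ReLU(),
            nn.AvgPool1d(4),                                      # downsample in time
            nn.Conv1d(16, 32, kernel_size=5, padding=2),
            nn.BatchNorm1d(32),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),                              # collapse the time axis
        )
        self.classifier = nn.Linear(32, N_CLASSES)

    def forward(self, x):  # x: (batch, channels, samples)
        return self.classifier(self.features(x).squeeze(-1))

# Smoke test on random "EEG" noise.
model = TinyEEGNet()
print(model(torch.randn(2, N_CHANNELS, N_SAMPLES)).shape)  # torch.Size([2, 4])
```

In practice such a network is trained on labeled epochs so that it learns which spatial and temporal patterns separate the classes from background noise.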

The neural activity of the human brain is recorded using sensors. Brain-computer interfaces are either invasive devices surgically implanted in the brain or noninvasive wearable devices.

Among noninvasive sensors, there are dry and wet types. Dry sensors enable portable electroencephalography (EEG) beyond the clinical setting. However, making reliable electrical contact through a hairy scalp is difficult, and dry sensors do not perform as well as wet ones.

Wet sensors, by contrast, require a conductive gel electrolyte applied directly to the scalp and hair, which can cause unpleasant side effects such as allergic reactions, skin rash, sullied hair, and increased infection risk. Moreover, the conductive gel eventually dries out, making wet sensors less than ideal for long-term use.

The sensor the researchers used in this study is a dry sensor made of graphene, specifically noninvasive epitaxial graphene (EG) grown on silicon carbide on silicon, which detects EEG signals with high sensitivity.

“The patterned epitaxial graphene sensors show efficient on-skin contact with low impedance and can achieve comparable signal-to-noise ratios against wet sensors,” the scientists wrote.

Epitaxy is the growth of a thin layer on the surface of a crystal such that the layer has the same structure as the underlying crystal. Graphene is a two-dimensional (2D) material consisting of a single atomic layer of graphite, roughly 0.3 nanometers thick, or about one hundred-thousandth the width of a strand of hair. The single layer of carbon atoms is arranged in a honeycomb lattice and has many desirable properties, such as excellent conductivity, strength, flexibility, and transparency.
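
A quick back-of-the-envelope check of that comparison, using an assumed hair diameter of about 50 micrometers:

```python
# Rough sanity check of the thickness comparison; both values are assumed.
graphene_thickness_m = 0.335e-9  # one carbon layer, ~0.3 nm
hair_diameter_m = 50e-6          # a fine human hair, ~50 micrometers

print(hair_diameter_m / graphene_thickness_m)  # ~1.5e5 layers across one hair
```

so on the order of a hundred thousand graphene layers would stack across the width of a single hair.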

“To solve the challenge of detecting the EEG signal from the hairy and highly curved occipital scalp, we have fabricated and demonstrated three-dimensional micropatterned EG sensors in a complete BMI system,” the researchers wrote.

The researchers used a noninvasive dry sensor system that captures the brain activity of the occipital lobe, the smallest of the brain's four lobes, located at the back of the head, connected to the retinas of the eyes, and involved in visual processing and visual memory. The occipital lobe processes visual stimuli from the retinas, associates them with visual memory, and then passes the processed information to other parts of the brain.

“The occipital region, corresponding to the visual cortex of the brain, is key to the implementation of BMIs based on the common steady-state visually evoked potential paradigm,” wrote the scientists.

The study participant was trained on an augmented brain-robot interface (aBRI) platform and brain-computer interface applications via a mobile phone interface. The researchers captured EEG signals recorded by eight-channel hexagonally patterned epitaxial graphene (HPEG) sensors from the occipital region of the participant as they interacted with a robot dog using a Microsoft HoloLens, a head-mounted augmented reality (AR) and mixed reality (MR) headset.

The biosensor detects when the user, wearing the HoloLens, is visually concentrating on a specific square among a group of flickering squares. An AI-enabled decoder converts the signal into commands that operate a robotic dog, a Quadrupedal Unmanned Ground Vehicle (Q-UGV) made by Ghost Robotics. According to the scientists, their brain-machine interface can control and command the movement of the robotic dog with 94 percent accuracy.
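
This is the classic steady-state visually evoked potential scheme: each square flickers at its own frequency, and attending to one imprints that frequency on the occipital EEG. The sketch below shows the core idea with a simple spectral-peak detector; the flicker frequencies, sampling rate, and command names are assumptions for illustration, not details from the study.

```python
# Minimal SSVEP-style decoder sketch: find which assumed flicker frequency
# dominates the EEG spectrum and map it to a hypothetical robot command.
import numpy as np

FS = 256                                         # sampling rate in Hz (assumed)
FLICKER_HZ = [8.0, 10.0, 12.0, 15.0]             # one frequency per square (assumed)
COMMANDS = ["forward", "back", "left", "right"]  # hypothetical command set

def decode_command(eeg, fs=FS):
    """Return the command whose flicker frequency has the most spectral power."""
    power = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    scores = [power[np.argmin(np.abs(freqs - f))] for f in FLICKER_HZ]
    return COMMANDS[int(np.argmax(scores))]

# Simulate two seconds of noisy EEG with a 12 Hz evoked response.
t = np.arange(2 * FS) / FS
eeg = 0.5 * np.sin(2 * np.pi * 12.0 * t) + np.random.randn(t.size)
print(decode_command(eeg))  # almost always "left" (the 12 Hz target)
```

Real systems typically replace the single FFT peak with canonical correlation analysis or a trained deep decoder, which is where the AI component described above comes in.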

Copyright © 2023 Cami Rosso. All rights reserved.
