Apple's Vision Pro Might Be The World's First Brain Computer Interface (BCI) - Analytics India Magazine

Apple took the world by storm when it announced the Vision Pro, its first ever ‘spatial computer’. While most marvelled at the device’s technical specifications and sleek engineering, one key aspect went largely unnoticed: its control interface.

During the product demo, users were shown controlling the device with just their hands, thanks to the depth sensors mounted on its underside. However, Apple then revealed that the primary way of interfacing with the device would be through the eyes.

The camera array mounted around the inside of the headset can accurately gauge what input the user wants by tracking the movement of their eyes. To get the system to work consistently and accurately, Apple employed some machine learning magic. These algorithms might lay the foundation for future brain-computer interfaces (BCIs), making the Apple Vision Pro a proto-BCI.
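In essence, such a gaze-driven interface maps an estimated gaze point onto on-screen elements and uses a hand gesture to confirm the selection. The sketch below is a deliberately minimal illustration of that idea; the class names, coordinates, and pinch flag are assumptions invented for this example and do not come from Apple’s actual APIs.

```python
# A minimal, hypothetical sketch of gaze-driven selection: map an estimated
# gaze point onto UI elements, then confirm with a hand pinch. All names and
# numbers here are illustrative assumptions, not Apple's actual APIs.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class UIElement:
    name: str
    x: float
    y: float
    w: float
    h: float

    def contains(self, gx: float, gy: float) -> bool:
        # True if the gaze point falls inside this element's bounds
        return self.x <= gx <= self.x + self.w and self.y <= gy <= self.y + self.h

def gaze_target(elements: List[UIElement], gx: float, gy: float) -> Optional[UIElement]:
    """Return the element under the estimated gaze point, if any."""
    for element in elements:
        if element.contains(gx, gy):
            return element
    return None

ui = [UIElement("Photos", 100, 80, 120, 60), UIElement("Safari", 260, 80, 120, 60)]
target = gaze_target(ui, gx=310.0, gy=105.0)  # gaze estimate from the eye cameras
pinch_detected = True                          # from the downward-facing hand sensors
if target and pinch_detected:
    print(f"Open {target.name}")  # gaze chooses the target, the pinch confirms it
```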

5000 patents for a reason

Brain-computer interfaces have long been a sci-fi pipe dream, the best-known example being Elon Musk’s Neuralink, which aims to surgically implant a chip into the human brain so that thoughts can be converted into software commands. However, it now seems that Apple has found a way to non-invasively detect human thought, by combining neurotechnological sensing with machine learning algorithms.

Delving into the patents Apple has filed for the Vision Pro headset, we find one called ‘eye-gaze based biofeedback’. This patent, filed in Europe, describes determining a user’s attentive state while they view a certain type of content. Apple states that this can be used to track user biofeedback in XR experiences, which in turn can make those experiences richer.

An example provided in the patent shows that it is possible to predict the interaction a user intends based on the dilation of their pupils. The colour of a UI element can also be changed depending on which colour triggers a stronger pupillary response, giving the system a higher degree of success. Similarly, patents have been filed for a system that determines the user’s state by sensing the effect of luminance changes on their pupils. Through this, the system can tell when the user is inattentive and, if so, increase the luminance of a particular UI element or part of the content to retain their attention.
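As a rough illustration of the pupil-dilation idea, intent could be flagged whenever the pupil dilates noticeably above its recent baseline. The following sketch assumes per-frame pupil-diameter measurements; the window size and threshold are invented for this example and are not values from the patent.

```python
# A hypothetical sketch of intent detection from pupil dilation: flag likely
# intent when the pupil dilates noticeably above its rolling baseline.
# The window size and threshold are illustrative assumptions only.
from collections import deque

class PupilIntentDetector:
    def __init__(self, window: int = 120, threshold: float = 0.15):
        self.samples = deque(maxlen=window)  # recent pupil diameters (mm)
        self.threshold = threshold           # relative dilation treated as intent

    def update(self, diameter_mm: float) -> bool:
        """Feed one per-frame pupil measurement; True if intent is inferred."""
        dilated = False
        if len(self.samples) >= 30:  # wait until a baseline is established
            baseline = sum(self.samples) / len(self.samples)
            dilated = (diameter_mm - baseline) / baseline > self.threshold
        self.samples.append(diameter_mm)
        return dilated

detector = PupilIntentDetector()
# A steady ~3.1 mm pupil, then a sudden dilation to 3.8 mm (~23% above baseline)
for diameter in [3.1] * 60 + [3.8]:
    if detector.update(diameter):
        print("Dilation spike: treat the current gaze target as selected.")
```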

Pupils are just one of the signals Apple’s researchers found for determining the user’s mental state. A patent was also filed for ‘sound-based attentive state assessment’, which gauges the user’s response to a sound to determine their mental state. The system can also draw on other measurements, such as heart rate, muscle activity, blood pressure, and electrical brain activity, for additional information.

From the piecemeal picture painted by the patents, we can make out the shape of the Vision Pro’s UI. These biofeedback assessment systems could work in tandem, each passing information about the user’s mental state back to the headset. This would allow the computer to form a more comprehensive picture of what the user is ‘thinking’ or ‘feeling’, and to adjust the content accordingly.
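One simple way such signals could be combined is a weighted score over normalised channels, with the content adjusted when the score drops. The sketch below is purely illustrative: the channel names, weights, and 0.5 cutoff are assumptions for this example, not details from the patents.

```python
# A minimal sketch of fusing several biofeedback channels into a single
# attentiveness score. Channel names, weights, and the 0.5 cutoff are
# illustrative assumptions, not values from Apple's patents.
def attentiveness_score(signals: dict, weights: dict = None) -> float:
    """Combine normalised biosignals (each in [0, 1]) into a weighted score."""
    weights = weights or {
        "pupil_response": 0.4,   # pupillary reaction to luminance probes
        "gaze_stability": 0.3,   # how steadily the gaze holds the content
        "heart_rate": 0.2,       # closeness to the user's resting heart rate
        "sound_response": 0.1,   # reaction to brief audio probes
    }
    total = sum(weights.values())
    return sum(weights[k] * signals.get(k, 0.0) for k in weights) / total

score = attentiveness_score({
    "pupil_response": 0.2,
    "gaze_stability": 0.4,
    "heart_rate": 0.7,
    "sound_response": 0.3,
})
if score < 0.5:
    # The patents describe raising luminance of a UI element to win attention back
    print(f"Attentiveness {score:.2f}: brighten the key UI element.")
else:
    print(f"Attentiveness {score:.2f}: no adjustment needed.")
```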

The lead inventor on these patents is Sterling Crispin, a software engineer working on neurotechnology for the Apple Vision Pro. In a tweet, he delved deeper into this interface:

“Other tricks to infer cognitive state involved quickly flashing visuals or sounds to a user in ways they may not perceive, and then measuring their reaction to it…It’s a crude brain computer interface via the eyes, but very cool.”

The way forward for BCI?

As Crispin stated, this is a preferable way of interfacing with computers compared to undergoing invasive brain surgery. Indeed, the problems Musk has been facing with Neuralink, such as the large number of animal deaths and the difficulty of getting to human trials, show just how much work still needs to be done for a true implanted BCI.

The Vision Pro’s solution is not only distinctly Apple, but also a forerunner of the way we will interact with computers in the future. AI and ML are the biggest contributing factors to bridging the gap between organic brains and inorganic processors.

Take voice input, for example. What was once a hard problem requiring powerful computing resources is now possible even on edge devices like mobile phones, thanks to on-device processing and lightweight speech-to-text algorithms, which continue to get better with time.

By bringing together machine learning algorithms with highly advanced sensors, the Vision Pro is able to read the mental state of its users, arguably qualifying it as a brain-computer interface. Just as speech-to-text algorithms revolutionised voice interfaces, Crispin and Apple’s research holds the potential to revolutionise visual interfaces. What’s more, this technology might one day become the go-to way to interact with computers, leaving keyboards and mice in the dust.
