This Is Definitely An Ethical Use Of AI
Apple has rolled out a new feature in iOS 14, called Screen Recognition, which will improve the experience of visually impaired people using iThangs. Apple fed thousands of images of icons, buttons, and apps in use into a machine learning algorithm, training it to identify and label the GUI elements of just about any app. This allows an iThang to label on-screen elements and read out a description or location on demand, enhancing the experience of blind users.
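Apple hasn't published the model behind Screen Recognition, but the general shape of on-device inference is familiar from the public Vision and Core ML frameworks. Here is a minimal sketch of classifying UI elements in a screenshot, where ScreenElementClassifier is a hypothetical Core ML model standing in for Apple's private one:

```swift
import Vision
import CoreML
import UIKit

// Sketch only: ScreenElementClassifier is a hypothetical Core ML model,
// not Apple's actual Screen Recognition implementation.
func labelElements(in screenshot: CGImage) {
    guard
        let coreMLModel = try? ScreenElementClassifier(configuration: MLModelConfiguration()).model,
        let visionModel = try? VNCoreMLModel(for: coreMLModel)
    else { return }

    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        guard let results = request.results as? [VNRecognizedObjectObservation] else { return }
        for observation in results {
            // Each observation carries a bounding box and a best-guess label
            // (e.g. "button" or "slider") that VoiceOver could speak on demand.
            let label = observation.labels.first?.identifier ?? "unknown element"
            print("\(label) at \(observation.boundingBox)")
        }
    }

    let handler = VNImageRequestHandler(cgImage: screenshot, options: [:])
    try? handler.perform([request])
}
```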
Many apps already include accessibility labelling for visually impaired users, and Screen Recognition does not override those labels. Instead it fills in any labels the designer missed and adds missing capabilities such as audible descriptions of pictures.
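For context, the explicit labelling an app can already provide looks something like the snippet below; Screen Recognition only has to guess when properties like these are left out. This is a generic UIKit sketch, not anything specific to the feature:

```swift
import UIKit

// An icon-only button that a designer might forget to label.
let shareButton = UIButton(type: .system)
shareButton.setImage(UIImage(systemName: "square.and.arrow.up"), for: .normal)

// With these set, VoiceOver reads "Share, button"; without them,
// Screen Recognition would try to infer a description from the pixels.
shareButton.isAccessibilityElement = true
shareButton.accessibilityLabel = "Share"
shareButton.accessibilityTraits = .button
```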
This would not have been possible a few years ago, not just because training algorithms weren't yet advanced enough to accurately recognize images, but also because doing it in real time takes a fair amount of processing power. The difficulty of training means that we won't see this on other platforms, including Macs, for a while, but for now it is a great feature for mobile Apple devices. If you know someone who would benefit from Screen Recognition, or are simply curious about its accuracy and overall design, you should check it out.
"interface" - Google News
December 05, 2020 at 01:11AM
https://ift.tt/36IN4be
Screen Recognition On iPhone, Improving The Interface For Blind Users - PC Perspective
"interface" - Google News
https://ift.tt/2z6joXy
https://ift.tt/2KUD1V2
Bagikan Berita Ini
0 Response to "Screen Recognition On iPhone, Improving The Interface For Blind Users - PC Perspective"
Post a Comment