Microsoft’s new accessibility app is a huge step for AI

Microsoft’s Seeing AI app for iOS might be its most ambitious accessibility effort yet. Building on the smart glasses concept shown at Build 2016, which narrated the surrounding world for low-vision individuals, the app brings that same power to iPhones starting today.

Seeing AI uses the iPhone’s rear-facing camera and artificial intelligence to detect a wide variety of nearby visuals. It paints a picture of what’s happening around the user, giving vision-impaired individuals the kind of rich information that many sighted people take for granted.

The app’s capabilities are impressive. It can recognize everyday objects and products, and can even describe the layout of a scene. If you want to read a sign or text printed on a page, Seeing AI can do that, too. More impressive still, the app can find faces and describe their expressions, approximate age, general emotion and more.
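
Seeing AI’s internals aren’t documented in this piece, but Microsoft exposes similar scene-captioning and face-estimation features through its publicly available Cognitive Services Computer Vision REST API. The sketch below shows roughly how an app could request a caption and face estimates for a single image; the endpoint version, region, parameter names, response fields and subscription key are assumptions drawn from the public API, not Seeing AI’s actual code.

```python
# Hedged sketch: asking the Cognitive Services Computer Vision API to caption
# an image and estimate faces. Endpoint, region and response fields are
# assumptions based on the public v1.0 API, not Seeing AI's implementation.
import requests

SUBSCRIPTION_KEY = "your-cognitive-services-key"  # placeholder
ENDPOINT = "https://westus.api.cognitive.microsoft.com/vision/v1.0/analyze"

def describe_image(image_path):
    headers = {
        "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
        "Content-Type": "application/octet-stream",
    }
    params = {"visualFeatures": "Description,Faces"}
    with open(image_path, "rb") as f:
        response = requests.post(ENDPOINT, headers=headers, params=params, data=f)
    response.raise_for_status()
    result = response.json()

    # 'description' holds ranked scene captions; 'faces' holds per-face
    # age/gender estimates and bounding boxes.
    caption = result["description"]["captions"][0]["text"]
    faces = [(face["age"], face["gender"]) for face in result.get("faces", [])]
    return caption, faces

if __name__ == "__main__":
    caption, faces = describe_image("street_scene.jpg")
    print("Scene:", caption)
    for age, gender in faces:
        print(f"Face detected: approx. {age}-year-old {gender}")
```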

AI see you

For years, smartphone cameras have been able to capture the world around us in photos and video. But only recently has AI-powered software actually been able to tell what the camera is seeing.

At Google I/O 2017, Google unveiled Google Lens, an injection of AI smarts into the modern smartphone camera that can do things like remove unwanted obstructions from photos. Object recognition of this sort is also a pillar of Bixby, the personal assistant that debuted alongside the Samsung Galaxy S8.

Given how quickly smartphone AI has matured, Seeing AI is the next logical step: not just recognizing objects, but interpreting context and providing the right sort of feedback for the scenario. To serve those who can’t see, Microsoft’s app digs into situational and contextual details to give users information that can inform their next move.

The genius of Seeing AI isn’t just in how it can, well, see the world; it’s in how it talks back. While the app is open, it constantly provides audio feedback, whether that’s a voice guiding users to move the camera in a certain direction for a better view, or a series of beeps that ramp up in intensity as Seeing AI gets closer to recognizing an object.
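
To make the beep behavior concrete, here is a minimal, purely illustrative sketch of that kind of feedback loop: as the recognizer’s confidence rises, beeps come faster until recognition succeeds. The get_recognition_confidence() and play_beep() helpers are hypothetical placeholders, not Seeing AI’s actual code.

```python
# Illustrative feedback loop: higher recognition confidence -> faster beeps.
# Both helper callables are hypothetical placeholders.
import time

def feedback_loop(get_recognition_confidence, play_beep, threshold=0.9):
    while True:
        confidence = get_recognition_confidence()  # 0.0 .. 1.0 from the vision model
        if confidence >= threshold:
            play_beep(final=True)   # distinct tone: object recognized
            break
        # Low confidence -> slow beeps; high confidence -> rapid beeps.
        interval = max(0.1, 1.0 - confidence)
        play_beep(final=False)
        time.sleep(interval)
```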

There’s no word yet on whether Seeing AI will arrive on Android, but it seems likely to come sooner rather than later. On Microsoft’s YouTube page for the Seeing AI prototype, the company said it shifted focus from smart glasses to an app to get the technology into as many hands as possible, so fingers are crossed for a short wait on Android.
