Google has introduced some new AI features that are both interesting and useful for people with visual (and other) impairments. One of these features is an “image question and answer” capability in the Lookout app for Android. Basically, this feature lets you “chat” with the app and ask it anything you want to know about a photo you’ve opened.
Lookout is Google’s app designed to help blind and visually impaired people learn about their surroundings. The new feature uses advanced AI developed by DeepMind to provide detailed descriptions of images that don’t have captions or alt text. Google gives the example of a dog photo: the app detects the photo and describes it, and if you have additional questions, you can ask the app, either by typing or by using your voice. Google is currently testing the feature with a small group of blind and visually impaired people, and it plans to make it available to a broader audience soon.
Beyond answering questions about photos, Google has introduced a few other features that make day-to-day life easier for people with visual and other impairments. For instance, Google Maps now shows “Accessible Places” to everyone instead of requiring people to opt in to see wheelchair-accessible places. This is useful not only for wheelchair users, but also for those who push a stroller, lug a suitcase, or try to bring their 25-year-old bicycle inside the building (yes, the last one is me).
The new image description feature reminded me of the eerie app that lets you have an “imaginary friend” built with AI, only this one is not a gimmick. It’s genuinely useful, and I believe people will adopt it just as they have Google Lens and the cool features it offers.