Be My Eyes Releases AI-Powered Visual Assistant for Blind and Low Vision Users

Image Credit: Be My Eyes

Be My Eyes, the innovative mobile app that connects blind and low vision users with sighted volunteers for visual assistance, has announced the launch of Be My AI, an AI-powered visual recognition tool now available in open beta to hundreds of thousands of iOS app users.

Utilizing state-of-the-art AI image recognition technology from OpenAI's recently announced multimodal GPT-4V, Be My AI provides detailed descriptions of photos taken by blind users to assist with everyday tasks and situations. The rollout began this week and will continue over the next several weeks as the feature is enabled for current Be My Eyes iPhone app users.

Blind users can simply open the app, tap the Be My AI tab, and take a picture to receive a detailed audio description in real time. Users can then ask follow-up questions to get more context. According to Be My Eyes, common uses during testing included reading packaging, instructions, and appliance controls, as well as getting descriptions of surroundings, artwork, and social media images. The feature also assists deaf-blind users by producing written responses in 29 languages, compatible with braille displays.

An example provided by Sean Dilley of the Be My AI assistant describing a photograph taken from his balcony in Washington, DC

Be My AI represents a major step in making visual information more accessible for the blind community. However, Be My Eyes stresses it is not intended to replace critical mobility tools like white canes or guide dogs. The app also still connects users to human volunteers for visual assistance when needed.

Be My AI has been in development for seven months, incorporating feedback from over 19,000 blind beta testers to refine the user experience. While AI accuracy continues to improve, Be My Eyes acknowledges there will still be errors common to early-stage AI, and it encourages users to provide feedback to improve the system.

Feedback from the blind community has been instrumental in shaping this innovation. Sarah, a user, shared her experience: "ChatGPT seemed a distant tool, but with its integration into Be My Eyes, I decided to try. It's been invaluable in deciphering images on social media platforms where descriptions are scarce."

An Android version is also in development, with an open beta release expected in the coming months. With over 6.9 million volunteers, Be My Eyes hopes this latest feature will further its goal of making the world more accessible. While AI comes with limitations, Be My AI offers an exciting new hands-free option for obtaining visual information.

The feature gives blind and low vision users greater independence and convenience in navigating daily life. As Be My Eyes' first major AI integration, it likely signals more innovative accessibility-focused AI applications to come.

Chris McKay is the founder and chief editor of Maginative. His thought leadership in AI literacy and strategic AI adoption has been recognized by top academic institutions, media, and global brands.
