Microsoft Expands Collaboration with Be My Eyes to Improve Accessible AI

Microsoft is partnering with Be My Eyes to make its AI more inclusive for the 340 million people worldwide who are blind or have low vision. The collaboration aims to improve scene understanding in AI models, addressing a critical gap in current technology.

Why does this matter? Many AI models today struggle to interpret disability-related objects and scenarios. Data quality is crucial in AI training, yet the voices and experiences of people with disabilities are frequently underrepresented or misrepresented in existing datasets. This so-called "disability data desert" leaves AI systems missing essential context, which limits their utility and reinforces bias.

For instance, Microsoft Research found that objects like braille devices are recognized about 30% less accurately in popular image datasets. This accuracy gap limits AI's usefulness for people with visual impairments and can even perpetuate harmful stereotypes.

Be My Eyes, known for connecting visually impaired users with sighted volunteers through video calls, will provide Microsoft with unique video datasets. These videos capture real-world scenarios faced by blind and low-vision individuals, including challenging lighting conditions and uncommon objects.

Privacy is a top priority in this collaboration. Be My Eyes will remove all personal information from the video metadata before sharing it with Microsoft, and users will be able to opt out of data sharing entirely, giving them transparency and control over their data.

This isn't the first time these companies have worked together. Their partnership dates back to 2017 when Be My Eyes was integrated into Microsoft's Disability Answer Desk. Last year, Microsoft piloted Be My Eyes' AI-powered support feature, and more recently, Be My Eyes launched a Windows app. These projects have provided invaluable learning opportunities and laid the groundwork for today’s expanded collaboration.

Jenny Lay-Flurrie, Chief Accessibility Officer at Microsoft, framed this initiative as part of a broader push towards responsible AI. “We live in a world that isn’t always designed for disabled people, and that’s reflected in the datasets used to train AI systems. Collaborations like ours with Be My Eyes are how we close the data gap and make AI more inclusive. It’s about raising the bar for what technology can and should do for everyone.”

Accessible AI isn’t just about developing tools for a specific user group; it’s about reshaping the way technology interacts with all of us. Remember that disability can be situational, temporary, or permanent. From better scene descriptions to improved voice interaction for people with diverse speech patterns, AI trained on inclusive data can dramatically enhance the lives of individuals across the disability spectrum.

As AI technology, tools, and services become more pervasive and embedded in our lives, it is crucial that accessibility is prioritized from the start, not treated as an afterthought.

Chris McKay is the founder and chief editor of Maginative. His thought leadership in AI literacy and strategic AI adoption has been recognized by top academic institutions, media, and global brands.
