Meta Unveils Early Access Program for AI Capabilities for Ray-Ban Smart Glasses

The new features use the glasses' built-in camera coupled with Meta's AI assistant to understand what the user is looking at and provide helpful information on demand.

Image Credit: Meta

After months of anticipation, Meta has announced the launch of an early access program that will allow Ray-Ban Meta smart glasses customers to test out upcoming AI-powered features. This marks the first time Meta's splashy multimodal AI capabilities designed specifically for the stylish smart eyewear will be available to try.


First, the company is beginning to roll out the ability for the AI assistant on the glasses to retrieve real-time information powered by Bing. This feature enables users to ask about sports scores, local landmarks, and even stock information. For instance, you could ask, "Hey Meta, who won the Boston Marathon this year in the men's division?" or ask for the nearest pharmacy. This real-time search function is currently being rolled out in phases to U.S. customers.

A new multimodal, AI-powered "Look and ask with Meta AI" feature in beta is particularly intriguing. It allows users to engage with their environment in novel ways. By simply saying "Hey Meta, look and…" or snapping a photo and following up with a question within 15 seconds, users can interact with their surroundings like never before. This could range from translating signs in foreign languages to getting tips on meal planning or gardening.

When you ask a question about your visual surroundings, the glasses capture a photo and send it to Meta’s cloud for AI processing. The response is then delivered audibly through the glasses. Importantly, for now, Meta advises that "All photos processed with AI are stored and used to improve Meta products, and will be used to train Meta’s AI with help from trained reviewers." While that may not provide much consolation for the privacy-conscious, the company says all information is collected, used, and retained in accordance with its Privacy Policy.
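The capture-then-cloud flow described above can be sketched as a simple pipeline: grab a photo, ship it with the spoken question to a remote model, and play back the text answer. The sketch below is entirely hypothetical; none of these function names come from Meta's actual APIs, and the "cloud" call is stubbed locally so the flow runs end to end.

```python
import base64

def capture_photo() -> bytes:
    """Stand-in for the glasses' camera capture."""
    return b"\xff\xd8fake-jpeg-bytes"  # placeholder image data

def query_multimodal_ai(image: bytes, question: str) -> str:
    """Stand-in for the cloud round trip: the device uploads the photo
    plus the spoken question and receives a text answer back."""
    payload = {
        "image_b64": base64.b64encode(image).decode("ascii"),
        "question": question,
    }
    # A real service would be called here; we return a canned answer.
    return f"(demo answer for: {payload['question']})"

def speak(text: str) -> str:
    """Stand-in for audio playback through the open-ear speakers."""
    return f"[audio] {text}"

def look_and_ask(question: str) -> str:
    """The 'Hey Meta, look and...' loop: capture, query, speak."""
    photo = capture_photo()
    answer = query_multimodal_ai(photo, question)
    return speak(answer)

print(look_and_ask("What kind of plant is this?"))
```

The design point this illustrates is that the heavy multimodal inference happens server-side; the glasses only capture, upload, and play audio, which is why a network connection and Meta's cloud processing (with the data-retention caveats above) are required.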

The capabilities showcase the potential for hands-free access to AI in assisting with creative tasks, answering questions, and controlling devices via voice commands. However, Meta cautions that as an initial test, the AI may not always function perfectly. The early access program is intended to gather customer feedback to improve the smart glasses' AI over time before full integration.

Early access is currently only open to Ray-Ban Meta smart glasses owners based in the US.

By expanding the device's AI capabilities even at this early stage, Meta aims to deliver more value from its investment in the intersection of wearable technology and intelligent assistance. The company ultimately hopes to pioneer intuitive AI-powered assistance that feels like a natural extension of the human experience rather than a detached gadget. Still, perfecting such ambitious machine learning applications in the real world will inevitably take extensive testing and user input.
