The Meta Ray-Ban smart glasses just got a big AI update, but you should act fast to get access
We're one step closer to true "smart glasses," but this multimodal AI tool is only in U.S. early access for now.
What you need to know
- The Ray-Ban Meta smart glasses can now analyze a photo to tell you what you're looking at.
- The new multimodal AI features also add Bing search results to your queries.
- These features are only available via an early-access beta in the U.S., but will roll out globally next year.
At Meta Connect 2023, CEO Mark Zuckerberg promised that the company's new Ray-Ban Meta smart glasses would be able to analyze your surroundings, similar to Google Lens, starting in 2024. Now it turns out that U.S. smart glasses owners can get a head start if they're quick.
On Instagram, Zuckerberg showed off the Ray-Ban Meta smart glasses' new multimodal AI tech in several use cases. He asked the Meta AI what to wear with a striped shirt, what type of fruit something was, and to translate a Spanish meme into English.
Specifically, you'll say "Hey Meta," wait for the acknowledging ding that confirms your glasses are listening, and then say "Look and tell me [blank]" to get information about whatever you're looking at.
In each case, the smart glasses have to take a photo of the object to analyze it, most likely because the Ray-Bans aren't doing the analysis themselves: the data is sent to the cloud through your smartphone, where some combination of Meta's AI tech and the Bing search engine answers your questions.
In theory, though, you probably won't mind having a photo of whatever it was that confounded you, so this shouldn't be a deal-breaker for Ray-Ban Meta owners.
Meta's blog announcement post explained that the Meta AI and Bing will answer questions about "sports scores or information on local landmarks, restaurants, stocks and more." It also emphasized that "these multimodal AI features may not always get it right," which is why Meta is relying on beta testers to spot bugs.
If you own the Meta Ray-Ban smart glasses and want to test the feature for yourself, you'll need to enable Early Access mode in the Meta View app; Meta provides simple instructions for doing so.
Meta CTO Andrew Bosworth told Instagram users that this Early Access is "limited to a small number of people who opt in," which suggests Meta may cut off access once enough people sign up for the test. The feature fully launches sometime next year, but you should sign up now if you don't want to wait.
We praised the Ray-Ban Meta smart glasses in our review, especially their high-res portrait camera and solid audio. One negative we noted was that the "Hey Meta" assistant could only complete simple commands, struggling with more complicated queries.
That's why this news seems extremely promising: it closes one of the glasses' main weak points and helps them surpass most of the other smart glasses on the market.
Meta's Ray-Bans come in multiple styles and colors, let you take stylish and high-res photos and share them instantly to Instagram, and now have proper AI tech (in beta) to live up to the potential shown off at Meta Connect. They're definitely among the best smart glasses available today.
Michael is Android Central's resident expert on wearables and fitness. Before joining Android Central, he freelanced for years at Techradar, Wareable, Windows Central, and Digital Trends. Channeling his love of running, he established himself as an expert on fitness watches, testing and reviewing models from Garmin, Fitbit, Samsung, Apple, COROS, Polar, Amazfit, Suunto, and more.