Meta’s Ray-Ban smart glasses just got smarter

By Abby Montanez 23 December, 2024

The specs are also able to identify songs with Shazam

Meta’s Ray-Ban smart glasses now come with even more AI-powered features thanks to a new software update, The Verge reported. The specs, which already support voice commands and let wearers stream Spotify, will now integrate Shazam and live AI. Plus, users can translate foreign-language conversations in real time.

Meta CEO Mark Zuckerberg introduced the features earlier this year at the company’s Connect 2024 developer conference. Live AI uses the glasses’ built-in camera to offer hands-free suggestions about your surroundings without your having to say “Hey Meta” each time. For example, it can suggest a recipe based on the ingredients in front of you, give gardening advice, or recommend activities in your area. According to the company, the feature runs for approximately 30 minutes at a time before the glasses need a recharge.

There’s also live translation, which lets you translate speech between English and Spanish, French, or Italian. You can choose to hear what the other person is saying through the glasses’ open-ear speakers or view the conversation as a transcript on your phone. “Not only is this great for travelling, it should help break down language barriers and bring people closer together,” Meta wrote on its blog.

Another feature is the ability to identify songs using Shazam instead of relying on the Android and iOS apps. If you want to know what music is playing near you, all you have to do is ask: “Hey Meta, what is this song?” The only caveat of the software upgrade is that two of the features, live AI and translation, are limited to members of Meta’s Early Access Program. In addition, all three are only available to people living in the U.S. and Canada, at least for now.

Meta isn’t the only brand throwing its hat into the smart-glasses ring. Last week, Google and Samsung unveiled Android XR, a new operating system that will power a new generation of devices, including a mixed-reality headset and smart glasses similar to Meta’s Ray-Bans. Wearables running on the Android XR platform will integrate Gemini, Google’s AI-powered assistant, to help perform tasks, and will offer instant camera translation and live view for maps. Apple is reportedly joining the fray, too: the tech giant recently conducted an internal study to workshop ideas for its own smart specs.

This story was first published on Robb Report USA