Meta to add new AI tricks to its Ray-Ban smart glasses

Among other improvements, Meta AI will be able to set a reminder to buy whatever the camera is pointed at.

The Ray-Ban Meta glasses are the first real success among AI-powered wearables. They have the luxury brand’s sleek styling, which means they don’t look as ridiculous as some of the bulkier, heavier attempts at mixed reality headsets. Meta AI, the built-in assistant, can answer questions and even identify what you’re looking at using the onboard cameras. People also love using voice commands to capture photos and videos of whatever’s in front of them without having to pull out their phone.

Meta’s smart glasses will soon gain more AI-powered voice features. Meta CEO Mark Zuckerberg announced the latest software updates for the glasses at the company’s Meta Connect 2024 event yesterday.

“The reality is that most of the time you don’t use smart features, so people want to have something on their face that they’re proud of, that looks good and is designed in a really nice way. We keep updating the software and evolving the ecosystem, and they’re just getting smarter and able to do more,” Zuckerberg said at Connect.

The tech giant also used the event to announce the new Meta Quest 3S, a more affordable version of its mixed reality headset. It also unveiled a number of other AI capabilities across its various platforms, with new features being added to Meta AI and its large language model, Llama.

What Ray-Ban Meta can do for you

As for Ray-Ban, Meta is doing its best not to squander its huge success. The smart glasses received an infusion of AI technology earlier this year, and now Meta is layering on more AI capabilities, though the improvements are fairly incremental. You can already ask Meta AI a question and hear its answers through speakers built into the arms of the frames.

Probably the most impressive new feature is the ability to set reminders. You can look at something while wearing the glasses and say, “Hey, remind me to buy this book next week,” and the glasses will figure out which book it is; a week later, Meta AI will tell you it’s time to buy it.

Meta says that live transcription services are coming to the glasses soon, meaning that people speaking different languages could have their speech transcribed on the fly, or at least in a timely manner. It’s unclear how well this will work, given that the glasses’ previous written translation capabilities have proven to be either a glorious success or a dismal failure.

Zuckerberg also announced a partnership with Be My Eyes, the Danish mobile app that connects blind and low-vision people with volunteers who can watch a live video feed and walk the wearer through what’s in front of them: “I think it’s not only going to be a pretty impressive experience today, but it’s a glimpse of the kinds of things that could be more possible with always-on AI.”

New frame and lens colors are also being added, and customers now have the option of transition lenses that darken or lighten depending on the amount of sunlight.

The company hasn’t said exactly when these additional AI features will arrive on its Ray-Bans, only that they’ll ship sometime this year. With just three months of 2024 left, that means “very soon.”
