Meta will finally let people try out its flashiest AI features for the Meta Ray-Ban smart glasses, albeit initially through an early access trial. Today, Meta announced that it will begin rolling out its multimodal AI features, which let Meta's AI assistant describe what it can see and hear through the glasses' camera and microphones. Mark Zuckerberg demonstrated the update in an Instagram reel in which he asked the glasses to suggest pants that would match the shirt he was holding.
The assistant responded by describing the shirt and offering a couple of suggestions for pants that might complement it. Zuckerberg also had the glasses' AI assistant translate text and display a couple of image captions. Zuckerberg previewed multimodal AI features like these for the Ray-Ban glasses in a September Decoder interview with The Verge's Alex Heath, where he said people would talk to the Meta AI assistant "throughout the day about different questions they have," suggesting it could answer questions about what users are looking at or where they are.

The AI assistant also accurately described a California-shaped illuminated wall sculpture in a video posted by CTO Andrew Bosworth. He explained some of the other features, which include asking the assistant to help caption photos you've taken or to handle translation and summarization, all fairly common AI features already seen in products from Microsoft and Google. The trial will be limited in the United States to "a small number of people who choose to participate," Bosworth said. Instructions for opting in can be found here.