Meta Ray-Ban smart glasses introduce AI to recognize objects and translate languages

Meta recently announced that its Ray-Ban smart glasses will gain compelling multimodal AI features, giving users a smarter, more interactive experience. The feature uses the glasses' cameras and microphones to let Meta's AI assistant perceive the audio-visual information around the wearer and respond accordingly.

Mark Zuckerberg demonstrated the update in a video on Instagram. He asked the glasses for suggestions on what to wear with a shirt, and the AI assistant responded by describing the shirt and suggesting several pairs of pants that might go with it. He also demonstrated the assistant's ability to translate text and caption images.

Zuckerberg said users can ask Meta's AI assistant questions throughout the day, including questions about what they are looking at or where they are. This shows that the glasses' multimodal AI is not limited to object recognition but also covers language translation, image description, and other applications.

In addition to the above, CTO Andrew Bosworth demonstrated other features in a video, including asking the assistant to caption photos the user has taken, along with common AI functions such as translation and summarization. In an Instagram video showing the new multimodal version, Bosworth wore the glasses while gazing at a piece of wall art depicting the state of California. Interestingly, he also appeared to be holding a smartphone, suggesting the AI may need to be paired with a phone to work.

Bosworth said that beta testing will begin in the United States next week through an early access program, though he did not mention in his post how to join the program.

The current model of the smart glasses already has a built-in AI assistant, but its capabilities are relatively limited: it cannot respond intelligently to video or photos, let alone to a real-time view of what the wearer is seeing (even though the glasses have a built-in camera).

This move marks Meta's commitment to integrating its AI technology into more hardware products to give users a smarter, more personalized experience. For the smart glasses market, it could open up more diverse use cases and bring users a more convenient, intelligent everyday experience.
