Meta to add AI to its Smart Glasses in April
Meta, the parent company of Facebook, is set to enhance its Ray-Ban Meta smart glasses, made in collaboration with Ray-Ban, with a significant update slated for April. The glasses, which start at $300, already offer photo and video capture, live streaming, and music playback; the update adds the Meta AI feature, enabling users to interact with the glasses through voice commands.
With the addition of Meta AI, users can seek real-time answers and recommendations tailored to their surroundings. For instance, wearers can ask the glasses to identify objects, translate languages, or offer suggestions based on photos captured with the glasses. Saying "Hey, Meta" starts a conversation, and responses are delivered in a computer-generated voice through the glasses' speakers.
The AI features will initially roll out in the U.S., with voice functionality limited to English, Italian, and French, according to Meta. Acknowledging that the features are new, Meta anticipates occasional inaccuracies and says it will refine the capabilities based on user feedback over time.
The integration of AI capabilities into its smart glasses is part of Meta's effort to assert itself in the competitive AI landscape. The company has invested heavily in Nvidia's AI chips, and its CEO, Mark Zuckerberg, has reportedly sought to recruit talent from Google's DeepMind division to bolster Meta's AI initiatives. Meta has also adapted its hiring practices, offering positions without traditional interviews and revising its salary policies to retain top talent.
Notably, Meta's technology roadmap for the next few years includes the development of an AI model geared toward enhancing video recommendations and users' Feeds, underscoring the company's commitment to leveraging AI across its platforms.