
Meta’s AI bet with its Ray-Ban smart glasses

The multimodal AI features include translation as well as identification of objects, animals and monuments

Mathures Paul Published 30.03.24, 07:42 AM
File picture of Meta’s Ray-Ban smart glasses

Meta is giving its Ray-Ban smart glasses a shot of AI with an upgrade next month. The multimodal AI features include translation as well as identification of objects, animals and monuments.

For the moment, the company is restricting the AI features to the US, and voice features are only available in English, Italian, and French. The New York Times, which got early access to the new features, reports that wearers start by saying “Hey, Meta” before giving a prompt, then receive a computer-generated voice reply through the eyewear’s speakers.


Meta is being cautious: it is aware that its AI may not correctly identify every object at the moment, but it expects the features to improve with user feedback over time.

Meta is spending billions on Nvidia’s in-demand AI chips, so adding AI to its smart glasses was a logical step. The Ray-Ban Meta glasses, which start at $299, already offer features including taking photos and videos, live streaming, and playing music.

One of the problems with smart glasses is that the wearer appears to be talking to himself in public spaces. Speaking to yourself in the middle of, say, New Market may make you come across as an oddball.
