Meta rolls out live AI, translations, and Shazam to its smart glasses
Meta has introduced three new features for its Ray-Ban smart glasses: live AI, live translations, and Shazam integration. These updates are designed to make the glasses more useful and interactive, showcasing the role of AI in wearable tech.
Live AI lets users talk naturally to Meta’s AI assistant while it observes their surroundings. For example, at a grocery store, you could ask the assistant to suggest recipes based on the items you’re looking at. This feature, available only to members of Meta’s Early Access Program, can run for about 30 minutes on a full charge.
Live translations offer real-time language conversion between English and Spanish, French, or Italian. You can hear the translations through the glasses or view them as text on your phone. To use this, you must download the required language pairs and specify which languages are being spoken. Like live AI, this feature is limited to early-access members.
Shazam integration is currently available in the US and Canada. Simply ask Meta’s AI to identify a song playing nearby, and it will tell you the name of the track.
To access the above features, your glasses must be updated to version 11 software, and the Meta View app must be on version 196. If you’re interested in trying the Early Access features, you can apply online.
These updates come as tech companies increasingly focus on AI-powered glasses. Meta’s CTO, Andrew Bosworth, recently called 2024 a breakthrough year for AI glasses, which he sees as a device category built entirely around artificial intelligence. This move follows Meta’s earlier teasers at Meta Connect 2024 and reflects the industry’s push to make AI assistants central to smart glasses.
source: TheVerge.com