Meta’s long-promised AI-enabled vision features for the company’s Ray-Ban Smart Glasses are finally incoming, allowing the onboard AI to see, hear, and describe its environment through the glasses’ 12 MP camera and microphone. The company announced late Tuesday that these new AI-vision features are now available in beta. There’s a whole lot of utility for a mobile AI that can see and hear, but leave it to Meta’s leadership to show off the new feature in the most awkward and trite way possible.
Meta CEO and frontman Mark Zuckerberg isn’t exactly known for his style. There’s nothing iconic about his choice of attire compared to Steve Jobs’ black turtlenecks or Sam Bankman-Fried’s daily getup of shorts and a sleep shirt (though soon SBF might be rocking the classic orange prison jumpsuit). On Tuesday evening, Zuckerberg decided the best way to show off the Smart Glasses’ upcoming AI capabilities was to take his pair into his cluttered closet. He pulled out a large navy blue polo shirt streaked with throwback rainbow stripes.
“Hey Meta, look and tell me what pants to wear with this shirt,” Zuckerberg asked his glasses, in what may be the most asinine question to ask vision-enabled AI since ChatGPT rolled out a similar feature back in September. Meta’s chatbot responded with a robotic “It seems to be a striped shirt,” then recommended dark-wash jeans or solid-color pants. If I were the one holding that gaudy, vintage polo and wondering what the hell goes with rainbow stripes, I might find the AI’s answer incredibly unhelpful.
Zuckerberg also posted another video to his Instagram showing how the AI could translate a meme from Spanish to English. For the moment, the AI voice still sounds stilted and robotic, but the company has already implemented celebrity-voiced AI personas for users to chat with inside Facebook Messenger and WhatsApp, so more than likely, Meta will eventually try to make the feature sound more human.
Perhaps the most useful addition for the Ray-Ban glasses is that they can now access real-time information through Bing AI. Meta said real-time search will be rolling out in phases to all U.S. users in the near future. Gizmodo has not had the opportunity to test out these new features, though Meta offered a few more likely uses for the vision-enabled AI, such as “writing a caption for a photo taken during a hike” or asking it to describe an object you’re holding. There are clearly better uses for vision-based AI, such as real-world narration for blind or low-vision users, but Meta seems to be keeping it small for this first beta rollout.
The company first talked up this AI integration during its last Meta Connect conference, where it also showed off its Quest 3 VR headset. Until now, the onboard microphone would pick up on simple commands, and you could use it to talk to Meta’s conversational AI chatbot. Now, I’m not known for my drip, not by a long shot, so I can’t comment too hard on Zuck’s style choices, but I have a notion that the AI’s vague style advice won’t be too helpful for me, or for Zuck himself.