Meta Ray-Ban Smart Glasses
Pros and cons
Pros
- Approachable form factor
- Immersive, natural-sounding speakers
- Multimodal, AI-powered voice and camera features
- Flexible lens options
Cons
- Meta AI performs inconsistently (for now)
- Camera sensor is not reliable in low-light environments
This review was originally published on October 17, 2023, and was updated on March 29, 2024.
ZDNET’s buying advice
Meta’s $299 smart glasses, made in partnership with Ray-Ban, are exactly that: a premium wearable that can do “smart” things, from capturing photos and videos when voice-prompted, to playing music and podcasts via your phone, to answering your most curious questions with the built-in Meta AI.
For many, including myself, that makes the Meta Ray-Bans a seamless lifestyle fit — one that is less awkward than, say, the company’s other on-your-face gadget, the Quest 3. Instead of layering digital overlays on the real world like the infamously tragic Google Glass, the Meta Ray-Bans focus on the essentials: features that should entice a more mainstream audience, including folks who have already become accustomed to paying $200 or more for prescriptions and shades.
If you’re considering buying a pair, I’d recommend visiting a local Ray-Ban, Best Buy, or optical store and testing the glasses for yourself. They’re available in two styles (Wayfarer and Headliner), three lens types (Clear, Shades, and Transitions), and two sizes (small and large), so there’s a good chance you’ll end up liking them as much as I do.
Of all the smart glasses I’ve tested, these are easily the best for content capturing and audio listening. And with more time, practice, and user insights, the Meta Ray-Bans may also become the best for AI.
Specifications
Design | Wayfarer and Headliner, in Clear, Shades, or Transitions
Camera | 12MP ultra-wide (1080p at 30fps)
Microphone | Five-mic setup with immersive audio recording
Weight | 133 grams with charging case
Storage | 32GB
Battery | Up to 36 hours of use with fully charged case
Connectivity | Wi-Fi 6, Bluetooth 5.2, USB-C
How I tested the Meta Ray-Ban Smart Glasses
Since my initial review of the Meta Ray-Bans, I’ve been wearing a pair with prescription lenses on and off, leveraging the built-in camera, speakers, and voice assistant for work and play. My original glasses were also a pair from Ray-Ban that looked similar to Meta’s, minus all the tech smarts. That made my transition to the Meta alternative fairly easy. Folks who normally wear sunglasses should expect to feel the same way.
I’ve worn the Meta Ray-Bans to both CES and MWC (read: I’ve gone through the awkwardness that is walking past TSA security with “spy cameras” on) and have been using the hands-free, video-recording functionality of the glasses to capture product videos, relive experiences, and more. I even used them to document my experience demoing and picking up the Apple Vision Pro on launch day.
Lastly, I’m enrolled in Meta’s Early Access Program, which has allowed me to preview and test features still in development, such as multimodal AI for translating and distinguishing animal species and monuments.
What are the Meta Ray-Ban Smart Glasses’ best features?
A form factor that won’t scare people away: Even with the cameras, speakers, and various modules tucked beneath the frame — all of which are made visible if you sport the transparent finish that Meta offers — the smart glasses are surprisingly lightweight, don’t induce as much fatigue as other high-tech wearables, and, dare I say, feel normal to wear.
Also: Meta’s Quest 3 will also get ‘lying down mode’ – here’s why it’s taking longer
As I mentioned in my buying advice, sizing means everything for these glasses, so I’d sample both the small and large frames before putting any money down. I’ll note that both options come with the same charging case and estimated battery life, so you won’t compromise the performance or portability by opting for one over the other.
The easiest “action camera” you’ll ever use: Other key improvements with the Meta Ray-Bans include a 12-megapixel ultra-wide camera that’s capable of capturing sharper photos at 3024 x 4032 pixels and 1080p videos at 1440 x 1920 resolution. The output won’t put your GoPro or other action camera in retirement — the dynamic range falls short for me — but the ease of recording relatively smooth video without having to hold or mount anything can spoil anyone.
When capturing any content with the glasses, I’d advise tilting your head down a little, since the actual camera sensor sits higher than your eyes. When I record from my height (6 feet), subjects often appear off-center or end up in the bottom half of the frame, so it’s something to be mindful of.
Both photo and video formats are scaled for portrait capturing, as the ideal use case for the glasses is vertical content sharing on Meta’s social platforms, such as Facebook and Instagram. That content-sharing capability includes live streaming, which you can now start up with a few taps on the wearable.
Also: I streamed with Logitech’s Mevo Core camera and it nearly beat out my $3,600 Canon
When you’re live streaming on your Meta platform of choice, a camera button magically appears that lets you switch from your phone’s camera to your smart glasses. While the capability is more geared towards influencers and content creators, I’ve found the general focus on vertical video beneficial when I’m recording hands-on product demos and other short-form content for ZDNET’s social pages.
Audio is the secret weapon: To assist with the focus on video, Meta equipped the glasses with five microphones, one of which is cleverly tucked into the nose bridge for optimal voice recording. Based on audio samples, including several indoor and outdoor conversations with ZDNET’s Jason Hiner, who’s also been testing a pair, it’s safe to say that while the Meta Ray-Bans can replace your smartphone’s microphone for calls, ambient noise and wind will noticeably reduce the clarity.
The other mics on the glasses are scattered across the front and sides and can now be used to record 360 audio. It’s almost as if Meta is gearing our senses for a future where spatial videos are more prevalent, which they likely will be.
Also: How to capture spatial video with the iPhone 15 Pro (there’s a trick)
For playback, these glasses sound better than their predecessor, the Ray-Ban Stories, which I tested two years ago, leaning more on higher frequencies to produce clear voices and instruments versus bass and lower pitches. That makes the successor better suited for podcasts and pop music than for electronic and hip-hop.
Meta tells me that one key area of improvement is a new directional output that greatly reduces sound leakage. Still, I felt guilty listening at higher volumes in public; the grim stares have made it clear that the sound isolation isn’t perfect.
Privacy features: It’s worth pointing out that for the sake of people’s privacy, Meta made the blinking animation of the LED indicator a lot more noticeable when the glasses are recording. The glasses also won’t record at all if they detect anything covering the LED indicator, which everyone can appreciate.
While a Meta account is required to use the glasses, it’s optional for you to share your usage data and information with the company.
A superior charging case: I’d like to give a nod to the smart glasses’ new charging/carrying case. It still serves as a wireless charger when the glasses are slotted in, but it’s also significantly slimmer than the previous version, with a leather booklet style instead of the hard-cushioned, snap-on capsule. Meta says it’s 32% lighter than the last model, which I believe, and gives the glasses eight full charges or 32 hours of additional battery life.
What I’d like to see in the next model
Meta AI, but better: Meta AI, the glasses’ built-in chatbot, is what truly makes these smart wearables “smart.” While the Llama 2-powered assistant nails the essentials like weather forecasting, playing a specific song or episode of a podcast, and capturing photos and videos when asked, there’s room for improvement.
Also: After Quest 3 success, Meta’s first true AR glasses to be revealed this year
Conversing with Meta AI is much like the early days of ChatGPT. You’re encouraged to ask more descriptive questions, hold longer conversations, and even follow up at times, but I’m not sure the AI is capable of going into more detail yet. For example, when I asked for a dinner recipe, the “elaborated” answer the chatbot provided was no longer than the “brief” one; the only difference was that it added my preference for seasoning.
There’s also the issue of response playback — specifically, how the AI will read a long answer without pausing, leaving you in the dust as you try to scurry through steps one and two of, say, a recipe. If there were an ability to pause responses, whether through the gesture pad or by telling Meta AI to “wait,” the experience would flow much more naturally. (For something more ambitious, I’d love the glasses to be able to livestream my actions as the AI gives me prompts, harmoniously moving to the next step as I progress.)
More flexible video recording: From extending the recording limit past 60 seconds to editing software in the Meta View app that allows you to straighten the field of view, there are plenty of ways to improve the Meta Ray-Bans without changing the hardware. While the existing recording limit may be in place to prevent overheating and preserve battery life, I’ve often found myself repeatedly clicking the record button in hopes of capturing every part of an experience. It’s almost as if there’s a feeling of FOMO.
Final thought
Meta recently announced that its multimodal AI features will soon be released publicly to the Meta Ray-Bans, and I’m cautiously optimistic. The text translation feature should come in handy for travelers — so long as you’re working with English, Spanish, Italian, French, or German — and the object, animal, and monument identification capability will be just as useful for the curious-minded.
From what I’ve seen in beta, such AI features are mostly reliable, with the glasses being able to name the Verrazano Bridge that I was about to drive across this weekend and successfully identifying the “domesticated orange tabby cat” who meanders across my backyard every morning, so it’s a promising start.