Pros
- Double the battery life of previous glasses
- Improved video quality
- No change to size, can swap lens from older model

Cons
- Higher price
- Still no landscape photo or video mode
- AI features are still a mixed bag
Editor’s Note: The previous-generation Ray-Ban Meta glasses were already CNET’s favorite smart glasses by a wide margin. The Ray-Ban Meta Gen 2 smart glasses, while mostly similar in design and AI functions, roughly double the daily battery life. The camera quality for videos is improved, too. For those reasons, this earns CNET’s Editor’s Choice for 2025. I think it’s well worth the extra $80 for the improved features if you’re considering a pair. The original review continues below.
I stared at a flower outside my hotel near Meta’s campus and asked my new Ray-Ban Meta Gen 2 glasses to identify the species. I got multiple answers. Each time I asked if Meta was sure about that flower, its response changed. Eventually, the AI embedded in the glasses admitted that, yes, it was being unreliable. On the plus side, at least I don’t have to worry as much about battery life now.
Smart glasses are better than they’ve ever been, thanks to Meta. They’re not perfect, not by a long shot, but I don’t expect perfection. I just want a pair of smart glasses that’ll last most of the day before needing a recharge. At $379, the second-gen Ray-Ban Meta glasses are my go-to choice, and a clear upgrade over the still-available $299 first-gen model. Double the battery life is more than worth it, and it has a massive impact on how functional these glasses feel.
The Gen 2 Meta Ray-Bans look the same, but have a boosted battery and camera inside.
What’s changed at the higher $379 price is battery life and camera quality, with battery life being the biggest improvement. My 2-year-old Ray-Bans barely lasted a few hours on a charge, but the new models run anywhere from 4 to 12 hours, depending on use.
One day at Meta, the battery lasted from 8 a.m. to nearly 9 p.m. with occasional AI prompts, photos, videos, some music and phone calls. Another day, on a nonstop run to the airport with music and podcasts playing, it lasted from 9 a.m. breakfast to my 1 p.m. flight. Results varied day to day, but I’m no longer in the same battery-life panic with my glasses that I used to be.
Audio on the glasses is controlled, once again, either by voice or via the touchpad on the side.
Audio is great unless you’re in noisy areas
I’m still impressed by the Ray-Bans when it comes to listening to music and making phone calls. The tiny speakers embedded in the frame sound ambient, natural, and surprisingly loud. The built-in array of five microphones — the same as before — is fantastic for phone calls; no one ever realizes I’m speaking from glasses. Voices and podcasts, in particular, come through sharp and clear.
And yet, even with an automatic volume-adjusting mode for noisier environments, there’s only so much open-air speakers and mics can handle. Noise-canceling earbuds easily outperform these glasses in public or on a plane, but there’s serious convenience in not having to fish out earbuds.
The physical controls remain the same: You can use voice or the touchpad on the right arm to play music or podcasts or take calls, but I find I trigger that touchpad too easily sometimes. Still, it feels both magical and strange to wander around with my own personal ambient soundtrack and no visible earbuds, even if my wife and kids can hear the music a bit, too — there’s some audio bleed since the design is open-ear.
Camera: 3K video and stabilization, with slo-mo mode to come
As usual, I’ve been taking a lot of photos on the new Ray-Bans. I often use them as tiny snapshots for my memory. What did that menu say? What did those jars of jam have in them? Where did I park?
Debating Meta AI in Live AI mode as my son holds up a stuffed beige rabbit.
Meta calls that long-term vision “contextual AI,” and right now, it still needs a lot of work.
While these glasses can describe your surroundings or offer supposedly helpful commentary by snapping a photo and analyzing it, the range of responses is unpredictable. Sometimes Meta is accurate; other times it just makes things up. Most days, I find myself having existential arguments with the on-glasses AI voice of Judi Dench (one of several voices you can choose from) about things like the stuffed animals my son is holding up on the sofa. A brief snippet of our chats appears on the right.
Meta’s glasses also have some wonderfully interesting and even helpful assistive elements. They can describe what’s in front of you by snapping a photo. There’s also a Live AI mode that continuously uses the glasses’ video feed, but it drains the battery more quickly.
They can read a page of a book right in front of you or translate text into another supported language — currently French, Italian, German, Spanish and Portuguese. Plus, they can do live translation, much like Apple’s AirPods Pro and Google’s Translate app.
I know people who use the glasses’ AI vision features to help with vision impairment, and Meta also partners with Be My Eyes, a volunteer service that can access your glasses’ camera feed and audio to assist you remotely. There’s also a more detailed AI mode for vision impairment that provides richer descriptions to aid with navigation. But the glasses sometimes fail at their task, overgeneralize or misunderstand — and Meta itself warns about inaccuracies in the fine print.
Later this year, Meta is rolling out a fascinating “conversation focus” feature for the glasses, designed to tune out other voices in a room and zero in on whoever you’re looking at using the beam-forming microphones. For now, though, I still find the glasses mostly unaware of what I’m doing. I can ask for a photo to be snapped and analyzed, or restart Live AI, but that’s about it.
Meta needs more AI hooks to other apps
Another issue is that the glasses don’t work with many other apps. The Meta AI can hook into Apple Music, Amazon Music, Spotify and iHeartRadio to play music, or use Shazam. Phone calls and texts can also be received, you can manage Google Calendar appointments and the glasses can handle video calls and messages with WhatsApp, Facebook, Instagram and Facebook Messenger, but that’s it for now. All the other functions and apps on your phone are inaccessible. I can’t search for a file, send an email or check an iMessage, for example.
I look totally like myself in these glasses. You might not even notice the camera.
The ones to get if you’re interested in smart glasses now
Meta’s glasses, for all their unfinished pieces, are still the best on the market by far. The improved battery life this time around is a big step up, and I’ll definitely be wearing these more often. I’m not the sporty type, but if you are, it’s worth noting that Meta’s Oakley HSTN glasses offer similar battery life to these second-gen Ray-Bans.
I’d get these over the Ray-Ban Displays, which I haven’t even reviewed yet, just because they’re more affordable and simply functional. The Displays have a new interface and emerging tech that could take a year or more to really develop. But the second-gen Ray-Bans are excellent now.
Excellent, but not perfect. Google is coming out with its own AI camera and audio glasses soon, maybe as early as 2026, through partnerships with Warby Parker and other eyewear makers. Google’s glasses should connect to a wider range of Google apps and services, although the details are still unclear. And others are coming into this space, too.
At least these Ray-Bans still don’t cost an arm and a leg, and they’re going to improve over time. Do you want Meta on your face? That’s the other big question, especially when it comes to AI and data privacy and Meta’s own policies on AI and content moderation. You’re in Meta’s world with these Ray-Bans, but it’s not intruding too hard on yours. For now, at least.