
Ray-Ban Meta Glasses: A Step Into the Future, But Which One?

  • Michael Shmilov
  • Apr 13
  • 2 min read

Ray-Ban's Meta AI glasses

I’ve wanted the Ray-Ban Meta glasses for a while now, and during a recent family trip to Paris—somewhere between croissants and waiting in line at Disneyland—I finally picked up a pair. (Disney, by the way, is a great example of a different kind of wide moat business. But that’s another story.)


I was excited. I love playing with new tech, especially when it feels like it could be part of something bigger. And on paper, Meta’s smart glasses look like a bold step toward their vision of leading the next platform shift—toward wearables, ambient computing, and eventually, AR.


But after using them for a bit, something clicked—just not in the way I expected.


Not a New Platform, Just a Better Feeder

I initially thought of these glasses as Meta’s move toward “owning” the next platform. But the more I used them, the more it became clear: these aren’t enabling a new platform. Not yet. They’re designed to feed the existing ones.


They make it easier to take photos and videos without pulling out your phone, and to share that content back to Meta’s apps—Instagram, Facebook, WhatsApp. A frictionless funnel back into the existing ecosystem. Smart? Yes. Revolutionary? Not quite.


What They Are

That said, this is a solid wearable. I actually enjoy using it to listen to podcasts. The speakers are subtle, and you still feel present. I also like hearing my messages without digging into my phone. And the AI assistant? It’s basic, but you can see where it could go—point at something, snap a photo, and get a real-time explanation or translation. That’s promising.


They aren’t AR glasses. There’s no overlay, no spatial interface. But maybe someday, AI’s best output won’t be on a screen—it’ll be in the real world, through tools like this.


A Case of Nostalgia UX

The case deserves a quick mention. It’s beautifully designed—looks like regular Ray-Bans. But pulling the glasses out is oddly awkward. It reminds me of the early iOS days, when Apple’s Notes app looked like a yellow legal pad. We were still copying the real world into digital form, even when it wasn’t the best fit. That’s what’s happening here. Familiarity is nice, but new products deserve new interactions.

Ray-Ban's Meta Case

So What’s the Moat?

Before using them, I thought: “This is Meta widening its moat.” But now? I’m not so sure. It’s not the kind of product people rush to buy, like the early iPhone. Not yet, at least. And it’s not obviously reinforcing Meta’s network effects in the way their platforms or ad engines do.


It’s interesting. It has potential. But I wouldn’t call it moat-widening—more like moat-supporting. A tool that helps keep users inside the ecosystem a bit more, but not something that shifts the game in Meta’s favor overnight. You can read more about Meta’s moat in my EcoMoat project.


Can Meta Deliver?

The big question is whether Meta can evolve this into something more ambitious. As a product, it’s already useful. As a strategy, it’s a slow burn. I’ll keep using it and watching what comes next.


In the meantime, I’ll be trying to get the glasses out of their case a little more gracefully.
