XR Is Going Mainstream. Does That Mean We’ll All Be Glasses Wearers Now?



I’m standing in the lobby of a hotel in Hawaii, gazing into the glaring sun through the lens of Snap AR Spectacles and wondering if this is my future. 

The glasses are an updated version of the ones I tried out last year at the Snapdragon Summit in Hawaii. Rather than playing with Moo Deng, a fun little novelty, I'm using them for things I do every day on my phone: browsing the internet and scrolling through social videos.

Right here is evidence of Snap bringing productivity and genuinely useful features to its glasses, still a clunky developer version for now, before it eventually turns Spectacles into a bona fide consumer device. Like my colleague Scott Stein, who tried the Spectacles out several weeks ago, I'm most impressed with the AI-powered live translation feature. It lets me see my conversation partner's words translated into French subtitles in real time just below her face, making it easy for us to converse naturally without breaking eye contact.


To me, the progression is clear: The Spectacles seem to be growing up, taking themselves more seriously and finding their true purpose. This is part of a wider trend across XR (extended reality) devices, which feel as though they're on the cusp of a major mainstream breakthrough.

Trace the idea of XR back to its earliest days, and you will find clunky virtual reality headsets that were groundbreaking in their time and showed us a vision of wearable screens that ultimately didn’t lead to mainstream adoption. Even Apple’s much-hyped mixed-reality headset, the Vision Pro, has struggled to establish broad appeal beyond the pros and rich bros. 

But many people in tech think we’re about to hit a watershed moment for XR.

“The time [for XR] is now,” said Rick Osterloh, SVP of devices and services at Google, speaking at the Snapdragon Summit. “The technology’s ready and a bunch of products are going to really change the user experience.”

Google has been working on XR products for a long time, said Osterloh, but the combination of underlying silicon, such as Qualcomm’s chips, and AI breakthroughs means the tech “is now ready to be able to create a new, brand new computing experience that’s really powerful.”

The concept of a breakthrough moment for XR doesn't appear to be just wishful thinking, either. Sales volumes of Meta's Ray-Ban glasses, also powered by Qualcomm, have increased more than 12x from the end of last year to now, Alex Katouzian, Qualcomm's group general manager for mobile, compute and XR, told me.

“It’s like massive amounts… the traction on it is really good,” he said. “And then the China customers are coming out with glasses, after glasses, after glasses. Xiaomi is doing a really good job.”

AI supercharging XR

Snap's Spectacles are designed entirely to overlay 3D experiences onto your physical space. The lenses are transparent, but the effect feels more like VR-based mixed reality. (Photo: Scott Stein/CNET)

Beyond Snapdragon chips, there's one technology that seems to be igniting the XR product category. "AI is breathing life into it," said Katouzian.

As I discovered in my Snap Spectacles demo, AI can elevate an XR experience to make it feel truly immersive and seamless. The combination of sensors that could pick up my conversation partner's speech and track where she was standing, along with the AI that could translate her words, made me understand, perhaps for the first time in all my years of demoing this technology, why I might choose to wear smart glasses even though I'm not a glasses wearer.
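To make that flow concrete, here's a minimal, purely illustrative sketch of how a live-translation pipeline like the one in the demo could be wired together: hear the speaker, locate them, then translate and anchor the subtitle below their face. This is not Snap's actual Spectacles software; every name and step below is a stub I've invented for the sake of the example.

```python
# Hypothetical live-translation pipeline for AR glasses.
# All functions are illustrative stubs, not any real device's API.

from dataclasses import dataclass


@dataclass
class Subtitle:
    text: str          # translated text to render
    anchor_x: float    # where to draw it, relative to the speaker's face
    anchor_y: float


def capture_speech() -> str:
    """Stub for the microphone and speech-to-text step."""
    return "Hello, how are you?"


def locate_speaker_face() -> tuple[float, float]:
    """Stub for the camera step that finds where the speaker is standing."""
    return (0.5, 0.65)  # normalized screen coordinates


def translate(text: str, target_lang: str = "fr") -> str:
    """Stub for the AI translation model (target_lang is unused in this stub)."""
    canned = {"Hello, how are you?": "Bonjour, comment allez-vous ?"}
    return canned.get(text, text)


def build_subtitle() -> Subtitle:
    heard = capture_speech()                 # 1. Hear what the glasses hear.
    face_x, face_y = locate_speaker_face()   # 2. See where the speaker is.
    # 3. Translate and anchor the subtitle just below the speaker's face.
    return Subtitle(text=translate(heard), anchor_x=face_x, anchor_y=face_y + 0.1)


if __name__ == "__main__":
    sub = build_subtitle()
    print(f"Render '{sub.text}' at ({sub.anchor_x:.2f}, {sub.anchor_y:.2f})")
```

The real system presumably handles far more (latency, continuous tracking, rendering in the display), but that basic hear, locate, translate-and-anchor shape is what made the demo feel seamless.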

Live translation has almost become a litmus test for consumer AI applications over the past year, most recently earlier this month, when Apple launched the AirPods Pro 3. It tackles an obvious communication challenge and is practical enough that people can easily take advantage of it, said Dino Bekis, Qualcomm's VP of wearables, in an interview.

XR and glasses, in particular, feel like a natural lens through which people can interact with AI, said Bekis. “It’s the same way we interface with the world,” he said. “It sees what you see. It can hear what you hear.”

For Bekis, XR’s breakthrough moment is due to a combination of factors — the quality of AI agent capability, connectivity and the ability to make very small power-sensitive devices.

“We’re just now getting to a point where embedded displays and all these things are starting to happen in a way that actually can translate into real, meaningful, personal devices,” he said. “It’s the beginning.”

Snap's video shows a gallery of videos overlaid onto a room with Snap OS 2.0, though on actual Spectacles the field of view is a narrow fraction of this. (Image: Snap)

But what if you're like me: hesitant about wearing glasses for comfort reasons? Bekis told me that we're actually similar in this respect. It might feel unnatural for some people, he acknowledged, in which case they might skip the glasses and choose other wearables instead, ones that can still provide the crucial sensor data that contributes to an immersive AI experience.

People should choose the form factors that feel natural to them, he added. From there, it’s the job of the tech companies to make everything work together, regardless of which choices people make.

“It’s not just really about the glasses as much as [it’s] also about this collection of devices that you’re carrying around on your person on a regular basis — the ability for these different devices to interact, share some of this sensory information and then be able to then pull that together in an interesting way for you to digest,” he said.

The jury’s still out for me on whether I’d be willing to embrace XR by adopting glasses as so many people around the world seem to be doing. But in the meantime, I like the idea that XR could be something I dabble in for specific experiences, while I let my watch, my earbuds, my phone and whatever other wearable devices might emerge in the near future do the heavy lifting.




