Google wrapped up its I/O 2025 keynote today with an extensive demo of its Android XR glasses, highlighting Gemini-powered translation, navigation, and messaging features. Although the on-stage showcase was cool, we didn’t really see anything new. You could say I was unimpressed. But then I got to use the glasses for five minutes, and I can’t wait to buy a pair.
I’ve worn the Meta Ray-Ban smart glasses for more than a year now, and while I don’t regret my purchase, I primarily use them as sunglasses with built-in Bluetooth headphones while walking my dog. The camera actually takes great photos, and the AI features could be useful for the right person, but there isn’t much else to them.

On paper, Google’s Android XR glasses are very similar. They have a built-in camera and speakers, and they can interact with AI. But the moment I put on the glasses and saw the AR display light up, I was prepared to throw out my Meta glasses.
Looking over the Android XR glasses, I found the controls and buttons nearly identical to those on other smart glasses. There’s a touch-sensitive area on the right arm to activate or pause Gemini, a camera button on the right temple, speakers located near your ears, and a USB-C charging port in the left arm. Google isn’t sharing specs, but the glasses felt as light as, if not lighter than, my Meta Ray-Bans, and I was genuinely surprised by how comfortable they felt when I first put them on.
Google’s choice to use a monocular display was the one thing that threw me for a loop. Instead of projecting the interface on both lenses, there’s a single display in the right lens. This might not bother everyone, but my first instinct was to shift the frames because something felt off and the interface looked out of focus. I wasn’t seeing double, but it took my brain a moment to adjust.
I was first greeted with the time and weather. It was nothing but white text, but I was impressed by how bright and legible everything was. Of course, I was in a dimly lit room, so it probably would have looked different if I had been outside under the sun. The Google employee giving me the demo then had me touch the right arm to launch Gemini.
The rest of my demo was essentially a recreation of what we saw on the I/O stage. Using what’s basically already available through Gemini Live on your smartphone, I looked at several books on a shelf and asked the glasses for information about the author. I then used the shutter button on the frame to take a picture, which was instantly transferred to the employee’s Pixel.
The Googler also launched walking directions, which gave me a moment to try an interface concept new to me. When you’re looking straight ahead, Google Maps shows basic navigation instructions like “Walk north on Main Street.” But if you look down, the display switches over to a map, with an arrow showing your current trajectory. Splitting the two views ensures there’s minimal information floating in front of you as you walk, but if you need more context, it’s a glance away.
It’s this ability to see Gemini’s responses and interact with different applications that makes these smart glasses so impressive. Having a full-color display that’s easy to read also makes Google’s Android XR glasses feel more futuristic than other options from companies like Even Realities that use single-color waveguide technology.
There was no way for me to capture the display during my short demo, so you’ll have to rely on the keynote demo embedded below to get an idea of what I was looking at.
It should be noted that Google’s Android XR glasses currently serve as a tech demo to showcase the in-development operating system, with no word on whether the company actually plans to sell them. They’re still pretty buggy, with Gemini having a hard time distinguishing my instructions from background conversations, but I wasn’t expecting perfection. Also, most of the compute appeared to be happening on the paired Pixel handset, which is good for the glasses’ battery life, but you’ll notice slight delays while information is transferred between the two devices.
Thankfully, even if Google doesn’t launch “Pixel Glasses,” third parties like XREAL are already working on their own spectacles. And, of course, Samsung is working on the co-developed Project Moohan AR headset that’s set to launch later this year. But given that developers still don’t have access to Android XR to begin building apps for the OS, I wouldn’t expect Google’s potential offerings to hit until 2027 at the earliest.