Google demos its smartglasses and makes us hanker for the future


At a recent TED Talk, Google's exciting XR smartglasses were demonstrated to the public for the very first time. While we've seen the smartglasses before, it has always been in highly polished videos showcasing Project Astra, where we never get a true feel for the features and functionality in the real world. All that has now changed, and our first glimpse of the future is very exciting. However, "future" is very much the operative word.

The demonstration of what the smartglasses can do takes up the majority of the 16-minute presentation, which is introduced by Shahram Izadi, Google's vice president of augmented and extended reality. He starts with some background on the project, which is built around Android XR, the operating system Google is developing with Samsung. It brings Google Gemini to XR hardware such as headsets, smartglasses, and "form factors we haven't even dreamed of yet."

A pair of smartglasses is used for the demonstration. The design is bold, in that the frames are polished black and "heavy," much like the Ray-Ban Meta smartglasses. They feature a camera, a speaker, and a microphone so the AI can see and hear what's going on around you, and through a link with your phone you'll be able to make and receive calls. Where they depart from the Ray-Ban Meta is the addition of a tiny color in-lens display.

Headset and glasses

What makes the Android XR smartglasses stand out early in the demo is Gemini's ability to remember what it has "seen." It correctly recalls the title of a book the wearer glanced at, and even notes where a hotel keycard had been left. This short-term memory has a wide range of uses, not just as a memory jogger, but as a way to confirm details and better organize your time.

The AI vision is also used to explain a diagram in a book and to translate text into different languages. It directly translates spoken language in real time, too. The screen comes into action when Gemini is asked to navigate to a local beauty spot, with directions shown on the lens. Gemini reacts quickly to instructions, and everything appears to work seamlessly during the live demonstration.

Following the smartglasses, Android XR is shown working on a full headset. The visual experience recalls that of Apple's Vision Pro, with multiple windows arranged in front of the wearer and pinch gestures used to control what's happening. Here, too, Gemini is the key to using the Android XR headset, with the demonstration showing the AI's ability to describe and explain what's being seen or shown in a highly conversational manner.

When can we buy it?

Izadi closed the presentation saying, “We’re entering an exciting new phase of the computing revolution. Headsets and glasses are just the beginning. All this points to a single vision of the future, a world where helpful AI will converge with lightweight XR. XR devices will become increasingly more wearable, giving us instant access to information. While AI is going to become more contextually aware, more conversational, more personalized, working with us on our terms and in our language. We’re no longer augmenting our reality, but rather augmenting our intelligence.”

It's tantalizing stuff, and for anyone who saw the potential in Google Glass and has already been enjoying the Ray-Ban Meta, the smartglasses in particular certainly appear to be the desirable next step in the evolution of everyday smart eyewear. However, the emphasis should be on the future: while the glasses appeared almost ready for public release, that may not be the case at all, as Google continues the seemingly endless tease of its smart eyewear.

Izadi didn't mention a release date for either XR device during the TED Talk, which isn't a good sign, so when are they likely to be real products we can buy? The smartglasses demonstrated are said to be a further collaboration between Google and Samsung (the headset is also made by Samsung) and are not expected to launch until 2026, according to a report from The Korean Economic Daily, which pushes the possible launch beyond the end of 2025, as previously rumored. While this may seem a long way off, it's actually sooner than the consumer version of Meta's Orion smartglasses, which aren't expected to hit stores until late 2027.

Will it arrive too late? 

Considering the smartglasses shown during the TED Talk seem to bring together aspects of Glass, Ray-Ban Meta, and smartglasses such as those from Halliday, plus the Google Gemini assistant we already use on our phones and computers, the continued lengthy wait is surprising and frustrating.

Worse, the overload of hardware using AI, plus the many Ray-Ban Meta copies and alternatives expected between now and the end of 2026 means Google and Samsung’s effort is at risk of becoming old news, or eventually releasing to an incredibly jaded public. The Android XR headset, known as Project Moohan, is likely to launch in 2025.

Perhaps we’re just being impatient, but when we see a demo featuring a product that looks so final, and tantalizing, it’s hard not to want it in our hands (or on our faces) sooner than some time next year. 

