Smart glasses are going to be popular, just not yet




If you pay even a cursory amount of attention to consumer technology, I think you can see that, good or bad, artificial intelligence has helped reignite interest in gadgets. The possibility of interfaces you interact with using natural language has made many people reconsider the smartphones, tablets, and laptops we use every day and revisit ideas previously thought impossible.

Google Glass was an overly ambitious attempt to create the first accessible augmented reality glasses and, aside from enterprise use-cases, it's been considered a total failure. The glasses looked weird, and didn't do many of the things Google imagined they would. Brilliant Labs, the creator of the Frame AI glasses, has created an early proof of concept for what a Google Glass-style device could look like in 2024, and it's made me even more confident these kinds of smart glasses could become commonplace in the next five years.


When I spoke to Bobak Tavangar, Brilliant Labs' CEO and co-founder, he emphasized that Frame was supposed to be a more presentable version of the company's early experiments, a sort of minimum viable product for the AR and AI experiences you could enable today. Frame is also a proudly open-source device that's perfect for developers to tinker with. Having used the Frame, I can say it's not a particularly consumer-friendly product, but it does suggest that smart glasses people will want to use could be here sooner rather than later, and they don't have to look all that different from the glasses and sunglasses we use every day.

The clear Brilliant Labs Frame AI glasses on a white background.

Brilliant Labs Frame

The Brilliant Labs Frame are smart glasses that cover all of your AR and AI basics.

Pros

  • Look like normal prescription glasses (for the most part)
  • AI is easy to set up and tweak
  • Computer vision at eye-level just makes sense
Cons

  • Lacks polish on the fit and finish
  • AI might not be useful to everyone


Price, availability, and specs

The Brilliant Labs Frame glasses cost $349 and can technically be fitted with prescription lenses if you're willing to pay more. The glasses are available to order directly from Brilliant Labs' website, and currently come in three colors: black, grayish-blue, and clear. Unfortunately, they currently take over a month to ship.


In terms of price, the Frame is technically more expensive than the Ray-Ban Meta smart glasses, which start at $329, though outside of interactions with AI and an onboard camera, the two offer pretty different experiences. Meta's glasses have no display and are noticeably thicker than the Frame. Brilliant Labs' glasses, on the other hand, have a tiny MicroOLED display that gets reflected through a geometric prism into your eye.

As the company has shared with Pocket-lint, opting for a prism instead of the more expensive holographic waveguide lenses that many companies assume will be in future AR glasses makes the whole package much more affordable. The smart glasses Brilliant Labs made feel like what’s possible today, but even a compromised device isn’t without its charms.


Build and design

The form factor is lightweight and instantly recognizable

The Frame glasses mounted on a wooden nose.

A huge thing working in the Frame's favor is that Brilliant Labs has designed what in my book is a very cool-looking pair of glasses. The clear version I have doesn't hide the fact it has a built-in camera and other electronics, but I don't think you'd necessarily clock that the other colors are smart glasses until you got up close.



The giveaway is that prism on the right lens, which undeniably stands out whether you're wearing the glasses or not. No, this look still isn't for everyone — the first comment my partner made about the Frame was that they looked like the kind of prescription glasses toddlers use — but I think Brilliant Labs has made the right trade-offs to put out its smart glasses at a relatively approachable price.

It helps that they're very light. The Frame AI glasses weigh less than 1.4 oz (40g), with most of the weight centered on the bridge and the two battery cells at the end of each temple (the "arms" of the glasses). That makes the Frame easier to put on and take off, and easier to forget about while you're wearing it.


Those positives aside, I wouldn’t necessarily recommend most people spend the money to actually get them. That’s mainly because, despite all the things I like about Brilliant Labs’ approach, the Frame hardware is pretty rough around the edges.

There are little details, like how loose the hinges feel, how easily the temples unfold, and how sensitive the tap-detecting accelerometer is. More than once, I was just adjusting the glasses and the microphone started listening. These are the kinds of things a larger company would probably tighten up (literally and figuratively) before shipping them out to backers.

The Frame AI glasses are comfortable, and as functional as promised. I’m just not sure that they’re what everyone imagines when they think of augmented reality or smart glasses.


Display

Did Brilliant Labs make the right compromise?

A tiny projection in the prism of the Brilliant Labs Frame.

The MicroOLED and prism combination that produces the Frame's visuals isn't discreet. Look closely and anyone on the outside can see that you're viewing something. Even when the Frame is asleep, the prism creates strange reflections and a subtle magnification that you might have to get used to. But when the OLED is on and you can see the Frame's simple interface, text and images are monochromatic but very clear, with only the slightest ghosting, depending on whether the interchangeable nose bridge is holding the glasses at the right height.


Now, you don't get the immersive spatial images you'll find on a device like Apple's Vision Pro. The display on the Frame floats in front of your view. There are no depth sensors to help place imagery or any method of simulating different distances other than the fixed one the Frame's prism offers. Of course, unlike the Vision Pro, the Frame doesn't cover your face and obscure your eyes. When the display is off, they are mostly just normal glasses. That's less futuristic, but probably a lot more practical.


Battery

Not quite all-day

The Frame glasses charging on an orange Mister Power charger.


Testing the battery life on the Frame is tricky because it seems to vary depending on what kinds of AI tasks you ask the glasses to complete. Just sitting in standby mode, not in use (and eventually disconnecting from my phone), the Frame lasted over a day, close to 30 hours. But with regular use, especially requests that used the Frame’s camera, that battery life came out to about 6 hours in my experience. I imagine if you really pushed the Frame’s camera, that could be shortened even further, but the default interaction with generative AI is text-based so far, which seemed to tax the Frame the least.

The decision to only include a 210mAh battery probably had a big impact on this performance, and it’s another one of Brilliant Labs’ compromises. You could definitely put a larger battery in the Frame, but it would make the slim glasses both heavier and thicker. To actually charge the glasses, Brilliant Labs provides an orange charger it calls “Mister Power” that clips onto the bridge of the glasses, giving them what looks like a little nose. The charger itself has a small 140mAh battery, which helped me top up the Frame on the go too.


It’s a cute idea, but I still found myself wishing there was a way to charge the glasses without having to remove the nose bridge. I frequently felt worried I was going to lose a nose bridge or the charger itself when I wasn’t using it. These are the kinds of details you want to address if normal people are going to buy your product, but can leave unfinished if developers who are willing to work around rough edges are your main audience.

Software

AI on your face just makes sense

Using the camera on the Brilliant Labs Frame to ask questions about a book.

Since the announcement of Humane's AI Pin, there have been two ongoing questions plaguing the current wave of AI gadgets:

  1. Does AI need to live anywhere other than the devices you currently own, like a smartphone, smartwatch, or laptop?
  2. Where should these AI devices be accessible?


Humane decided that AI interactions should live in a standalone device that’s pinned to your clothes. The cultural success of the Ray-Ban Meta smart glasses suggests there’s a lot of interest in an AI gadget that lives on your face and sees the world as you do, eyes-first. Having used the Frame for a little over a week, I think that’s a winning strategy.

A head-mounted camera paired with a multimodal AI model feels much more natural to use. Rather than showing something you have questions about to a pin or separate gadget, you just look at what's made you curious and ask about it. The Frame's camera is placed in the center of the glasses, while Meta's sits on the side, but the overall effect is the same.


For actually processing requests, Brilliant Labs offloads the heavy lifting to its Noa companion app on your phone, which in turn ferries requests to various models running in the cloud. The Noa app is used to set up Frame, tweak its settings, and view a written record of the things you’ve asked it. You can prompt the Frame’s AI assistant to refer to itself by a different name, use a different tone based on a prompt, and even write in a specific way (if, for example, you’d like it to only use iambic pentameter, for some reason).
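To make that division of labor concrete, here's a rough, hypothetical sketch of the relay pattern in Python: the glasses hand a prompt (and optionally a camera frame) to the phone, which forwards it to a cloud model and sends a short text reply back for the display. The endpoint, field names, and relay_request helper are illustrative stand-ins, not Noa's actual internals.

```python
# Hypothetical sketch of a companion-app relay: glasses -> phone -> cloud model.
# The endpoint and payload shape are placeholders, not Brilliant Labs' real API.
import requests

CLOUD_ENDPOINT = "https://example.com/v1/chat"  # placeholder URL


def relay_request(prompt: str, image_bytes: bytes | None = None) -> str:
    """Forward a voice/camera request to a cloud model and return its text reply."""
    payload = {"prompt": prompt}
    files = {"image": image_bytes} if image_bytes else None
    response = requests.post(CLOUD_ENDPOINT, data=payload, files=files, timeout=15)
    response.raise_for_status()
    return response.json()["text"]  # short text, suitable for the Frame's MicroOLED
```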

The Noa app on an iPhone 13 mini held in front of the Brilliant Labs Frame.


The app and Frame’s overall functionality are otherwise bare-bones. The glasses can translate things you can see and display the translation in front of you, answer questions (even pulling from recent web results with a Perplexity integration), and do the basic text generation that all large language models can do, just by tapping and asking. It’s not a lot, but it’s helpful if you’re trying to save yourself a web search or don’t want to pull out your phone. And I imagine with the right apps from willing developers, it could get even more useful.

I asked Frame about the flowers in my house and books I was considering reading, and had it answer questions I'd normally turn to Google for. All the answers were satisfactory, if short, and in my tests, it usually took about three seconds to get a response. That isn't the fastest, but it's still pretty good considering all the distractions I have to sort through on my phone. It helps that the Frame understands context well enough that you can ask follow-up questions without the glasses getting confused.


The only real problem I had with Frame was how frequently it disconnected from the Noa app. I suspect this is a limitation of how iOS handles Bluetooth connections for accessories that aren't an Apple Watch or AirPods, but it's something I wish Brilliant Labs could have found a way around, because it makes using the Frame annoying. Every time the glasses got disconnected, I had to open the Noa app and furiously tap on the side of the Frame to make the "tap me in" prompt reappear on the built-in display. Once I did that, the glasses never failed to reconnect, but having to do it in the first place was still an annoyance.

Open-source software is the right choice

When I interviewed Bobak Tavangar, he was clear that he didn't want Frame software developers to be restricted by the economics and review process of an app store. The Frame is open by default and Brilliant Labs is trying to make the process of creating experiences for the glasses straightforward.



If augmented reality really is the future of computing like some tech companies believe, then having a platform where anyone can try and create an app is a necessity. No one has quite cracked how these devices should be used and what their interfaces should look like. There are form factors that make sense (glasses in this case), and interactions that feel like a given (voice at a bare minimum), but basically everything else is up in the air. And an open-source device gives tinkerers more wiggle room to experiment. I find that really commendable. Everything you need to start making apps for the Frame or create a Frame of your own is open and readily accessible on GitHub.
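If you're curious what getting started looks like, below is a minimal "hello world" sketch. It assumes Brilliant Labs' open-source Python frame-sdk package, its Frame connection helper, and the display.show_text() call; treat the exact names as assumptions and check the GitHub repo for the current API.

```python
# Minimal sketch assuming the frame-sdk Python package from Brilliant Labs' GitHub.
# Frame() manages the Bluetooth LE connection; display.show_text() draws text on
# the MicroOLED. Verify these helper names against the repo before relying on them.
import asyncio

from frame_sdk import Frame


async def main():
    async with Frame() as frame:  # scan for and connect to a nearby pair of glasses
        await frame.display.show_text("Hello from the Frame!")
        await asyncio.sleep(5)  # keep the greeting on screen for a few seconds


asyncio.run(main())
```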

AI and AR are far from finalized

Taking the Frame out of their carrying case.


The “wild west” nature of both AR and AI is kind of a problem if you want to view Frame as a consumer product. There are few standards to govern how AR devices should work and few examples of AI being reliable enough for the average person to use. The limitations of AR software are one thing. I like the potential for experimentation, and developers may try to create interfaces more complicated than the simple text and emoticons that Brilliant Labs uses. That should all come with time.

The limitations of generative AI are harder to pin down. Large language models, even as they have improved, still regularly spit out weird, unoriginal, or straight-up wrong results. It doesn’t seem like that will change. They also aren’t necessarily useful for every task, and still require you to master the basics of prompting to get a good result, something that could take longer than just doing the work for yourself. Frame itself is also voice-first. What happens when you can’t ask for what you want out loud?


Brilliant Labs doesn’t have an answer, and I’ll be curious to see if anyone tries to create their own solutions, whether it’s a controller, or a way to type questions into the Frame directly. AI is what makes this device interesting right now, but I think the thing that will probably have the most staying power is the Frame’s simple approach to augmented reality. We don’t need a lot. Little pieces of contextual information placed into our field of view are enough. Especially if they’re delivered by a device that actually looks like a pair of glasses.


Should you buy the Brilliant Labs Frame?

Know what you’re getting into

The Brilliant Labs Frame hanging from a jacket.


If you're a developer interested in making AR or AI experiences, absolutely consider the Frame. It's a good-looking pair of glasses that seems purpose-built to be friendly to creators. If you're similarly committed to being on the cutting edge, consider the Frame, too; it offers a much more adventurous experience than what you can get from Meta. For everyone else, wait and see.

It's not that the Brilliant Labs Frame are bad at what they set out to do; as a multimodal AI tool, they work exactly as well as you'd expect. I just think their rough edges mean they won't be as approachable as a smartphone or a tablet, and they probably won't be as convenient as your own pair of glasses for the simple fact that you have to charge them. Those are things Brilliant Labs or devoted independent developers can try to fix, but for now, the Frames you can buy are better as a proof of concept than anything else.



