Apple’s AI vision via AirPods camera is in active development


Apple is reportedly bolstering future Apple Intelligence efforts by developing AirPods with built-in cameras.

One of the many ways Apple has moved Apple Intelligence forward is the introduction of Visual Intelligence, a feature that uses the iPhone 16's camera to identify real-world items and retrieve details about them. However, Apple is also considering a version that doesn't involve pointing an iPhone camera at things in the first place.

According to Mark Gurman in Sunday’s newsletter for Bloomberg, Apple is “actively developing” a product that can combine AirPods with cameras.

The idea is for the cameras to provide data about the surrounding environment for use in AI features. For example, a query asking where the user is located could use scans of nearby signage to determine an approximate position, or the user could be told which way a shop is based on storefront imagery.

Gurman doesn't go into detail, but does state that the earbuds are in development at the company.

Better than smart glasses

The development of cameras in AirPods may sound like an unusual approach, but it is a concept that has surfaced in rumors multiple times. Aside from being raised by Gurman in late January, camera-equipped AirPods also appeared in claims from December and October, with a potential arrival anticipated two to three years away.

The cameras may not necessarily be full-color versions, but could instead use infrared sensors for depth mapping. Such a system could be geared more toward navigation, and would potentially use less power than full video capture.

While cameras may seem like a more natural fit for smart glasses, such as the repeatedly rumored Apple Glass, adding them to AirPods could be a better move overall. For a start, the viewing angle could be much broader than the typical front-facing camera found in existing smart glasses.

Weight is also a major factor in smart glasses design. Adding hardware and batteries makes a typically lightweight item heavier, which can become increasingly uncomfortable to wear over long periods.

There's already one Apple audio product that adds an extra feature as a minor tangent. The Beats Powerbeats Pro 2 include a heart rate sensor in each earbud, feeding data back to an iPhone, in a form factor that can feasibly carry extra weight thanks to its earhook.

There's also the possibility of users enjoying smart glasses-style features without needing to wear glasses of any form. Regular spectacle wearers, at least, wouldn't need to swap out their usual frames.

Not everyone particularly likes the idea of wearing glasses at all, hence the existence of contact lenses as an alternative. The term "glasshole" also springs to mind here.

By offering most of the same functionality, barring the visual elements, smart earbuds that can "see" the environment and respond with audio feedback could be a viable alternative to smart glasses.

If Apple can capitalize on the idea and make it work enough to impress the public, it could be onto a winner.
