iOS 18.4 added ‘intelligence’ to more iPhone cameras, but it’s just the start


The iPhone’s camera has long been one of its most important features. But late last year, Apple added a new capability that seeks to boost the camera’s importance even further: visual intelligence. Then in iOS 18.4, Apple brought the feature to many more iPhone cameras—but reportedly, it’s just getting started.

Visual intelligence just upgraded many iPhone cameras thanks to iOS 18.4

Visual intelligence is an AI feature that utilizes the iPhone’s camera. Here’s how Apple describes it:

Use visual intelligence to quickly learn more about the places and objects around you…look up details about a restaurant or business; have text translated, summarized, or read aloud; identify plants and animals; and more.

When visual intelligence first launched in iOS 18.2, it was exclusive to iPhones with a Camera Control button, meaning it only worked on the four iPhone 16 flagship models.


But in iOS 18.4, Apple expanded the feature to:

  • iPhone 15 Pro
  • iPhone 15 Pro Max
  • iPhone 16e

Any of these devices can use the feature by adding visual intelligence to the Action button, Lock Screen, or Control Center.

This expansion, which follows the new capabilities the feature gained in iOS 18.3, is just the start of Apple adding ‘intelligence’ to its devices’ cameras.

Cameras in AirPods and Apple Watch will continue visual intelligence expansion


Visual intelligence offers a variety of useful benefits today.

But Apple’s ambitions are apparently much bigger.

Per Mark Gurman, “Apple’s ultimate plan for Visual Intelligence goes far beyond the iPhone.”

He continues:

The company wants to put the feature at the core of future devices, including the camera-equipped AirPods that I’ve been writing about for several months. Along the way, Apple also wants to shift Visual Intelligence toward its own AI models, rather than those from OpenAI and Google.

But Apple’s vision for AI wearables goes even further. The company is working on new versions of the Apple Watch that include cameras. As with the future AirPods, this would help the device see the outside world and use AI to deliver relevant information.

Apple’s plans to add cameras to future AirPods and Apple Watch models are apparently deeply tied to its plans for visual intelligence.

The strategy seems to be two-fold:

  1. Add new features to visual intelligence over time, as iOS 18.3 did
  2. Bring ‘intelligent’ cameras to more devices over time, as iOS 18.4 did

Whether you find visual intelligence useful today or not, it seems Apple is just getting started with the feature.

In a few years, perhaps visual intelligence will be an indispensable part of not only our iPhones, but many of our Apple devices.

Do you use visual intelligence on your iPhone today? What should Apple do to improve it? Let us know in the comments.
