The next major iOS 18 update will deliver the Apple Intelligence-based Visual Intelligence feature to iPhone 15 Pro users, it has emerged.
The latest iOS 18.4 beta includes the visual search feature (via 9to5Mac), which has been available on the best new iPhone 16 models since the launch of iOS 18.2 in December.
Because those phones have the unique Camera Control button, which launches the feature with a long press, it wasn’t clear whether there was an easy path to bring Visual Intelligence to iPhone 15 Pro users, despite their having been promised access to Apple Intelligence features.
When the iPhone 16e arrived last month with access to Visual Intelligence via the Action Button, however, it became clear there was little reason why iPhone 15 Pro owners couldn’t use it too.
Apple reportedly told Daring Fireball that Visual Intelligence would be coming to the iPhone 15 Pro models in a future software update, and it now appears that update will be iOS 18.4, due next month.
As with the iPhone 16e, iPhone 15 Pro users will need to customise their Action Button to launch Visual Intelligence. From there, they’ll be able to point the lens at the content in question, then ask ChatGPT or Google Search for more information.
Interestingly, the beta also lets owners of the iPhone 16 range trigger a Visual Intelligence search via the Action Button, as well as through a tile in the customisable Control Centre quick settings menu.
The feature, which is effectively the iPhone’s answer to Google’s Circle to Search, can be used in a number of ways. You can snap an image of a dog to discover the breed, take a picture of an item and find a similar one for sale online, or scan a poster to have all of the details saved for you.
Unimpressed so far
I have an iPhone 16 Pro Max and, to be honest, Visual Intelligence has been a bit of a let-down so far. There doesn’t seem to be a whole lot about it that feels revolutionary. iPhone models have been able to identify dog breeds for ages now, while Google Lens has long made it easy to find things online that you’ve noticed in the real world. It doesn’t feel like next-gen AI to me. That being said, I haven’t enabled ChatGPT and don’t plan to either.