Apple Intelligence on iPhone is now $200 cheaper — with just two small trade-offs


Want a new iPhone with Apple Intelligence? Before today, that would set you back about $800. After Apple’s latest iPhone announcement, however, the cost to access Apple Intelligence on iPhone is 25% less. There are just two very minor caveats.

Apple Intelligence is a suite of features powered by artificial intelligence. Apple builds its AI tools with privacy in mind, using a stateless server system called Private Cloud Compute. This privacy-centric approach differs from other modern AI systems that retain data on the server and use it to train models.

Top Apple Intelligence features include Writing Tools for improving or changing your writing style, Clean Up for removing unwanted subjects from photos, and Genmoji for creating custom emoji characters based on your description.

Visual Intelligence

Visual Intelligence is another component of Apple Intelligence. While most Apple Intelligence features are available on the iPhone, iPad, and Mac, Visual Intelligence has so far been exclusive to the iPhone 16 and iPhone 16 Pro, with no support on the iPhone 15 Pro or iPad, where Apple Intelligence is otherwise available.

That’s mostly because of marketing and product differentiation.

iPhone 16 and iPhone 16 Pro have a new button called Camera Control. Pressing and holding the Camera Control is how you activate Visual Intelligence.

Once invoked, Visual Intelligence lets you use your rear-facing camera to aim at an object to reference in a ChatGPT query or Google visual search. More recently, Apple added the ability to add events to your calendar based on things like concert posters and board messages.

That’s where the first minor caveat for Apple Intelligence on the new iPhone 16e arises.

iPhone 16e has no Camera Control like the other iPhone 16 models, but Apple still supports the Visual Intelligence feature. The easiest way to invoke Visual Intelligence on iPhone 16e is to assign it to the Action button. The downside, however, is that doing this pretty much rules out using the Action button for anything else, like launching the Camera or muting your iPhone.

Fortunately, there’s another way to access Visual Intelligence on iPhone 16e without using the Action button, though it requires a few more steps. Rather than pressing a physical button, you can invoke Visual Intelligence through Control Center. For now, iPhone 16e is the only product with Visual Intelligence in Control Center. Maybe we’ll see this come to the iPad in the future. I wouldn’t wait around for Visual Intelligence to come to the iPhone 15 Pro.

Glowing Edge Light

Visual Intelligence aside, there’s one more extremely minor difference between Apple Intelligence on iPhone 15 Pro, iPhone 16, and iPhone 16 Pro compared to iPhone 16e.

Before today, each iPhone with Apple Intelligence featured the Dynamic Island, Apple’s adaptive interface around the camera sensor near the top of the display. iPhone 16e features the less modern notch design.

For that reason, the Apple Intelligence feature for Siri that Apple calls Glowing Edge Light appears a little differently on iPhone 16e than on other Apple Intelligence iPhones. Instead of creating a uniform rainbow glow effect around the perimeter of the iPhone’s display, the Glowing Edge Light wraps around the notch on the iPhone 16e.

The Glowing Edge Light serves no practical purpose other than signaling that your iPhone is using the Apple Intelligence version of Siri rather than classic Siri. How it appears is even more trivial, unless you care about perfect symmetry, in which case you’ll need to hand over another $200 to $400.

Aside from these two minor differences, Apple Intelligence should work exactly the same on iPhone regardless of which model you buy.

FTC: We use income earning auto affiliate links. More.

