Hands On With iOS 18.2


Summary

  • iOS 18.2 adds several exciting new features to Apple Intelligence, which is now available to everyone with a supported iPhone.
  • These include Image Playground (for creating images), Genmoji (for creating custom emojis), ChatGPT integration for Siri, and Visual Intelligence (a camera-based visual lookup).
  • Though the features are fun and have some genuine utility, Apple Intelligence still needs some refinement.

A new iOS update means new Apple Intelligence features, but are they any good? I took the AI-infused additions for a spin, and while some impressed me, others still need work.

Image Playground Is Good but Limited

Screenshot of the Image Playground feature in action on an iPhone's screen.
Apple

After updating to iOS 18.2, you’ll see a new app on your Home Screen, with what appears to be a kitten’s face as the app icon. This is the new Image Playground app, Apple’s take on text-to-image generators. Unlike other prompt-based image creators, Image Playground restricts itself to stylized, cartoon-like images rather than photorealistic ones.

The first thing I did after opening the app was type “dog” into the text field at the bottom, and in a few seconds, the app generated an image of a cute dog (more like a puppy). Then, I tapped the “baseball cap” option from the suggestions, and the app changed the entire theme of the picture, adding a stadium in the background and putting a baseball cap and matching outfit on the dog.

Three screenshots of the Image Playground app on iOS 18.2, showcasing the picture of a dog.

The app can also handle longer descriptions like “a dog in a kitchen eating pie from the shelf” or “a flower on a mountain with blue petals.” You can also create an AI-generated animated picture of someone you know using their photos on your iPhone. I really like that the app lets me pick among multiple photos of that person, since each one changes the appearance of the animated avatar.

It is worth mentioning that the app offers two image styles (as you can see below, Animation is on the left and Illustration is on the right), and to me, Animation looks better. At any given point, Image Playground presents multiple renditions of the same picture. Further, you can add up to six suggestions.

Screenshots of the pictures created by the Image Playground app, with the animated picture on the left and the illustrated picture on the right.

Even though the Image Playground app is good at what it does, I don’t see myself frequently entering prompts and creating images.
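
If you’re curious how other apps will plug into this, Apple also ships a developer-facing ImagePlayground framework alongside iOS 18.2 that lets third-party apps present the same generator as a sheet. The SwiftUI sketch below is a rough illustration based on my reading of Apple’s developer documentation; the `imagePlaygroundSheet` modifier, its parameter names, and the URL-based completion handler are assumptions about that API rather than something tested here.

```swift
import SwiftUI
import ImagePlayground  // Apple's developer framework for the Image Playground generator

struct PlaygroundDemoView: View {
    @State private var showPlayground = false
    @State private var generatedImageURL: URL?

    var body: some View {
        VStack(spacing: 16) {
            // Show the most recently generated image, if there is one.
            if let url = generatedImageURL,
               let data = try? Data(contentsOf: url),
               let image = UIImage(data: data) {
                Image(uiImage: image)
                    .resizable()
                    .scaledToFit()
            }

            Button("Create an image") {
                showPlayground = true
            }
        }
        // Presents the system Image Playground sheet seeded with a text concept.
        // Requires iOS 18.2 on a device that supports Apple Intelligence; the exact
        // parameter names here are recalled from Apple's docs and may differ slightly.
        .imagePlaygroundSheet(
            isPresented: $showPlayground,
            concept: "a dog in a baseball cap at a stadium",
            onCompletion: { url in
                // The sheet hands back a file URL for the finished image.
                generatedImageURL = url
            }
        )
    }
}
```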

Genmoji Is My New Favorite Apple Intelligence Feature

Remember Google’s Emoji Kitchen? Well, Genmoji is Apple’s answer to that. The feature uses Image Playground to create custom emojis based on your input, though it doesn’t have a separate app. Instead, it is built into the iOS keyboard.

Screenshot of iOS 18.2's Genmoji feature in action.

To use the feature, open the iOS keyboard in any app, tap the emoji icon at the bottom left, and then hit the colorful emoji icon at the top right. Now, you can start typing a description of the emoji you wish to create, and within a few seconds, Apple Intelligence generates one for you, fresh out of the Image Playground oven.

There isn’t much to talk about here, as the feature usually works fine. I did notice a couple of glitchy responses, but re-entering the prompt solved the issue immediately. One thing worth mentioning for both the Image Playground app and the Genmoji feature is that choosing an appropriate reference photo is vital to getting the right skin tone in the result.

Beyond that, Genmoji is a fun little addition to the iOS keyboard. In fact, I’ve been using it every other day to lighten up group conversations or to tease friends who are still using the iPhone 13 or who recently bought an iPhone 15 (as these phones don’t support the new Apple Intelligence features).
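
A side note for anyone building a messaging app: if I read Apple’s documentation correctly, Genmoji only appear in third-party text fields that opt in, because each one travels as an NSAdaptiveImageGlyph attachment inside attributed text. The UIKit sketch below is a minimal, untested example assuming that opt-in flag on UITextView.

```swift
import UIKit

final class ChatComposerViewController: UIViewController {
    private let composer = UITextView()

    override func viewDidLoad() {
        super.viewDidLoad()

        // Genmoji are stored as NSAdaptiveImageGlyph attachments in attributed text,
        // so the text view needs attributed editing enabled plus the opt-in below.
        composer.allowsEditingTextAttributes = true
        if #available(iOS 18.0, *) {
            // Lets the emoji keyboard offer Genmoji input inside this text view.
            composer.supportsAdaptiveImageGlyph = true
        }

        composer.font = .preferredFont(forTextStyle: .body)
        composer.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(composer)
        NSLayoutConstraint.activate([
            composer.topAnchor.constraint(equalTo: view.safeAreaLayoutGuide.topAnchor, constant: 16),
            composer.leadingAnchor.constraint(equalTo: view.leadingAnchor, constant: 16),
            composer.trailingAnchor.constraint(equalTo: view.trailingAnchor, constant: -16),
            composer.heightAnchor.constraint(equalToConstant: 120),
        ])
    }
}
```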

Siri’s ChatGPT Integration Can Answer Your Questions

With ChatGPT integration, Siri can answer complicated queries with ease, like those that involve compiling information from multiple sources.

For instance, asking Siri about “using salicylic and glycolic acid serums” now returns a proper response (compared with the few random web links older versions offered). You can also ask Siri questions about what you should pack for a vacation to a particular location or what CRR means in banking, eliminating the need to open Safari and perform a quick Google search.

Screenshots of Siri responses to complex questions (via ChatGPT).

However, a few things stand out. First, Siri takes longer than usual to answer a question using ChatGPT (given that it has to reach OpenAI's GPT-4o model). Second, you must be online. While using the feature, I noticed that Siri failed to invoke ChatGPT a few times, but re-asking or rephrasing the question solved the issue.

ChatGPT integration is the much-awaited Siri update iPhone users deserve, but the feature still needs some refinement.

Use Visual Intelligence to Learn More via the Camera

Visual Intelligence is a new visual-look-up feature for iPhone that combines the search process of Google Lens with ChatGPT’s reasoning capabilities.

To invoke Visual Intelligence, press and hold the Camera Control button (finally, something that makes sense for the button), and the camera interface shows up with a rainbow-colored glow. Now, point the camera toward an object and press the on-screen shutter or the Camera Control button to capture a picture.

iPhone held in hand, with Visual Intelligence used to look up a restaurant in the camera viewfinder.
Apple

Then, you can either look up the object via Google by pressing Search or send the picture to ChatGPT by selecting Ask. Either way, the feature takes a few seconds to respond. I used the Search option on a slinky I had lying around, and it fetched a couple of Amazon links to similar products. Google also shows the top keyword for the item in the picture (at the top-right corner of the results).

Next, I used the feature to ask ChatGPT about the rings on my Apple Watch screen. Within a second, it told me that the rings represent a user’s fitness data, including exercise time, movement data, and stand hours, which is correct.

I also used the feature to learn about Clue, the popular mystery-solving card game. ChatGPT’s explanation was just enough to understand the basic premise.

Screenshots of the new Visual Intelligence feature in action on the iPhone 16, with the Search option on the left and the Ask option on the right.

However, while testing the feature, I got a prompt from ChatGPT, stating, “You’ve reached your daily limit for ChatGPT’s advanced capabilities. Additional requests will use the basic version for up to 24 hours.”

Apparently, Apple’s deal with OpenAI provides limited access to GPT-4o, and once you exhaust the limit (whether by asking questions via Siri or Visual Intelligence), the chatbot falls back to its basic version until that timer resets.

Screenshot of the ChatGPT limitation prompt that showed up while testing Visual Intelligence.

You can also use the feature to learn more about a place, check its Google reviews, extract text from a billboard or newspaper, and open web links and QR codes. While Visual Intelligence gives the Camera Control button a meaningful purpose, it also closes the gap between the image-search features long available on Android and what the iPhone offers.


Apple’s latest iOS update is all about AI. Features like Genmoji, ChatGPT integration, and Visual Intelligence really elevate the Apple Intelligence experience on supported iPhones (bringing them a step closer to Samsung’s Galaxy AI). This makes iOS 18.2 a worthy update for enthusiasts.

If you’re fed up with all the new features, you can always turn Apple Intelligence off.


