How Enhanced Visual Search on iPhone upgrades the Photos app and protects your privacy


Apple’s Photos app employs multiple features to help you find images in your library and learn more about what’s shown in those images. One of those features is called Enhanced Visual Search. Here’s how it works and how Apple protects your privacy when you use it.

One key distinction to make is how Enhanced Visual Search differs from Apple’s Visual Look Up feature. Visual Look Up was introduced as part of iOS 15 and allows users to identify objects, landmarks, plants, and more in the Photos app.

For example, you can swipe up on an image in the Photos app to learn what dog breed is shown in the image. It can even recognize things like laundry care instructions on your clothes and what those random symbols on your car’s dashboard mean.

Enhanced Visual Search exists separately from Visual Look Up. Whereas Visual Look Up helps you find details on a single photo you’re already looking at, Enhanced Visual Search helps you find all the photos in your library when you search for a landmark or place. The feature works even when those photos don’t have geolocation data.

For example, you can search your library for “Golden Gate Bridge” and see relevant images from your library. The feature even works if the landmark is blurry and out of focus in the background of an image.

How does Enhanced Visual Search protect your privacy?

Earlier this month, headlines made the rounds about how Enhanced Visual Search sends your location information to Apple to help find those landmarks and points of interest. In the Settings app, Apple says: “Allow this device to privately match places in your photos with a global index maintained by Apple so you can search by almost any landmark or point of interest.”

This naturally raised questions about the privacy implications, particularly because the feature is opt-out rather than opt-in.

Apple, however, has a well-thought-out privacy pipeline in place that it says protects your data when Enhanced Visual Search kicks into action.

This process starts with something called homomorphic encryption, which works like this (a toy code sketch follows the list):

  • Your iPhone encrypts a query before sending it to a server.
  • The server operates on the encrypted query and generates a response.
  • That response is sent back to your iPhone, where it is decrypted.
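To make this concrete, here is a toy sketch in Swift. It is not Apple’s scheme and offers no real security; it only demonstrates the homomorphic property, where a server can add to a ciphertext it cannot read, and the operation survives decryption on the device.

```swift
// Toy additively homomorphic scheme: a one-time pad over modular
// arithmetic. Illustrative only; this is not Apple's scheme and is
// not secure for real use.
struct ToyHEClient {
    let modulus: UInt64 = 1_000_003   // public parameter
    let key: UInt64                   // secret key, never leaves the device

    init() {
        key = UInt64.random(in: 0..<1_000_003)
    }

    // Enc(m) = (m + key) mod n, for messages smaller than the modulus
    func encrypt(_ message: UInt64) -> UInt64 {
        (message + key) % modulus
    }

    // Dec(c) = (c - key) mod n
    func decrypt(_ ciphertext: UInt64) -> UInt64 {
        (ciphertext + modulus - key) % modulus
    }
}

// The server operates directly on the ciphertext. It never sees the
// key, so it never sees the plaintext, yet the addition carries
// through to the decrypted result.
func serverAdd(_ ciphertext: UInt64, _ constant: UInt64, modulus: UInt64) -> UInt64 {
    (ciphertext + constant) % modulus
}

let client = ToyHEClient()
let query = client.encrypt(42)                             // device encrypts
let reply = serverAdd(query, 7, modulus: client.modulus)   // server computes blindly
print(client.decrypt(reply))                               // prints 49
```

Apple’s production implementation uses far heavier lattice-based cryptography (the company has open-sourced its swift-homomorphic-encryption library), but the flow is the same: only the device ever holds the key.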

Importantly, in Apple’s implementation, only your devices have the decryption key, not the server. The server is therefore unable to decrypt the original request. Apple uses homomorphic encryption for various features, including Enhanced Visual Search.

Apple also employs something called private nearest neighbor search, or PNNS, for Enhanced Visual Search. This feature enables a user’s device to privately query “a global index of popular landmarks and points of interest maintained by Apple to find approximate matches for places depicted in their photo library.”
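Stripped of the encryption, a nearest neighbor search simply finds the database vector most similar to a query vector. The Swift sketch below shows that underlying computation in plaintext; the landmark names and embedding values are invented for illustration. In PNNS proper, the server runs the equivalent arithmetic on an encrypted embedding it cannot read.

```swift
// Plaintext nearest neighbor search over embedding vectors. In PNNS
// the equivalent arithmetic runs on homomorphically encrypted data;
// this sketch only shows the underlying computation.

func cosineSimilarity(_ a: [Double], _ b: [Double]) -> Double {
    let dot = zip(a, b).map { $0 * $1 }.reduce(0, +)
    let normA = a.map { $0 * $0 }.reduce(0, +).squareRoot()
    let normB = b.map { $0 * $0 }.reduce(0, +).squareRoot()
    return dot / (normA * normB)
}

// A miniature stand-in for Apple's global index of landmarks.
let landmarkIndex: [(name: String, embedding: [Double])] = [
    ("Golden Gate Bridge", [0.91, 0.10, 0.33]),
    ("Eiffel Tower",       [0.12, 0.88, 0.20]),
    ("Sydney Opera House", [0.25, 0.31, 0.90]),
]

// Embedding computed on-device from the photo's region of interest
// (placeholder values).
let queryEmbedding: [Double] = [0.89, 0.15, 0.30]

// Return the index entry most similar to the query.
let best = landmarkIndex.max { lhs, rhs in
    cosineSimilarity(lhs.embedding, queryEmbedding) <
        cosineSimilarity(rhs.embedding, queryEmbedding)
}
print(best?.name ?? "no match")   // "Golden Gate Bridge"
```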

Here is the full pipeline for an Enhanced Visual Search request, as Apple outlines on its Machine Learning website (a client-side sketch follows the list):

  • An on-device machine learning model analyzes a photo to determine if there is a “region of interest,” or ROI, that may contain a landmark.
  • If the model detects an ROI, it calculates a “vector embedding” for that part of the image.
  • That vector embedding is then encrypted and sent to a server database. The photo and its pixels are never sent to Apple, only a mathematical representation of that “region of interest.”
  • Apple uses differential privacy combined with an OHTTP (Oblivious HTTP) relay, operated by a third party, to hide the device’s IP address before the request reaches Apple’s servers.
  • The client also issues “fake queries alongside its real ones, so the server cannot tell which are genuine.” Additionally, queries are routed through an anonymization network to ensure the server can’t link multiple requests to the same client.
  • The server identifies the relevant part of the embeddings and returns corresponding metadata, like landmark names, to the device. The server doesn’t retain the data after the results are returned to your iPhone.
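To tie those steps together, here is a hypothetical Swift sketch of the client’s side of such a pipeline. Every function name here (detectRegionOfInterest, computeEmbedding, homomorphicallyEncrypt) is a placeholder, not an Apple API; the point is the shape of the flow: detect, embed, encrypt, pad with decoys, and ship through a relay.

```swift
import Foundation

// Hypothetical client-side flow. None of these functions are Apple's
// actual APIs; they are placeholders for each step of the pipeline.

struct EncryptedQuery {
    let payload: Data
    let isDecoy: Bool   // known only to the client; not part of the payload
}

// Steps 1-2: the on-device model finds a region of interest and
// computes its vector embedding (both stubbed here).
func detectRegionOfInterest(in photo: Data) -> Data? {
    photo.isEmpty ? nil : photo
}
func computeEmbedding(for roi: Data) -> [Double] {
    [0.89, 0.15, 0.30]   // placeholder values
}

// Step 3: encrypt the embedding before it leaves the device
// (stand-in serialization here, no real cryptography).
func homomorphicallyEncrypt(_ embedding: [Double]) -> Data {
    Data(embedding.flatMap { withUnsafeBytes(of: $0) { Array($0) } })
}

// Steps 4-5: batch the real query with random decoys and shuffle,
// so the server cannot tell which query is genuine.
func buildQueryBatch(for photo: Data, decoyCount: Int) -> [EncryptedQuery] {
    guard let roi = detectRegionOfInterest(in: photo) else { return [] }
    let real = EncryptedQuery(
        payload: homomorphicallyEncrypt(computeEmbedding(for: roi)),
        isDecoy: false
    )
    let decoys = (0..<decoyCount).map { _ in
        EncryptedQuery(
            payload: homomorphicallyEncrypt((0..<3).map { _ in Double.random(in: -1...1) }),
            isDecoy: true
        )
    }
    return ([real] + decoys).shuffled()
}

// In the real pipeline, this batch would travel through a third-party
// OHTTP relay that strips the device's IP before reaching Apple.
let batch = buildQueryBatch(for: Data([0x01]), decoyCount: 3)
print("Sending \(batch.count) indistinguishable queries")
```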

That’s a lot of words and acronyms, but it backs up Apple’s claim that the company cannot learn anything about the information in your photos.

You can disable Enhanced Visual Search in the Settings app: tap “Apps,” then “Photos,” and scroll to the bottom, where you’ll find a toggle for Enhanced Visual Search. Apple says this toggle should primarily be used in low-data situations.

