What's happening on your iPhone might not be staying on your iPhone after iOS 18
Apple has quietly introduced a feature called Enhanced Visual Search that's toggled on by default and works by sending information about your photos to the company's servers.
The Verge explains that the feature will identify a specific location when you swipe up on a photo you have taken of a building and tap "Look Up Landmark." A card will pop up with the name of the landmark and a link to a related article.
An on-device machine learning (ML) model analyzes a photo and decides whether it contains a region of interest (ROI) that may include a landmark. If it does, an encrypted request is sent to Apple's server, which performs the lookup behind privacy safeguards such as homomorphic encryption and differential privacy and sends candidate landmarks back to the iPhone. An on-device reranking model then picks the best candidate.
The photo's metadata is then updated with the landmark label, allowing you to easily find it by searching for its name on your iPhone.
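To make those steps concrete, here is a minimal Swift sketch of the client-side flow. Every type and function in it is a hypothetical stand-in rather than Apple's actual API; a real implementation would involve an on-device vision model, a homomorphic encryption library, and an OHTTP networking stack.

```swift
import Foundation

// Hypothetical client-side flow. None of these types or functions exist in
// Apple's SDK; they are stand-ins for the steps described above.

struct RegionOfInterest { let embedding: [Float] }       // vector describing the cropped region
struct Candidate { let name: String; let score: Float }  // landmark candidate from the server

// Step 1: an on-device model decides whether the photo contains a likely landmark.
func detectRegionOfInterest(in photo: Data) -> RegionOfInterest? {
    // Placeholder: a real implementation would run a vision model here.
    return RegionOfInterest(embedding: [0.12, 0.87, 0.44])
}

// Step 2: the embedding is encrypted on-device so the server can search
// over it without ever seeing the plaintext (homomorphic encryption).
func encrypt(_ roi: RegionOfInterest) -> Data {
    // Placeholder for a homomorphic-encryption library call.
    return roi.embedding.withUnsafeBufferPointer { Data(buffer: $0) }
}

// Step 3: the encrypted query goes out through an OHTTP relay that hides the
// device's IP address; the server replies with encrypted candidate landmarks.
func queryLandmarkIndex(_ encryptedQuery: Data) -> [Candidate] {
    // Placeholder: network round-trip plus on-device decryption of the reply.
    return [Candidate(name: "Eiffel Tower", score: 0.91),
            Candidate(name: "Tokyo Tower", score: 0.43)]
}

// Step 4: an on-device reranking model picks the best candidate; the winning
// label is what ends up in the photo's metadata.
func bestLandmark(for photo: Data) -> String? {
    guard let roi = detectRegionOfInterest(in: photo) else { return nil }  // no landmark-like region
    let candidates = queryLandmarkIndex(encrypt(roi))
    return candidates.max(by: { $0.score < $1.score })?.name
}

print(bestLandmark(for: Data()) ?? "no landmark")  // prints "Eiffel Tower" with the stub data
```

Apple's own description of the feature reads: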
Enhanced Visual Search in Photos allows you to search for photos using landmarks or points of interest. Your device privately matches places in your photos to a global index Apple maintains on our servers. We apply homomorphic encryption and differential privacy, and use an OHTTP relay that hides IP address. This prevents Apple from learning about the information in your photos.
Apple
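The phrase "homomorphic encryption" is doing the heavy lifting in that description: it lets a server compute on data it cannot decrypt. The toy Swift snippet below illustrates only that property; its shift-by-a-secret-key "cipher" is deliberately insecure and bears no resemblance to the lattice-based BFV scheme that Apple's open-source swift-homomorphic-encryption package implements.

```swift
// Deliberately insecure toy: "encryption" here is modular addition of a secret
// key. It only demonstrates the homomorphic idea, i.e. computing on ciphertext.

let modulus = 1_000_003
let key = Int.random(in: 1..<modulus)  // known only to the device

func encrypt(_ message: Int) -> Int { (message + key) % modulus }
func decrypt(_ ciphertext: Int) -> Int { (ciphertext - key + modulus) % modulus }

// The "server" adds a constant to the ciphertext without being able to read it.
func serverAdd(_ ciphertext: Int, _ constant: Int) -> Int { (ciphertext + constant) % modulus }

let c = encrypt(42)              // the device encrypts its value
let processed = serverAdd(c, 8)  // the server computes on the ciphertext
print(decrypt(processed))        // 50: the server computed 42 + 8 without ever seeing 42
```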
As software engineer Jeff Johnson argues in his blog post, this behavior is at odds with the security and privacy practices the company professes.
Enhanced Visual Search lets your phone identify landmarks in your photos and search for those pictures using the landmarks' names. | Image Credit - Apple
The Enhanced Visual Search setting rolled out with Apple's latest operating system versions for Macs (macOS 15) and iPhones (iOS 18), which were released on September 16, 2024.
However, Apple first mentioned Enhanced Visual Search only on October 24, and even then discreetly. The feature allows your iPhone to identify photos of places, or more specifically landmarks, in your photo library.
Enhanced Visual Search is enabled by default on iPhones and Macs running the latest OS. | Image Credit - Lapcat Software.
Enhanced Visual Search for photos, which allows a user to search their photo library for specific locations, like landmarks and points of interest, is an illustrative example of a useful feature powered by combining ML with HE and private server lookups. Using PNNS, a user’s device privately queries a global index of popular landmarks and points of interest maintained by Apple to find approximate matches for places depicted in their photo library.
Apple, October 2024
To determine the name of any given landmark, your iPhone queries a global index of well-known landmarks and points of interest maintained by Apple to find matches for the places in your library.
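At its core, the server's job is nearest-neighbor search over that index; the privacy machinery ensures it happens blind. The Swift sketch below shows the plain, non-private version of that comparison, with made-up names and vectors. In Apple's PNNS, the equivalent similarity math runs over encrypted embeddings, so the server never sees the query.

```swift
// Plain nearest-neighbor search over a tiny landmark index, to show what the
// server conceptually computes. All names and vectors here are invented.

struct LandmarkEntry { let name: String; let embedding: [Float] }

func dot(_ a: [Float], _ b: [Float]) -> Float {
    var sum: Float = 0
    for (x, y) in zip(a, b) { sum += x * y }
    return sum
}

// Cosine similarity: 1.0 means same direction, 0.0 means unrelated.
func cosineSimilarity(_ a: [Float], _ b: [Float]) -> Float {
    dot(a, b) / (dot(a, a).squareRoot() * dot(b, b).squareRoot())
}

let index: [LandmarkEntry] = [
    LandmarkEntry(name: "Golden Gate Bridge", embedding: [0.9, 0.1, 0.2]),
    LandmarkEntry(name: "Sydney Opera House", embedding: [0.1, 0.8, 0.3]),
]

let query: [Float] = [0.85, 0.15, 0.25]  // embedding of the photo's region of interest

// Pick the index entry whose embedding is most similar to the query.
let match = index.max { cosineSimilarity($0.embedding, query) < cosineSimilarity($1.embedding, query) }
print(match?.name ?? "no match")  // "Golden Gate Bridge"
```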
While there are plenty of use cases for the feature and many people may find it useful, it shouldn't have been enabled by default, considering it sends information derived from your photos to Apple.
Making matters worse is the fact that Apple never publicly announced the feature. And in case you are wondering, it is not part of Apple Intelligence: I found the setting on my iPhone 14 Pro, which cannot run Apple Intelligence, and disabled it right away.