Tuesday, Jan 26, 2021, 13:16 Software

A Look At The Photos App's Intelligent Image Analysis

In the age of the smartphone, snapping a quick picture only takes the tap of a finger – and the resulting number of images in a smartphone's photo library has become almost impossible to keep track of. A recent whitepaper from Apple sheds light on some of the background processes, like image analysis, involved in simplifying the task of sorting an iPhone user's photo library. Of course, it's always possible to view and scroll through the images in chronological order, but Apple also offers several other ways to sort and process photos.

What The Photos App Analyses
For the app to make specific recommendations in the "For You" tab, or to allow users to search for specific image content, several analyses take place. Apple documents several of the parameters under which this analysis happens:

  • Well-known shapes like airplanes, bicycles, landscapes, cities, and hundreds of other objects are automatically detected; users can enter a word such as "guitar" in the search tab and be shown any images containing guitars.
  • The lighting, sharpness, colors, composition, and symmetry of an image are evaluated according to general quality criteria.
  • Photos are categorized by subject – for example, people, dogs, or cats.
  • Photos are analyzed according to a quality index for facial recognition (lighting, focus), including an interpretation of the facial expression.
  • The app also categorizes videos according to auditory input, for example, applause, laughter, etc.
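A keyword search over detected objects can be thought of as an inverted index from labels to photos. The sketch below is a minimal illustration in Python – the labels and photo IDs are invented, and Apple's on-device implementation is of course far more sophisticated:

```python
from collections import defaultdict

def build_label_index(photo_labels):
    """Build an inverted index mapping each detected label to photo IDs.

    photo_labels: dict of photo_id -> set of labels, standing in for
    the output of on-device object classification.
    """
    index = defaultdict(set)
    for photo_id, labels in photo_labels.items():
        for label in labels:
            index[label.lower()].add(photo_id)
    return index

def search(index, query):
    """Return the photo IDs whose detected labels match the query term."""
    return sorted(index.get(query.lower(), set()))

# Example: three photos with hypothetical classifier output
photos = {
    "IMG_0001": {"guitar", "person"},
    "IMG_0002": {"bicycle", "city"},
    "IMG_0003": {"guitar", "stage"},
}
index = build_label_index(photos)
print(search(index, "guitar"))  # ['IMG_0001', 'IMG_0003']
```

Searching for "guitar" then returns exactly the photos whose label sets contain that term, regardless of capitalization.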

The Photos app also records the preferences of the respective user and lets that behavior influence its assessments. This makes it possible for the app to offer individualized search results and to display "Memories" – collections of photos the app interprets as meaningful to the user. When sorted by "Days", the Photos app can hide duplicate photos or show only those of the highest quality – providing the user with a better overview.
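Hiding duplicates while keeping the best shot boils down to picking the highest-scoring photo from each group of near-duplicates. A minimal sketch, assuming each photo already carries a hypothetical duplicate-group ID and a quality score from the kind of analysis described above:

```python
def best_of_each_group(photos):
    """Keep only the highest-scoring photo from each duplicate group.

    photos: list of dicts with 'id', 'group' (an invented
    near-duplicate cluster ID), and 'quality' (a 0.0-1.0 score).
    """
    best = {}
    for photo in photos:
        current = best.get(photo["group"])
        if current is None or photo["quality"] > current["quality"]:
            best[photo["group"]] = photo
    return [p["id"] for p in best.values()]

# Two near-identical sunset shots and one cat photo
day = [
    {"id": "IMG_0101", "group": "sunset", "quality": 0.62},
    {"id": "IMG_0102", "group": "sunset", "quality": 0.91},
    {"id": "IMG_0103", "group": "cat", "quality": 0.77},
]
print(best_of_each_group(day))  # ['IMG_0102', 'IMG_0103']
```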



The Best Photos of The Day
The best photos of the day are chosen according to the criteria mentioned above and also via a relevance analysis. Apple indicates that scene recognition plays an important role in this process. For example, if a user takes a large number of pictures throughout a day but snaps only a few at an event, the app can recognize the event pictures as the most important of the day – despite there being fewer of them. Photo composition and motifs are also central criteria, although the algorithm learns when an individual's preferences deviate from the standard criteria. Photos that have been marked as favorites, edited, or sent to others receive a higher rating. The same applies to images sorted by "Months" or "Years", in which specific events are identified and then given a higher rating – birthdays and conferences are good examples.
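The boost for favorited, edited, or shared photos can be pictured as a simple additive score on top of the quality index. The weights below are invented purely for illustration – Apple does not publish its actual values:

```python
def relevance_score(photo):
    """Toy relevance score: quality plus boosts for user interaction.

    The boost values are invented for illustration; the real
    weighting is not documented.
    """
    score = photo["quality"]
    if photo.get("favorited"):
        score += 0.3
    if photo.get("edited"):
        score += 0.2
    if photo.get("shared"):
        score += 0.2
    return score

day = [
    {"id": "IMG_0201", "quality": 0.8},
    {"id": "IMG_0202", "quality": 0.6, "edited": True, "shared": True},
]
ranked = sorted(day, key=relevance_score, reverse=True)
print([p["id"] for p in ranked])  # ['IMG_0202', 'IMG_0201']
```

Note how the edited-and-shared photo outranks the technically better one – mirroring the article's point that user behavior raises a photo's rating.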

Automatic Image Enhancement
The Photos app on iOS, iPadOS, and macOS offers countless editing tools. However, the "Auto-Enhance" button is often all that's needed to achieve a good result. This is another example of an algorithm used by the Photos app – one trained on millions of photos to optimize lighting and color, highlight image content through darkening and lightening, remove image noise, and adjust color temperature. Recognition of image content is of great importance to this process, not just for adjusting the histogram but also for taking the desired subject into account – the goal being to avoid applying the same adjustments to a human face as to, say, water in the background of an image.
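The histogram side of such an enhancement can be illustrated with a naive contrast stretch. The sketch below operates on a flat list of grayscale values and, unlike the real feature, is not content-aware at all:

```python
def auto_contrast(pixels, lo_pct=0.05, hi_pct=0.95):
    """Naive auto-enhance: stretch the histogram so the 5th- and
    95th-percentile values map to 0 and 255.

    Real enhancement is content-aware (faces vs. background);
    this toy version treats every pixel the same.
    """
    values = sorted(pixels)
    lo = values[int(lo_pct * (len(values) - 1))]
    hi = values[int(hi_pct * (len(values) - 1))]
    if hi == lo:
        return pixels[:]  # flat image: nothing to stretch
    return [min(255, max(0, round((p - lo) * 255 / (hi - lo))))
            for p in pixels]

# A dull, low-contrast strip of pixels gets spread over the full range
print(auto_contrast([50, 60, 100, 150, 200]))
```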

One Photo, Many Levels
It gets even more complicated when functions such as Portrait mode are taken into account. Photos taken in Portrait mode can't simply be analyzed as a whole; they have to be broken up into "semantic segments". For example, the photo is separated into background, the outlines of people, faces, mouths, hair, and teeth so that a proper analysis can take place. Thanks to the computing power of the Neural Engine, this happens mostly in real time – even when certain parts of the image are concealed, hair or beards obscure people's faces, or light conditions are poor.
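Once an image is split into semantic segments, each region can receive its own adjustments. A toy sketch, representing the segmentation as a per-pixel label mask – the region names and brightness offsets here are invented:

```python
def enhance_by_region(pixels, mask, adjustments):
    """Apply a per-region brightness offset using a segmentation mask.

    pixels: flat list of grayscale values.
    mask: same-length list naming each pixel's semantic segment
          (e.g. 'face', 'background') – invented labels.
    adjustments: dict mapping segment name -> brightness offset.
    """
    return [min(255, max(0, p + adjustments.get(m, 0)))
            for p, m in zip(pixels, mask)]

# Brighten the face slightly, darken the background
print(enhance_by_region(
    [100, 200],
    ["face", "background"],
    {"face": 20, "background": -30},
))  # [120, 170]
```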

Everything with Privacy In Mind
Apple also addresses data protection and privacy, which play a role throughout the analysis process. While many photo services perform server-side image analysis, which inevitably means photos leave the user's device, the Photos app performs these functions locally. The knowledge graph of the user's preferences and interests remains on the local device and isn't present anywhere else. This data only reaches a server when a user shares images with others, stores them in their iCloud Photo Library, or uses iCloud Backup. Disclosure of the user's information to third parties is not permitted.
