Wednesday, Dec 15, 2021, 16:57 Software

Apple Decides Against Implementing Controversial Photo Scan Feature

In August, Apple caused quite the stir when it announced a newly planned feature. The goal was to mitigate the spread of illegal pornographic material, particularly child pornography. The exact implementation would have abided by specific privacy rules. Unlike most cloud service providers, Apple hadn't planned on scanning all of its users' cloud data – rather only local hashes before upload. Cupertino chose this route because it would have offered a higher degree of privacy protection than other methods. The company has also stressed, for example, how other features rely on processing information directly on the device instead of in the cloud. In the case of Apple's anti-CSAM (child sex abuse material) measures, the announcement left a bad taste in the mouths of many users. The general public understood the announced measures as follows: "Apple is going to scan and evaluate all of my future photos!" Others questioned whether it's the job of a private company to take on such an assignment at all.

Apple Backtracks On Its Announcement
It took about a month until Apple announced a change of course after a string of protests last summer. Cupertino added that it needed more time to introduce the feature and would consider the plethora of feedback from concerned users, researchers, and other interested parties regarding the planned feature. Since then, there's been complete silence about the anti-CSAM measure. Apple hasn't commented on the measure again, nor have any recent iOS updates or betas shown any sign of the feature. There was, however, a change with the release of iOS 15.2: Apple removed the passage describing the measure from its child safety website, signaling that it would no longer be seeing implementation.

The Hash Function: Still Present In iOS 15.2
The "NeuralHash" function that Apple introduced with iOS 14.3 and macOS 11.4 is still present within the system files. The function compares local hashes with hashes from known illegal photographic material. Still, photos themselves aren't scanned or compared – which is exactly what Apple emphasized when it underlined the potential feature's abidance by the company's longstanding, advertised philosophy of consumer privacy and data protection. The controversial feature would have only taken action when multiple algorithms recognized more than 30 photos stamped for upload to iCloud.

Plans Presumably Cancelled
Although it's still unclear whether Apple has fully cancelled the plans, it seems increasingly unlikely that the company will implement the feature at all. Even less likely is the possibility that Cupertino would "secretly" introduce the measure. The fact that Apple has removed the passage stating its intentions from its website should be proof enough that the company has had time to review and rethink its goals in the matter. If Cupertino had any other promising ideas for preventing the spread of CSAM via iCloud, it would likely have done more than simply delete its stated intentions from its own website. Should Apple still intend to prevent the spread of such material via iCloud, it's starting to look as if only the ironically less privacy-protective methods of Google, Microsoft, and other cloud service providers remain open to the company. That would mean no local scanning at all; the entire process would take place in iCloud.
