Monday, Sep 06, 2021, 15:17 Software

A Change Of Plans: Apple Postpones Local Scanning Of iCloud Photo Libraries For Illegal Content

Apple clearly envisioned the situation developing differently. The recently proposed program – which would have involved locally scanning Apple customers' iCloud photo libraries for illegal child pornography in hopes of preventing its spread – certainly held a noble goal in sight. However, the manner in which Apple introduced and communicated the program left much to be desired, and the company ignored, or refrained from addressing, the program's further implications. For example: if Apple decides to scan iCloud photo libraries for child pornography today, what's to keep the company from deciding to scan more than just iCloud photo libraries tomorrow, or from happening upon entirely private pictures?

Hardly anyone disputes that the spread of CSAM (Child Sexual Abuse Material) should be prevented. The concerns centred instead on the manner in which Cupertino intended to do so. In order to protect users' privacy and data, Apple had planned to perform the scans locally on devices as opposed to on its servers... and it was exactly this decision that drew a great deal of scepticism and controversy concerning the program.

Enormous Implications
Firstly, there's the question of whether Apple should even have access to any of its users' local data, and secondly, whether granting Apple such access is simply a slippery slope towards even more invasive endeavours. Historical invasions of privacy – particularly under fascist regimes (not to imply that Apple is one such, but this is often the source of the argument and concern) – frequently started with rather benign measures intended for "the good of all". Although Apple maintained that the proposed program would only scan devices for known child pornography, critics remained sceptical. Totalitarian states could also adapt the local scans to further persecute human rights organizations and opposition parties, as has happened in the past when features of iOS were abused by such regimes.

A Change Of Plan: No Immediate Introduction
After a rather vocal protest on the part of many customers and an equally vocal defence of the program on the part of Apple's press representatives, the company has now walked back its initial announcement concerning the introduction of local device scanning. According to an official statement, the planned photo scanning won't begin in September as originally intended. Apple has expressed that it needs more time to develop the feature, and that the response from customers, engineers, and shareholders has shown Cupertino that it must think more thoroughly about how best to protect both children and the privacy of its users.

Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.

As a result of the entire situation, the photo scan definitely won't be introduced alongside iOS 15. When and in what form the program will eventually be implemented remains to be seen. Other cloud-hosting platforms already scan content for known child pornography, albeit only upon upload to their servers and not locally on devices. It's likely that Apple will settle on something similar in the end. Even though, in the name of data privacy, it's normally preferable for scans to be executed locally rather than on a server, local photo scanning raises its own questions and concerns. It will be interesting to see what announcement Apple makes in the coming weeks or months concerning this matter.
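For illustration: "scanning for known material" generally means comparing a fingerprint of each photo against a database of fingerprints of already-identified abuse imagery, rather than analysing the photo's content directly. The following is a minimal sketch of that idea, with an important caveat – the function names and the use of a plain cryptographic hash are illustrative assumptions. Production systems such as Microsoft's PhotoDNA or Apple's proposed NeuralHash use perceptual hashes that tolerate resizing and re-encoding, which an exact cryptographic hash does not.

```python
import hashlib


def file_fingerprint(path: str) -> str:
    """Hypothetical fingerprint: a SHA-256 digest of the file's bytes.

    Real matching systems use *perceptual* hashes (e.g. PhotoDNA,
    NeuralHash) so that trivially altered copies still match; an exact
    cryptographic hash is used here only to keep the sketch simple.
    """
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()


def matches_known_database(path: str, known_fingerprints: set) -> bool:
    """Report whether a file's fingerprint appears in a known-bad set."""
    return file_fingerprint(path) in known_fingerprints
```

The server-side variant runs this comparison on files after upload; Apple's proposal would have performed an equivalent comparison on the device itself, before upload – which is precisely the design choice that drew the criticism described above.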
