Apple scrubs references to controversial Child Safety measures
All references to the policy are now gone…
What you need to know
Earlier this year, Apple was forced to delay its controversial Child Safety measures following widespread pushback.
The company had planned to use technology to scan iCloud Photo libraries for images of Child Sexual Abuse Material.
Apple has now removed all references to the policy on its Child Safety page.
Apple appears to have scrubbed any reference to its controversial Child Safety plans to scan iCloud Photo libraries for Child Sexual Abuse Material.
As noted by MacRumors, until December 10, Apple's main Child Safety page featured a full explanation and breakdown of its controversial CSAM scanning plans. From Apple:
To help address this, new technology in iOS and iPadOS* will allow Apple to detect known CSAM images stored in iCloud Photos. This will enable Apple to report these instances to the National Center for Missing and Exploited Children (NCMEC). NCMEC acts as a comprehensive reporting center for CSAM and works in collaboration with law enforcement agencies across the United States.
The system would have matched images against a database of hashes of known CSAM material, rather than viewing or scanning the content of users' photos directly. Even so, it was met with widespread user outrage and pushback from global figures including Edward Snowden, as well as numerous security and privacy experts.
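For readers curious about the general idea, hash-database matching can be sketched in a few lines. This is only an illustration of the concept, not Apple's actual system: Apple's design used a perceptual "NeuralHash" with on-device blinded matching, whereas the hypothetical sketch below uses an ordinary cryptographic hash and a plain set lookup.

```python
import hashlib

# Hypothetical database of hex digests of known flagged files.
# (This entry is simply the SHA-256 of b"foo", for demonstration.)
known_hashes = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def hash_bytes(data: bytes) -> str:
    """Compute the digest used as the database lookup key."""
    return hashlib.sha256(data).hexdigest()

def matches_database(data: bytes, database: set) -> bool:
    """Report whether the data's hash appears in the known-hash database.

    The content itself is never compared or inspected; only its
    fixed-length digest is checked against the set.
    """
    return hash_bytes(data) in database

print(matches_database(b"foo", known_hashes))  # True
print(matches_database(b"bar", known_hashes))  # False
```

The key property the sketch demonstrates is that matching operates on digests, not on the underlying content; a real perceptual-hash system additionally tolerates small image transformations, which a cryptographic hash like SHA-256 deliberately does not.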
The website also included the following statement from September 3:
Update as of September 3, 2021: Previously we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.
As of December 13, Apple's Child Safety page references only Communication Safety in Messages and expanded guidance for Siri, Spotlight, and Safari Search, the former having debuted in iOS 15.2 earlier this week.
At the time of the delay, Apple said in a statement: "…based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."