
Apple's plan to scan iPhone photos for child abuse material is dead

Apple's proposed CSAM detection feature



Apple's controversial plan to detect child sexual abuse material by scanning iPhone photos has been abandoned, but the company has other plans to stop the problem at its source.

In late 2021, Apple announced two initiatives intended to protect children from abuse. The first, which is already in place today, warns minors before they send or receive photos containing nudity. It relies on on-device algorithmic nudity detection and warns only the child; parents are not notified.
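As a rough illustration of that decision flow, here is a minimal Swift sketch. Apple has not published its implementation, so the classifier protocol, types, and threshold below are assumptions made purely for illustration, not Apple's actual API.

```swift
import Foundation

// Hypothetical sketch of the on-device flow described above.
// The classifier, types, and threshold are illustrative assumptions.

protocol NudityClassifier {
    /// Returns a confidence score in 0...1 that the image contains nudity.
    func nudityScore(for imageData: Data) -> Double
}

struct IncomingImage {
    let data: Data
    let recipientIsMinor: Bool
}

enum SafetyAction {
    case deliverNormally
    case blurAndWarnChild   // the child sees a warning; parents are not notified
}

struct CommunicationSafetyPolicy {
    let classifier: any NudityClassifier
    let threshold: Double = 0.8   // assumed cutoff, not Apple's actual value

    func action(for image: IncomingImage) -> SafetyAction {
        // All analysis happens on the device; nothing is sent to Apple.
        guard image.recipientIsMinor else { return .deliverNormally }
        let score = classifier.nudityScore(for: image.data)
        return score >= threshold ? .blurAndWarnChild : .deliverNormally
    }
}
```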

The second, far more controversial feature would have analyzed photos being uploaded to iCloud for known CSAM. The analysis was to be performed locally, on the user's iPhone, using a hashing system.
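The feature never shipped, but the general shape of on-device hash matching can be sketched in Swift. Apple's proposal used a perceptual hash (NeuralHash) combined with a blinded private set intersection protocol and server-side thresholds; the sketch below substitutes a plain SHA-256 digest and a local set lookup, so it illustrates the concept rather than Apple's protocol.

```swift
import CryptoKit
import Foundation

// Simplified illustration of on-device hash matching. The known-hash set,
// hex encoding, and use of SHA-256 are stand-ins for Apple's actual design.

struct OnDeviceMatcher {
    /// Hex-encoded digests of known images, assumed here to ship with the OS.
    let knownHashes: Set<String>

    func digest(of photoData: Data) -> String {
        SHA256.hash(data: photoData)
            .map { String(format: "%02x", $0) }
            .joined()
    }

    /// Checks a photo that is about to be uploaded to iCloud.
    func matchesKnownContent(_ photoData: Data) -> Bool {
        knownHashes.contains(digest(of: photoData))
    }
}
```

A cryptographic digest like SHA-256 only matches byte-identical files, which is one reason the real proposal relied on a perceptual hash that tolerates resizing and recompression.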

Following backlash from privacy experts, child safety groups, and governments, Apple put the feature on hold indefinitely for review. On Wednesday, the company issued a statement to AppleInsider and other outlets confirming that it has retired the feature entirely.

“After extensive consultation with experts to gather feedback on child protection initiatives we proposed last year, we are deepening our investment in the Communication Safety feature that we first made available in December 2021.”

“We have further decided to not move forward with our previously proposed CSAM detection tool for iCloud Photos. Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all.”

The announcement comes alongside Apple's reveal of new features that will end-to-end encrypt even more iCloud data, including Messages backups and photos. These enhanced protections make the server-side flagging system that was a core part of Apple's CSAM detection feature impossible.
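To see why, consider a minimal sketch of client-side encryption: once a photo is sealed on the device with a key the server never holds, the server stores only opaque ciphertext and has nothing to hash or flag. The code below uses CryptoKit's AES-GCM purely as a stand-in; key management, upload handling, and Apple's actual iCloud protocol are all omitted.

```swift
import CryptoKit
import Foundation

// Illustrative only: not Apple's iCloud encryption scheme.
struct ClientSideEncryptor {
    let key: SymmetricKey   // held only on the user's devices

    func encryptForUpload(_ photoData: Data) throws -> Data {
        // AES-GCM seals the plaintext; the server receives only this blob.
        let sealed = try AES.GCM.seal(photoData, using: key)
        // .combined is nonce || ciphertext || tag; non-nil for the default nonce size.
        return sealed.combined!
    }
}
```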

Using a different approach

Amazon, Google, Microsoft, and others scan for this material on the server side, as required by law, but end-to-end encryption will prevent Apple from doing so.

Instead, Apple hopes to address the problem at its source: the creation and distribution of the material. Rather than targeting those who store content on cloud servers, the company wants to educate users and prevent the content from being created and shared in the first place.

Apple provided Wired with more detail about this initiative. Although there is no timeline for the features, work will start with expanding the Communication Safety feature's algorithmic nudity detection to video. Apple then plans to extend the protection to its other communication tools, and eventually give third-party developers access.

“Potential child exploitation can be interrupted before it happens by providing opt-in tools for parents to help protect their children from unsafe communications,” Apple said in its statement. “Apple is dedicated to developing innovative privacy-preserving solutions to combat child sexual abuse material and protect children, while addressing the unique privacy needs of personal communications and data storage.”

Other protections built into Siri, Safari, and Spotlight detect when users search for CSAM and redirect those searches to resources that offer help.

Features that educate users while preserving their privacy have long been Apple's goal. All of its existing child safety implementations aim to inform, and Apple never learns when a safety feature is triggered.
