
Apple scraps controversial plans to find known CSAM in iCloud Photos

Joe Rossignol

In addition to providing end-to-end encryption for iCloud Photos, Apple today announced that it has abandoned its controversial plans to detect known child sexual abuse material (CSAM) stored in iCloud Photos, according to a statement shared with WIRED.


Apple's full statement:

After extensive consultation with experts to gather feedback on child protection initiatives we proposed last year, we are deepening our investment in the Communication Safety feature that we first made available in December 2021. We have also decided to not move forward with our previously proposed CSAM detection tool for iCloud Photos. Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all.

In August 2021, Apple announced plans for three new child safety features: a system to detect known CSAM images stored in iCloud Photos, a Communication Safety option that blurs sexually explicit photos in the Messages app, and expanded child safety resources for Siri. Communication Safety launched in the US with iOS 15.2 in December 2021 and has since expanded to the UK, Canada, Australia, and New Zealand. The Siri resources are also available, but CSAM detection never launched.

Apple originally said CSAM detection would be implemented in an update to iOS 15 and iPadOS 15 by the end of 2021, but the company ultimately postponed the feature based on "feedback from customers, advocacy groups, researchers, and others." Now, after a year of silence, Apple has abandoned its CSAM detection plans entirely.

Apple said its CSAM detection system was "designed with user privacy in mind." The system would have performed "on-device matching using a database of known CSAM image hashes" from child safety organizations, which Apple would transform into "an unreadable set of hashes that is securely stored on users' devices."

Apple planned to report iCloud accounts with known CSAM image hashes to the National Center for Missing and Exploited Children (NCMEC), a non-profit organization that works in collaboration with US law enforcement agencies. Apple said there would be a "threshold" ensuring "less than a one in one trillion chance per year" of an account being incorrectly flagged by the system, along with a manual review of flagged accounts by a human.
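
To make the mechanism described above concrete, here is a minimal Swift sketch of the general idea of on-device hash matching combined with a reporting threshold. It is only an illustration of the concept, not Apple's implementation: the real design relied on a perceptual NeuralHash, a blinded hash database, and a private set intersection protocol, none of which are modeled here. The KnownHashDatabase type, placeholderHash function, and threshold parameter are all hypothetical.

import Foundation
import CryptoKit

// Hypothetical stand-in for the "unreadable" on-device database of known-image hashes.
struct KnownHashDatabase {
    let hashes: Set<String>  // hex-encoded hashes of known images (placeholder)
}

// Placeholder hash. Apple's proposal used a perceptual "NeuralHash" that tolerates
// resizing and recompression; SHA-256 is used here only to keep the sketch runnable.
func placeholderHash(of imageData: Data) -> String {
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

// On-device matching: count how many photos match the database, and only flag
// the account for human review once a match threshold is crossed. The threshold
// value is hypothetical; Apple only described the one-in-one-trillion target.
func shouldFlagForReview(photos: [Data],
                         database: KnownHashDatabase,
                         threshold: Int) -> Bool {
    let matchCount = photos
        .map { placeholderHash(of: $0) }
        .filter { database.hashes.contains($0) }
        .count
    return matchCount >= threshold
}

In Apple's published proposal, the match results were additionally hidden inside cryptographic "safety vouchers," so individual matches could not be inspected until an account crossed the threshold, a protection this simplified sketch does not attempt to model.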

Apple's plans were criticized by a wide range of individuals and organizations, including security researchers, the Electronic Frontier Foundation (EFF), politicians, policy groups, university researchers, and even some Apple employees.

Some critics argued that the feature would have created a "backdoor" into devices that governments or law enforcement agencies could exploit to surveil users. Another concern was false positives, including the possibility that someone could deliberately add CSAM imagery to another person's iCloud account to get that account flagged.

Note: Due to the political or social nature of the discussion regarding this topic, the discussion thread is located in our Political News forum. All forum members and site visitors are welcome to read and follow the thread, but posting is limited to forum members with at least 100 posts.
