Apple criticized for not doing enough to prevent the spread of CSAM

iMessage can warn minors about nude photos



Apple and Microsoft have provided details on their methods for detecting or preventing the dissemination of child sexual abuse material, and the Australian regulator has found their efforts to be insufficient.

Australia's eSafety Commissioner has demanded that major tech companies, including Apple, Facebook, Snapchat, and Microsoft, detail their practices for preventing child abuse and exploitation on their platforms. The demand was issued on August 30, and the companies had 29 days to comply or face fines.

Apple and Microsoft were the first companies to be scrutinized as part of this review, and according to Reuters, the Australian regulator deemed their efforts insufficient. Neither company proactively scans user files in iCloud or OneDrive for CSAM, and neither uses algorithmic detection in FaceTime or Skype.

Commissioner Julie Inman Grant called the companies' responses "alarming," pointing to what she described as "clearly inadequate and inconsistent use of widely available technology to detect" child abuse material.

Apple recently announced that it has dropped its plan to scan photos uploaded to iCloud for CSAM. That approach would have become increasingly ineffective now that users have access to fully end-to-end encrypted photo storage and backups.

Instead of scanning existing content, whether stored or shared, Apple has opted for a different approach that will evolve over time. Currently, parents can configure a child's device to warn the child if nudity is detected in photos sent via iMessage.

Apple plans to extend this capability to detect such material in video, and then to bring the detection and alerting system to FaceTime and other Apple apps. Eventually, the company hopes to offer an API that would let third-party developers use the detection system in their own applications.

Apple's decision to abandon CSAM detection in iCloud has been welcomed by privacy advocates. At the same time, it has been condemned by child protection groups, law enforcement, and government officials.
