Apple is taking more time to refine its system for scanning U.S. iPhones, Macs and Apple Watches for images of child sexual abuse, and has delayed the system's release, which was to be implemented through updates to Apple devices, the Associated Press reported.
The tech company announced the plan last month, saying on its website that the tool is intended to "limit the spread of Child Sexual Abuse Material (CSAM)." The tool would scan for and detect images of child sexual abuse before they are uploaded to iCloud.
"Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features," Apple said in a statement Friday at the top of its webpage that outlines the child safety plans.
Apple also seeks to use a separate tool to detect sexually explicit content in encrypted messages.
Jennifer Granick, a lawyer for the American Civil Liberties Union (ACLU), called the delay "a success for civil liberties groups' advocacy."
"It's great that Apple plans to engage with independent privacy and security experts before announcing their genius plans. They should start with end to end encryption for iCloud backups," Granick wrote on Twitter.
For more reporting from the Associated Press, see below.

Apple had said in its initial announcement that the changes would roll out this year as part of updates to its operating software for iPhones, Macs and Apple Watches.
Matthew Green, a top cryptography researcher at Johns Hopkins University, warned in August that the system could be used to frame innocent people by sending them seemingly innocuous images designed to trigger matches for child pornography. That could fool Apple's algorithm and alert law enforcement.
Not long after Green and privacy advocates sounded warnings, a developer claimed to have found a way to reverse-engineer the matching tool, which works by recognizing the mathematical "fingerprints" that represent an image.
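The "fingerprints" described here are perceptual hashes: compact codes that stay similar when an image is resized or lightly edited, so two copies of the same picture can be matched without comparing the pixels directly. The sketch below is a minimal illustration of that general idea using a simple average hash and a Hamming-distance comparison; it is not Apple's NeuralHash, and the Pillow library, function names, and match threshold are all assumptions made for illustration.

```python
# Toy perceptual-hash matching sketch (illustrative only, NOT Apple's NeuralHash).
# Assumes the Pillow imaging library is installed; all names are hypothetical.
from PIL import Image


def average_hash(path: str, size: int = 8) -> int:
    """Shrink and grayscale the image, then set one bit per pixel above the mean brightness."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Count the bits that differ between two fingerprints."""
    return bin(a ^ b).count("1")


def matches(candidate: int, known_fingerprints: set[int], threshold: int = 5) -> bool:
    """Flag a candidate image if its fingerprint is within `threshold` bits of any known one."""
    return any(hamming_distance(candidate, known) <= threshold for known in known_fingerprints)
```

Because matching tolerates small differences rather than requiring exact copies, the concern Green and others raised follows naturally: if an attacker can craft an innocuous-looking image whose fingerprint lands close enough to one on the blocklist, the system can be made to flag content that is not actually abusive.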
Green said Friday that Apple's delay was the right move and suggested that the company talk to technical and policy communities and the general public before making such a big change that threatens the privacy of everyone's photo library.
"You need to build support before you launch something like this," he said in an interview. "This was a big escalation from scanning almost nothing to scanning private files."
Green said Apple might have been blindsided by the widespread pushback to a policy aimed at child safety because it was so secretive in developing the new technique, treating it as it would the launch of a new consumer product.
