
Apple delays plans to scan devices for child sexual abuse images

Apple was due to roll out iCloud scanning technology that reports known child sexual abuse material to law enforcement. Loic Venance / AFP - Getty Images

Apple has delayed plans to scan iPhones and iPads to look for collections of child sexual abuse images after backlash from customers and privacy advocates, the tech giant said Friday.

The company last month announced features aimed at flagging child sexual abuse images that users store on its iCloud servers. Apple did not say how long it would delay the program.

"Previously we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of Child Sexual Abuse Material," according to a company statement.

"Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."

The system was built to look for images that match those from libraries assembled by law enforcement to find and track the dissemination of child abuse material on the internet.
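In broad strokes, such systems compare a compact fingerprint of each uploaded image against a database of fingerprints of already-identified abuse material, flagging an account only after a threshold number of matches (reportedly about 30 in Apple's case, as Farid notes below). The following Python sketch is a minimal illustration of that idea, not Apple's actual design: the names KNOWN_HASHES, fingerprint, and MATCH_THRESHOLD are hypothetical, and an ordinary cryptographic hash stands in for the perceptual hashes (such as Microsoft's PhotoDNA or Apple's NeuralHash) that real systems use so matching survives resizing and re-encoding.

import hashlib
from pathlib import Path

# Hypothetical database of fingerprints of known abuse images. Real
# systems hold millions of entries supplied by child-safety groups.
KNOWN_HASHES = {
    "d2a84f4b8b650937ec8f73cd8be2c74add5a911ba64df27458ed8229da804a26",
}

# Apple's announced design reportedly flagged an account only after
# about 30 matches; this threshold mirrors that idea.
MATCH_THRESHOLD = 30

def fingerprint(path: Path) -> str:
    # A cryptographic hash stands in here for a perceptual hash, which
    # in real systems tolerates resizing and re-encoding of the image.
    return hashlib.sha256(path.read_bytes()).hexdigest()

def count_matches(image_paths: list[Path]) -> int:
    # Count how many of the user's images match the known database.
    return sum(1 for p in image_paths if fingerprint(p) in KNOWN_HASHES)

def should_flag(image_paths: list[Path]) -> bool:
    # Escalate for human review only once matches cross the threshold.
    return count_matches(image_paths) >= MATCH_THRESHOLD

Apple's announced design additionally performed the comparison on the device itself, before upload, a choice that was central to the ensuing privacy debate.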

Some child safety advocates were disappointed by Apple's announcement.

“We absolutely value privacy and want to avoid mass surveillance in any form from government, but to fail children and fail the survivors of child sexual abuse by saying we’re not going to look for known rape videos and images of children because of some extreme future that may never happen just seems wholly wrong to me,” said Glen Pounder, chief operating officer of Child Rescue Coalition, a nonprofit that develops software to help law enforcement identify people downloading child sexual abuse material.

And Nelson O. Bunn Jr., executive director of the National District Attorneys Association, lashed out at privacy advocates, who he said failed "to articulate how the protection of children and the prosecution of offenders is unable to be balanced with the privacy concerns of Apple's customers."

"Prosecutors and law enforcement continue to face significant hurdles in the fight to end the abuse and exploitation of our society's most vulnerable victims," Bunn added.

But privacy advocates, who feared Apple’s plans could open innocent tech users to needless inspection, celebrated Friday’s announcement.

“I'm very happy to hear that Apple is delaying their CSAM technology rollout. It is half-baked, full of dangerous vulnerabilities, and breaks encryption for everyone,” tweeted Eran Toch, a professor in the Department of Industrial Engineering at Tel-Aviv University who specializes in privacy issues.

“Child safety deserves a solution which is more thoughtful and more impactful.”

Apple's effort seemed doomed from the start; tech experts also said the plan wasn't focused enough.

"I'm not OK with it (Apple's backtrack) but I'm not surprised, Apple bungled the release and the narrative got away from them very quickly," University of California, Berkeley computer science professor Hany Farid said.

Farid, who in 2009 developed the Microsoft tool PhotoDNA to help police and tech companies find and remove known images of child sexual exploitation, was particularly critical of Apple's plan to focus on images rather than video, which accounts for a majority of this abuse material.

"They're 10 years late to the game, they were solving at best less than half the problem and you can circumvent the technology very quick" by just not storing 30 or more illegal images on iCloud, Farid said.

CORRECTION (Sept. 3, 2021, 2:26 p.m. ET): A previous version of this article misspelled the first name of the chief operating officer of Child Rescue Coalition. He is Glen Pounder, not Glenn.