BERKELEY, Calif. -- Apple is indefinitely delaying plans to scan iPhones in the U.S. for images of child sexual abuse following an outcry from security and privacy experts who warned the technology could be exploited for other surveillance purposes by hackers and intrusive governments.
The postponement announced Friday comes a month after Apple revealed it was getting ready to roll out a tool to detect known images of child sexual abuse by scanning files before they are uploaded to Apple's iCloud backup storage system. The company had also planned to introduce a separate tool to scan users’ encrypted messages for sexually explicit content.
Apple insisted its technology had been developed in a way that would protect the privacy of iPhone owners in the U.S. But the Cupertino, California, company was swamped with criticism from security experts, human rights groups and customers worried that the scanning technology would open a peephole exposing personal and sensitive information.
“Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” Apple said in an update posted above its original photo-scanning plans.
Apple never set a specific date for when the scanning technology would roll out, beyond saying it would occur sometime this year. The company is expected to unveil its next iPhone later this month, but it's unclear whether it will use that event to further discuss its change in plans for scanning the devices in the U.S.
The photo scanning technology was “a really big about-face for Apple,” said Cindy Cohn, executive director of the Electronic Frontier Foundation, one of the most vocal critics of the company's plans. “If you are going to take a stand for people's privacy, you can't be scanning their phones.”
Cohn applauded Apple for taking more time to reassess its plans and urged the company to talk to a broader range of experts than it apparently did while drawing up its scanning blueprint in its typically secretive fashion.
Matthew Green, a top cryptography researcher at Johns Hopkins University and another outspoken critic of Apple, also supported the delay. He suggested the company talk to technical and policy communities and the general public before making such a big change that threatens the privacy of everyone’s photo library.
“You need to build support before you launch something like this,” Green said. “This was a big escalation from scanning almost nothing to scanning private files.”
When Apple announced the scanning technology last month, Green warned that the system could be used to frame innocent people by sending them seemingly innocuous images designed to trigger matches for child pornography. That could fool Apple’s algorithm and alert law enforcement.
Not long after Green and privacy advocates sounded warnings, a developer claimed to have found a way to reverse-engineer the matching tool, which works by recognizing the mathematical “fingerprints” that represent an image.
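The “fingerprints” in question are perceptual hashes: short bit patterns derived from an image's content, designed so that nearly identical images produce nearly identical fingerprints. As an illustration only — this simple “average hash” is not Apple's actual algorithm, and the pixel data below is made up — the matching idea can be sketched like this:

```python
# Illustrative sketch of perceptual-hash matching. This is a toy
# "average hash", NOT Apple's actual system: each bit of the
# fingerprint records whether a pixel is brighter than the image's
# average, and two fingerprints are compared by counting differing bits.

def average_hash(pixels):
    """Fingerprint a grayscale image (flat list of 0-255 values)."""
    mean = sum(pixels) / len(pixels)
    return ''.join('1' if p > mean else '0' for p in pixels)

def hamming_distance(h1, h2):
    """Number of bit positions where two equal-length fingerprints differ."""
    return sum(b1 != b2 for b1, b2 in zip(h1, h2))

# Hypothetical 4x4 "images": an original, a lightly edited copy,
# and an unrelated image.
original = [10, 200, 30, 220, 15, 210, 25, 230,
            12, 205, 28, 215, 18, 208, 22, 225]
slightly_edited = [p + 3 for p in original]   # small brightness shift
unrelated = [128, 5, 240, 60, 90, 170, 33, 201,
             77, 14, 250, 120, 66, 180, 44, 99]

h_orig = average_hash(original)
h_edit = average_hash(slightly_edited)
h_other = average_hash(unrelated)

# A matching system declares a hit when the distance falls below a
# small threshold: the edited copy still matches, the unrelated
# image does not.
print(hamming_distance(h_orig, h_edit))   # prints 0
print(hamming_distance(h_orig, h_other))  # prints 7
```

Because matching is by similarity rather than exact equality, an image crafted to land within the threshold of a flagged fingerprint could trigger a false match — which is the attack scenario Green warned about, and why reverse-engineering the real matching tool raised alarm.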
Apple traditionally has rejected government demands for data and access to devices that it believes are fishing expeditions or risk compromising the security of its customers or devices.
In a highly publicized act of defiance, Apple resisted an FBI demand in 2016 that the company crack the code protecting an iPhone used by one of the killers during a mass shooting in San Bernardino, California. It argued at the time that it would be opening a digital backdoor that could be exploited by hackers and other unauthorized parties to break into devices. In that instance, Apple was widely praised by civil rights and privacy groups.
O'Brien reported from Providence, Rhode Island. AP Business Writer Kelvin Chan contributed to this story from London.