Apple Inc said on Monday that all iPhone photos will be checked against known child sexual exploitation images if they are stored in its iCloud online service.
The disclosure comes as part of a series of media briefings through which Apple seeks to allay concern sparked by its announcement last week that it will scan users’ phones, personal computers and tablets for millions of illegal images.
While Google, Microsoft and other technology platforms scan uploaded photos or email attachments by matching them against databases maintained by the National Center for Missing and Exploited Children and other groups that specialize in such material, cybersecurity experts fault Apple’s plan as more intrusive.
Some even speculated that governments would seek to force Apple to expand the system to scan devices for other materials.
Apple said in a post on its website on Sunday that it would fight any such attempts, though demands of that kind could come through secret court orders, making them difficult to detect.
“We have faced demands from governments in the past to make changes that degrade users’ privacy, and we have steadfastly refused those demands… We will continue to refuse them in the future,” Apple wrote.
In a briefing on Monday, Apple officials said that the company’s system, which will be introduced this fall with the release of the iOS 15 operating system, will check images already on a user’s device only when those images are uploaded to the company’s storage servers.
Apple has faced international criticism for reporting fewer violations than other providers of similar services. Some European jurisdictions are considering legislation that would hold technology platforms more accountable for the spread of such material.
Apple executives said Monday that scanning on the device preserves privacy better than scanning iCloud directly. Among other things, the system is designed so that Apple learns nothing about a user’s content unless the number of matching images crosses a certain threshold, at which point the company authorizes a review.
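The threshold gate described above can be sketched as follows. This is purely illustrative: all names here are hypothetical, and the threshold value is an assumption. Apple’s actual design reportedly uses threshold secret sharing, so that match records are cryptographically unreadable below the threshold; this sketch models only the counting logic, not the cryptography.

```python
THRESHOLD = 30  # assumed value, for illustration only


class MatchVoucherStore:
    """Hypothetical accumulator of opaque 'match vouchers', one per matched upload."""

    def __init__(self):
        self._vouchers = []

    def add_voucher(self, voucher: bytes) -> None:
        self._vouchers.append(voucher)

    def reviewable(self):
        """Reveal vouchers only once the threshold is crossed; otherwise nothing."""
        if len(self._vouchers) >= THRESHOLD:
            return list(self._vouchers)
        return None


store = MatchVoucherStore()
for i in range(29):
    store.add_voucher(f"voucher-{i}".encode())
print(store.reviewable())  # None: below threshold, nothing is revealed

store.add_voucher(b"voucher-29")
print(store.reviewable() is None)  # False: threshold reached, review permitted
```

The point of the design is that a handful of false-positive matches reveals nothing; only a sustained pattern of matches unlocks anything for human review.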
The executives acknowledged that a user could be framed by hackers planting child sexual exploitation images on their device, but said they expected such attacks to be extremely rare and that any resulting review would also look for other signs of hacking.
* Concerns about privacy
The company announced last week that if it detects such images being uploaded to iCloud, it will review them, report the user to law enforcement officials and suspend their account.
Apple’s new system responds to requests from law enforcement authorities to help stop child sexual abuse.
“With so many people using Apple products, these new safety measures have lifesaving potential for children who are being enticed and sexually exploited online… Privacy and child protection can indeed coexist,” John Clark, chief executive of the National Center for Missing and Exploited Children, said in a statement.
Apple will store a database of digital codes derived from known child sexual abuse images on iPhones. When a user uploads an image to iCloud, the phone generates a digital code for the image and compares it against that database.
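The matching step can be sketched in a few lines. Note the major simplification: Apple’s system reportedly uses a perceptual hash (“NeuralHash”) that also matches visually similar images, while the cryptographic hash below only matches byte-identical files, so this is purely illustrative and all names are hypothetical.

```python
import hashlib

# Hypothetical on-device database of codes for known images (stand-in data).
KNOWN_CODES = {
    hashlib.sha256(b"known-image-bytes").hexdigest(),
}


def digital_code(image_bytes: bytes) -> str:
    """Derive a fixed-length code from the image's raw bytes."""
    return hashlib.sha256(image_bytes).hexdigest()


def matches_database(image_bytes: bytes) -> bool:
    """Check the image's code against the on-device database of known codes."""
    return digital_code(image_bytes) in KNOWN_CODES


print(matches_database(b"known-image-bytes"))  # True: code is in the database
print(matches_database(b"vacation-photo"))     # False: unknown image
```

Because only fixed-length codes are compared, the database on the phone never contains the images themselves, and the comparison reveals nothing about non-matching photos.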
Accounts found holding such images may be suspended. But Apple said users who believe their accounts were suspended without valid reason can appeal the decision and have them reinstated.
However, privacy advocates said the system could open the door to monitoring political speech and other forms of content on iPhones.
“Apple has sent a very clear signal. In its view, it is safe to build systems that scan users’ phones for prohibited content,” Matthew Green, an information security researcher at Johns Hopkins University, warned on Twitter, adding that governments could demand scans for other content they deem inappropriate.
Other privacy researchers, such as India McKinney and Erica Portnoy of the Electronic Frontier Foundation, wrote in a blog post that it may be impossible for independent researchers to verify whether Apple is keeping its promise to scan only specific content on devices.
The move, they wrote, “is a shocking turnaround for users who are counting on the company’s leadership in privacy and security.”