Apple has released a new explanation of what its child-safety technology looks for in iPhone photos.
According to a report published by the technology website The Verge, Apple said the new technology does not violate its users' privacy; rather, it scans, via its iCloud service, photos on teenagers' iPhones as part of a set of child-safety tools.
The company indicated that the technology searches iPhones and iPads for images of child sexual abuse material.
The company has released a new paper detailing the safeguards it hopes will increase user confidence in the initiative, including a rule to flag only images found in multiple child-safety databases with different government affiliations, for more robust protection against misuse.
Apple's new strategy has nonetheless drawn sharp criticism from some cryptography and privacy experts.
The document says that Apple will not rely on a single government-affiliated database, such as that of the National Center for Missing and Exploited Children in the United States, but will instead only match images that appear in databases from at least two groups with different national affiliations.
The point is that no single government would have the power to covertly insert unrelated content for censorship purposes, since such an image would not match a hash in any other database.
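The cross-database rule described above amounts to a simple intersection check. The following is a minimal sketch of that idea only; the database contents, jurisdiction names, and function are hypothetical illustrations, not Apple's actual implementation:

```python
# Hypothetical sketch of the "at least two jurisdictions" matching rule.
# Hash values and jurisdiction labels below are dummy data for illustration.

CSAM_DATABASES = {
    # jurisdiction -> set of known image hashes
    "US": {"hash_a", "hash_b", "hash_c"},
    "UK": {"hash_a", "hash_d"},
}

def should_flag(image_hash, databases):
    """Flag an image only if its hash appears in databases
    maintained by at least two distinct jurisdictions."""
    matches = [j for j, hashes in databases.items() if image_hash in hashes]
    return len(matches) >= 2

print(should_flag("hash_a", CSAM_DATABASES))  # in both US and UK -> True
print(should_flag("hash_b", CSAM_DATABASES))  # only in US -> False
```

Under this scheme, a hash inserted into only one government's database never crosses the threshold, which is the safeguard the document describes.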