Apple stated earlier today that iPhone users’ entire photo libraries will be checked for known child abuse images if they are stored in the online iCloud service. The disclosure came in a series of media briefings in which Apple is seeking to dispel alarm over its announcement last week that it will scan users’ phones, tablets and computers for millions of illegal pictures.
Some have said they expected that governments would seek to force the iPhone maker to expand the system to peer into devices for other material.
Apple’s response to that concern was unequivocal: “We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future.”
In the briefing on Monday, Apple officials said the company’s system, which will roll out this fall with the release of its iOS 15 operating system, will check existing files on a user’s device if those photos are synced to the company’s storage servers.
Apple’s system does not check videos before they are uploaded to the company’s cloud, but the company said it plans to expand its system in unspecified ways in the future. For more on this, read the full Reuters report.
To counter objections from critics such as Edward Snowden and the Electronic Frontier Foundation, Apple made it absolutely clear on Friday that its upcoming child safety features do not open a backdoor that reduces privacy.
Apple further stated that its system is an improvement over industry-standard approaches because it uses its control of hardware and sophisticated mathematics to learn as little as possible about the images on a person’s phone or cloud account while still flagging illegal child pornography on cloud servers. It does not scan the actual images; it only compares hashes, the unique numbers that correspond to image files.
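The idea of comparing hashes rather than image content can be sketched in a few lines of Python. Note that this is a simplified illustration only: Apple’s actual system uses a perceptual hashing scheme (NeuralHash) and cryptographic threshold techniques, not the plain SHA-256 lookup shown here, and the hash database and byte strings below are hypothetical placeholders.

```python
import hashlib

# Hypothetical database of known-image hashes (hex digests).
# In Apple's real system this would be a database of perceptual
# hashes supplied by child-safety organizations, not SHA-256 values.
KNOWN_HASHES = {
    hashlib.sha256(b"known-flagged-image-bytes").hexdigest(),
}

def image_hash(image_bytes: bytes) -> str:
    """Compute a hash of an image's raw bytes."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_hash(image_bytes: bytes) -> bool:
    """Return True if the image's hash appears in the known-hash set.

    Only the derived hash is compared; the matching step never
    inspects the image content itself.
    """
    return image_hash(image_bytes) in KNOWN_HASHES

# Usage with placeholder byte strings:
print(matches_known_hash(b"known-flagged-image-bytes"))  # True: hash is in the set
print(matches_known_hash(b"ordinary-photo-bytes"))       # False: no match
```

One consequence of this design, and the point Apple emphasizes, is that a non-matching image reveals nothing about its contents to the server: the comparison operates on the hash alone.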
Below is a copy of Apple’s FAQ on “Expanded Protections for Children.”