Apple’s iCloud Photo Scanning Tool Is Alarming. Is There Hope For It?


    Apple’s intentions are good, but security experts think the CSAM-detection tool has wide scope for misuse. However, a middle ground is not impossible.

    In August this year, Apple announced a new child safety feature that scans users’ iCloud Photos libraries for Child Sexual Abuse Material (CSAM) and reports matches to the authorities. The feature will be baked into iOS and iPadOS, and it will scan photos on-device before they are uploaded to Apple’s cloud service. The announcement was met with strong resistance from privacy advocates and from researchers who have experience handling such technology.

    Apple sourced comments from child safety groups and cybersecurity experts praising the initiative. But for a company that has resisted attempts at bypassing iPhone encryption, even for law enforcement authorities, Apple’s move has been likened to a narrowly scoped backdoor that could be expanded at its whim or under pressure. Apple has published lengthy documents on the system’s efficacy and why it doesn’t compromise privacy, but plenty of concerns remain. The first question is whether the scope of objectionable content will stay limited to CSAM imagery, or whether Apple will budge under political pressure.


    Related: Can Apple’s AirTags Track People & Do They Protect Privacy?

    Apple’s allegiance to concepts like ‘freedom of expression’ takes a different form in China compared to what the company advertises in the United States or the rest of the world. Apple says it won’t make any compromises, but in a country that forms the backbone of its manufacturing operations and accounts for a considerable chunk of its revenue, the stakes for defying the government’s will are incredibly high. Russia and multiple African nations have recently tightened restrictions on platforms over content deemed objectionable and have been pressuring companies to hand over more data. Researchers say Apple’s CSAM scanning system is a dangerous technology and a privacy loophole waiting to be exploited, and some have even called it the beginning of an Orwellian mass surveillance system in the garb of child safety.

    Apple Needs To Be Proactive, Not Pompous


    Apple has talked extensively about the system’s accuracy, but hackers have already managed to trick an early version of it into generating a false positive when shown a non-CSAM image, according to Vice. The company says it has to obey US laws around inappropriate content and is sourcing the CSAM hash database from NCMEC (National Center for Missing & Exploited Children). However, other countries may ask Apple to rely on CSAM image hashes from local child safety organizations, which may not be as effective and would further complicate the development of a system that has already been delayed. Apple has also not built any reliable safeguards around the CSAM parental alert feature, which could prove to be a problem for children in abusive homes.
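
    To see why researchers worry about engineered false positives, consider a minimal, hypothetical sketch of perceptual-hash matching. It uses a generic “average hash” in Python, not Apple’s NeuralHash, and the function names, blocklist, and distance threshold are illustrative assumptions; the point is that a hash designed to survive resizing and re-encoding also leaves room for deliberately crafted collisions.

```python
# Conceptual sketch of perceptual-hash matching against a blocklist.
# This is a generic "average hash" for illustration only, NOT Apple's
# NeuralHash; the threshold and blocklist are hypothetical.
from PIL import Image


def average_hash(path: str, size: int = 8) -> int:
    """Downscale to size x size grayscale, then set one bit per pixel
    that is brighter than the mean. Returns a 64-bit integer hash."""
    img = Image.open(path).convert("L").resize((size, size), Image.LANCZOS)
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")


def matches_blocklist(path: str, blocklist: set[int], threshold: int = 5) -> bool:
    """Flag the image if its hash is within `threshold` bits of any
    blocklisted hash. A loose threshold tolerates re-encoding and
    resizing, but it also widens the door for engineered collisions."""
    h = average_hash(path)
    return any(hamming_distance(h, known) <= threshold for known in blocklist)
```

    A production system would use a far more robust hash plus additional checks before anything is reported, but the underlying trade-off is the same: the more tolerant a hash is of benign transformations, the more room an adversary has to craft an innocent-looking image that collides with a blocklisted one.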

    There is no doubt that the initiative is noble, but a company that prides itself on the privacy and security of its ecosystem needs to do more than release FAQ documents and downplay the risks in interviews. Apple doesn’t have to shut down the project. What it must do is address the perceived flaws and provide more than a verbal guarantee. Apple needs to treat its customers as stakeholders when it comes to privacy and, if possible, provide a legally binding assurance that the CSAM system won’t be exploited. That way, millions of iPhone users would have confidence that they can hold Apple accountable and be compensated for damages if their privacy is violated.

    As reported by The Washington Post, cybersecurity experts who have previously worked on similar projects say the whole idea is dangerous, while privacy advocates warn it sets a troubling precedent. Apple needs to hold a transparent and open dialogue with the cybersecurity community and assure it, with practical demonstrations, that its CSAM photo-scanning system cannot be tricked or exploited. Doing so will take time and a lot of caution, but it would go a long way toward addressing legitimate concerns.

    It is Apple’s responsibility to engage in a constructive dialogue with experts and arrive at a middle ground where the CSAM scanning tech can improve without being invasive. The company definitely has the reach and resources to accomplish that. Not doing so would be an arrogant stance — one that will ultimately betray Apple’s very own principles of privacy. More importantly, it will open the floodgates for more invasive systems that jeopardize the privacy of billions of smartphone users, leaving them at the mercy of obscenely wealthy corporations and authoritarian governments.

    Next: Why Apple’s Tim Cook Says Tech Privacy Regulation Is Needed

    Sources: arXiv, VICE, The Washington Post
