Apple accused of ignoring child abuse material on iCloud

Apple cancelled its major CSAM detection proposals but introduced features such as automatically blocking nudity sent to children

A proposed class action suit alleges that Apple hides behind privacy claims to avoid stopping the storage of child sexual abuse material on iCloud, as well as alleged grooming over iMessage.

A new filing with the US District Court for the Northern District of California has been brought on behalf of an unnamed 9-year-old plaintiff. Identified only as Jane Doe in the complaint, she was allegedly coerced into making and uploading CSAM on her iPad.

“When 9-year-old Jane Doe received an iPad for Christmas,” says the filing, “she never imagined perpetrators would use it to coerce her to produce and upload child sexual abuse material (“CSAM”) to iCloud.”

“This lawsuit alleges that Apple exploits ‘privacy’ at its own whim,” says the filing, “at times emblazoning ‘privacy’ on city billboards and slick commercials and at other times using privacy as a justification to look the other way while Doe and other children’s privacy is utterly trampled on through the proliferation of CSAM on Apple’s iCloud.”


