Apple sued for allowing child sexual abuse material to be stored in iCloud
Apple has been sued over allegations that the company knowingly allowed its iCloud service to store child sexual abuse material (CSAM). Thousands of abuse victims are acting as plaintiffs, claiming that Apple’s inaction caused them additional harm.
One of the plaintiffs, a 27-year-old woman, stated that she was abused from childhood: a relative molested her, recorded videos of the abuse and shared the images online. To this day she continues to receive notices from law enforcement agencies that these images have been found on various devices, including images stored in Apple’s iCloud.
The lawsuit notes that in August 2021 Apple announced a new feature called CSAM Detection, which would use NeuralHash technology to detect such content stored in iCloud. Around the same time, it emerged that Eric Friedman, head of the company’s anti-fraud department, had described Apple as “the largest platform for the distribution of child pornography.”
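For context, the announced system was to compare hashes of photos being uploaded to iCloud against a database of hashes of known abuse imagery. Below is a minimal illustrative sketch of that general hash-matching idea. It is not Apple’s NeuralHash (which is a proprietary neural perceptual hash); it uses the open-source `imagehash` library’s pHash as a stand-in, and the hash values, file name and distance threshold are hypothetical.

```python
# Illustrative sketch only: perceptual-hash matching against a list of known
# hashes. NOT Apple's NeuralHash; pHash from the `imagehash` library is used
# as a stand-in, and all values below are placeholders.
from PIL import Image
import imagehash

# Hypothetical database of perceptual hashes of known abuse imagery
# (in practice such hashes come from a clearinghouse, not from the vendor).
KNOWN_HASHES = [
    imagehash.hex_to_hash("d1d1d1d1d1d1d1d1"),
    imagehash.hex_to_hash("a5a5a5a5a5a5a5a5"),
]

MAX_HAMMING_DISTANCE = 4  # assumed tolerance for near-duplicate matches


def matches_known_hash(path: str) -> bool:
    """Return True if the image's perceptual hash is close to any known hash."""
    candidate = imagehash.phash(Image.open(path))
    # Subtracting two ImageHash objects yields their Hamming distance.
    return any(candidate - known <= MAX_HAMMING_DISTANCE for known in KNOWN_HASHES)


if __name__ == "__main__":
    print(matches_known_hash("uploaded_photo.jpg"))  # hypothetical file
```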
However, due to privacy concerns raised by activists and security researchers who feared potential misuse of the feature, Apple abandoned the program.
The lawsuit claims that the company’s decision demonstrates willful disregard for the safety of children. According to the document, “instead of using the tools it had created to identify, remove and report images of abuse, Apple allowed this material to circulate, forcing victims to relive the trauma that has shaped their lives.”
The lawsuit seeks to force Apple to take robust measures to prevent the storage and distribution of CSAM on its platform. It also seeks compensation for a potential class of 2,680 victims.
Apple has not formally responded to the lawsuit, but a company representative said that Apple actively develops ways to combat crimes of child sexual abuse without compromising the security and privacy of its users.
Tags: apple icloud csam abuse children lawsuits