Apple is facing a class-action lawsuit accusing the company of knowingly allowing its iCloud storage service to host child sexual abuse material (CSAM).
The plaintiffs include thousands of victims who claim Apple’s inaction caused them further harm. One of them, a 27-year-old woman, said she had been abused since infancy: a relative recorded videos of the abuse and distributed the images online via iCloud. She continues to receive notifications from law enforcement whenever the material is found on new devices.
The lawsuit cites Apple’s 2021 decision to abandon its planned CSAM detection program, which would have used NeuralHash technology to identify such content in iCloud. The program was first postponed and ultimately dropped after privacy advocates and security experts raised concerns about potential misuse. The plaintiffs argue that this decision shows a “deliberate disregard for child safety.”
The lawsuit demands that Apple implement effective measures to prevent CSAM storage and distribution on its platform. It also seeks compensation for a potential class of 2,680 victims, claiming Apple’s inaction has exacerbated their trauma by forcing them to relive the consequences of abuse.
Apple has not formally responded to the lawsuit, though a company spokesperson said Apple continues to innovate in combating child sexual exploitation without compromising user privacy.