Apple Faces $1.2 Billion Lawsuit Over Abandoned Child Safety Scanning System



A group of child sexual abuse victims has filed a lawsuit against Apple Inc., seeking damages over the company's decision to abandon its planned iCloud scanning tool for detecting child sexual abuse material (CSAM).

The lawsuit, filed in federal court in Northern California, represents a potential class of 2,680 victims and seeks damages exceeding $1.2 billion. The legal action centers on Apple's 2021 announcement, and subsequent withdrawal, of technology designed to identify and flag CSAM stored in iCloud.
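
For context on how such detection tools generally operate: rather than inspecting photos directly, they compare perceptual hashes of uploaded images against a database of hashes derived from known abuse material, flagging only close matches. The sketch below is a heavily simplified illustration of that matching pattern, using a toy "average hash" and a hypothetical distance threshold; Apple's proposed system used a learned perceptual hash (NeuralHash) combined with on-device cryptographic matching, none of which is reproduced here.

```python
from typing import List, Set

def average_hash(pixels: List[List[int]]) -> int:
    """Toy perceptual hash of an 8x8 grayscale image.

    Each bit records whether a pixel is brighter than the image mean,
    so visually similar images yield hashes differing in only a few bits.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    # Number of bit positions where two hashes disagree.
    return bin(a ^ b).count("1")

def matches_known(h: int, known: Set[int], max_distance: int = 4) -> bool:
    # Flag an image whose hash is within max_distance bits of any hash
    # in the database of known material (threshold chosen for illustration).
    return any(hamming(h, k) <= max_distance for k in known)
```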

According to court documents, a 27-year-old plaintiff regularly receives notifications that images of their abuse have been discovered on Apple devices and in iCloud storage. One such notification, from late 2021, concerned CSAM found on a MacBook in Vermont.

The plaintiffs argue that Apple's failure to deploy its proposed detection tools has allowed abuse materials to continue circulating on its platforms. Under U.S. federal law, victims of child sexual abuse are entitled to minimum damages of $150,000; across 2,680 claimants, that floor alone amounts to roughly $402 million, and the $1.2 billion sought is consistent with those damages being trebled.

Apple spokesperson Fred Sainz responded to the lawsuit, saying the company remains committed to fighting child exploitation while preserving user privacy. He pointed to existing protections such as Communication Safety, which warns children when they receive or attempt to send images containing nudity.

The legal challenge follows recent criticism from the UK's National Society for the Prevention of Cruelty to Children, which accused Apple of underreporting CSAM cases.

Apple shelved its CSAM detection plans in late 2022, following widespread concern from cybersecurity experts that the scanning infrastructure could compromise user privacy and be repurposed for government surveillance.
