
The West Virginia attorney general's office sued Apple on Thursday, claiming the tech giant allowed child sexual abuse materials to be stored and distributed on its iCloud service.
The lawsuit claims that Apple prioritized user privacy over child safety for years. Because the company tightly controls its hardware, software and cloud infrastructure, the attorney general's office argued, it cannot claim to be unaware of the issue.
The lawsuit says US-based tech companies are federally required to report detected child sexual abuse material to the National Center for Missing and Exploited Children. While Google filed 1.47 million reports in 2023, Apple allegedly filed only 267.
"These images are a permanent record of a child's trauma, and that child is revictimized every time the material is shared or viewed," West Virginia Attorney General JB McCuskey said in a news release. "This conduct is despicable, and Apple's inaction is inexcusable."
"At Apple, protecting the safety and privacy of our users, especially children, is central to what we do. We are innovating every day to combat ever-evolving threats and maintain the safest, most trusted platform for kids," an Apple spokesperson said in a comment to CNN.
Apple also pointed to its Communication Safety feature, which detects nudity in content a child receives or attempts to send, blurring the image and warning the child. It works in apps like Messages and FaceTime, as well as over AirDrop, in the iPhone's Contact Posters feature and in the Photos app image selection tool. The spokesperson added that Apple's parental controls and features "are designed with the safety, security, and privacy of our users at their core."
Big Tech companies use tools like Microsoft PhotoDNA to detect child exploitation images, the West Virginia attorney general's office said. Microsoft says it provides this technology for free to qualified organizations, including tech companies. Apple said in 2021 it would use its own model, called NeuralHash, to detect child sexual abuse materials, but it abandoned the plan following backlash over privacy concerns. The complaint alleges NeuralHash is a far inferior tool to PhotoDNA.
The lawsuit comes amid increased scrutiny of Big Tech's impact on children. In 2023, the New Mexico attorney general's office accused Meta of shutting down accounts it used to investigate alleged child sexual abuse on Facebook and Instagram. New Mexico Attorney General Raúl Torrez accused Meta in the lawsuit of creating a "breeding ground" for child predators on those platforms.
Meta strongly pushed back on the claims at the time, saying that "we use sophisticated technology, hire child safety experts, report content to the National Center for Missing and Exploited Children, and share information and tools with other companies and law enforcement, including state attorneys general, to help root out predators."
West Virginia's attorney general's office is seeking statutory and punitive damages, injunctive relief and a requirement that Apple implement effective detection measures.
The-CNN-Wire & 2026 Cable News Network, Inc., a Time Warner Company. All rights reserved.