Apple is being sued by victims of child sexual abuse after it abandoned a system that would scan iCloud for child sexual abuse materials (CSAM).
The lawsuit, filed late Saturday in the US District Court in Northern California, seeks roughly $1.2 billion in damages from Apple on behalf of a potential group of 2,680 victims. By law, victims of child sexual abuse are entitled to a minimum of $150,000 in damages.
In the lawsuit, the victims claim that Apple failed to implement the child safety tools it promised or to take any measures to detect and limit the distribution of CSAM on its devices and through its iCloud service.
The lawsuit, first reported by the New York Times (NYT), was filed by a 27-year-old woman who was molested by a relative when she was still an infant. That relative shared images of her abuse online. She said she still receives law enforcement notices, nearly every day, about someone being charged with possessing those images.
What Was Apple's Promise?
In 2021, Apple announced NeuralHash, a system that would use digital signatures from the National Center for Missing and Exploited Children and other groups to scan iCloud photo libraries for CSAM. If an account's photos matched 30 or more of these signatures, the account would be flagged for manual review, and Apple would report the account owner to law enforcement if the content was confirmed to depict child sexual abuse.
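In essence, the announced scheme amounted to comparing fingerprints of a user's photos against a database of known CSAM fingerprints and acting only once a threshold of matches was reached. The following is a minimal illustrative sketch of that threshold logic, not Apple's actual NeuralHash implementation; the names are made up, and plain string hashes stand in for the real perceptual hashes:

```python
# Illustrative sketch only: plain string hashes stand in for perceptual hashes,
# and the function/variable names are hypothetical.

MATCH_THRESHOLD = 30  # an account is flagged only once this many photos match


def count_matches(photo_hashes: list[str], known_hashes: set[str]) -> int:
    """Count how many of an account's photo hashes appear in the known CSAM set."""
    return sum(1 for h in photo_hashes if h in known_hashes)


def should_flag_for_review(photo_hashes: list[str], known_hashes: set[str]) -> bool:
    """Flag the account for human review only when the match threshold is met."""
    return count_matches(photo_hashes, known_hashes) >= MATCH_THRESHOLD
```

The threshold exists so that a single coincidental or manipulated match cannot, by itself, trigger a review or a report.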
However, Apple appears to have abandoned the system after security experts raised concerns that it could be exploited, for example by governments, to implicate innocent people. They also warned that the NeuralHash system could be manipulated to detect other material that officials in authoritarian nations find objectionable.
How Did Apple Respond?
In a statement to the NYT, Apple spokesperson Fred Sainz said the company is "committed to fighting the ways predators put children at risk." He added that the company is "urgently and actively" developing ways to address the distribution of CSAM on its services without putting the security and privacy of its users at risk.