Apple is reportedly facing a consumer protection lawsuit filed by West Virginia Attorney General (AG) John “JB” McCuskey. The AG alleges that the Cupertino-based tech giant has failed to adequately prevent child sexual abuse material (CSAM) from being stored and shared through iPhones and iCloud services. The complaint claims that Apple prioritised privacy branding and its own business interests over child safety, even as other technology companies adopted detection systems to address such content.
According to a CNBC report, the lawsuit argued that companies including Google, Microsoft, and Dropbox have implemented tools such as PhotoDNA to identify and block illegal material. PhotoDNA, a tool developed by Microsoft and Dartmouth College in 2009, uses hashing technology to automatically detect CSAM images that have already been identified and reported to authorities.
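For readers curious about the general mechanism, the sketch below illustrates hash-based matching in Python. It is not PhotoDNA itself: the real system relies on a proprietary perceptual hash that tolerates resizing and re-encoding, whereas this example uses an ordinary SHA-256 digest and hypothetical file paths and hash values purely to show the matching step against a database of previously reported images.

```python
import hashlib
from pathlib import Path

# Illustrative sketch only: PhotoDNA uses a proprietary perceptual hash that
# survives resizing and re-encoding; a plain SHA-256 digest, as used here,
# only matches byte-identical copies. The hash list and file paths below are
# hypothetical placeholders.

KNOWN_HASHES = {
    # Digests of previously identified and reported images (placeholder value).
    "3f79bb7b435b05321651daefd374cd21b0f4a4f2a1d2c3e4f5a6b7c8d9e0f1a2",
}


def file_digest(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def is_known_image(path: Path) -> bool:
    """Check an uploaded file against the database of known digests."""
    return file_digest(path) in KNOWN_HASHES


if __name__ == "__main__":
    # Scan a hypothetical uploads folder and flag any exact matches.
    for upload in Path("uploads").glob("*.jpg"):
        if is_known_image(upload):
            print(f"{upload} matches a known, previously reported image")
```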
If West Virginia’s lawsuit succeeds, it could require the company to introduce changes to its design or data security practices. The state is seeking statutory and punitive damages, as well as injunctive relief requiring Apple to implement CSAM-detection measures.
What Apple said about protecting customers’ privacy
In an emailed statement to CNBC, an Apple spokesperson pointed to the company's parental control tools and features such as Communication Safety, which warns children when they receive or attempt to send images containing nudity, as part of its efforts to protect users.
Why some critics are ‘not happy’ with Apple’s measures to protect customer privacy
In 2021, Apple announced a CSAM detection feature designed to automatically detect and remove images of child exploitation, and to report content uploaded to iCloud in the US to the National Center for Missing & Exploited Children. However, the company later abandoned the plans after privacy advocates argued that the technology could create a back door for government surveillance and could be repurposed to censor other types of content on iOS devices.
Apple's subsequent approach has drawn wide criticism. In 2024, the National Society for the Prevention of Cruelty to Children, a UK child protection group, criticised the company for failing to adequately monitor CSAM-related activity linked to its products.
In a separate case filed in 2024 in the Northern District of California, thousands of survivors of child sexual abuse sued Apple for abandoning its earlier plans to detect CSAM and allowing such content to continue circulating online, forcing survivors to relive their trauma.