
West Virginia Sues Apple Over Alleged Failure to Prevent Child Sexual Abuse Material

West Virginia alleges that Apple prioritized user privacy over child safety, noting the company reported only 267 CSAM cases in 2023 compared with millions reported by competitors; the state seeks damages and safer product designs.

  • On Thursday, West Virginia Attorney General JB McCuskey filed a consumer-protection lawsuit in Mason County accusing Apple of enabling CSAM storage and distribution.
  • Apple's 2021 plan to scan iCloud photos relied on NeuralHash, its CSAM-detection model, but the state says Apple prioritized privacy branding and abandoned detection, a choice McCuskey calls conscious.
  • The complaint points to a 2020 internal message in which an Apple executive said the company was "the greatest platform for distributing child porn," and cites Reuters data showing Apple filed only 267 CSAM reports in 2023, compared with Google's 1.47 million and Meta's more than 30.6 million.
  • Apple has moved to dismiss, arguing it is shielded from liability under Section 230 of the Communications Decency Act, a law that broadly protects internet companies from lawsuits over user-generated content.
  • The suit places Apple at the center of a wider debate over end-to-end encryption and the federal duty to report detected CSAM to the National Center for Missing and Exploited Children, and it marks the first government lawsuit of its kind, following 2024 litigation and watchdog findings.
Insights by Ground AI

79 Articles

KSBW
Reposted by 8 other sources
Center

West Virginia sues Apple over alleged distribution of child sexual abuse materials on iCloud and devices

The West Virginia attorney general’s office sued Apple on Thursday, claiming the tech giant allowed child sexual abuse materials to be stored and distributed on its iCloud service.

Lean Right

A U.S. state prosecutor announced an action against Apple on Friday, alleging that its iCloud storage service serves as a "refuge" for child sexual abuse material. The action was filed by West Virginia Attorney General JB McCuskey, who accused …

Brazil

Bias Distribution

  • 58% of the sources are Center


Reuters broke the news in the United Kingdom on Thursday, February 19, 2026.
