Published 1 year ago

Apple confirms that it has stopped plans to roll out CSAM detection system

Summary by Ground News
Apple will no longer scan for Child Sexual Abuse Material (CSAM) in iCloud Photos. The news was confirmed by Apple's vice president of software engineering, Craig Federighi, in an interview with The Wall Street Journal. The plan had been widely criticized over privacy concerns. Other child safety features, such as communication safety restrictions in iMessage, remain available in iOS.

Bias Distribution

  • 100% of the sources are Center