AI being used to generate deepfake child sex abuse images based on real victims, report finds
- A report from the U.K.-based Internet Watch Foundation reveals an increase in AI-generated child sexual abuse material online.
- The IWF identified 3,512 AI-generated CSAM images and videos, a 17% increase on its fall 2023 review.
- The report stresses that child protection must be prioritized in AI safety legislation to combat the growing threat.
AI-generated images of child sex abuse use real victims as reference material
AI tools used to generate deepfake images of child sexual abuse rely on photos of real victims as reference material, a report has found. The Internet Watch Foundation (IWF), which made the discovery, cited the example of a girl referred to as Olivia (not her real name), who was tortured and raped between the ages of three and eight. Olivia, now in her t…
AI advances could lead to more child sex abuse videos, warns organisation
Paedophiles are using advances in artificial intelligence to produce AI-generated videos of child sexual abuse, and the volume of such material could grow as the technology improves, according to a watchdog.
Coverage Details
Bias Distribution
- 50% of the sources lean Left