Tech companies commit to fighting harmful AI sexual imagery by curbing nudity from datasets

  • Several leading artificial intelligence companies have pledged to remove nude images from their training datasets to curb harmful sexual deepfake imagery, according to a White House announcement.
  • The commitment, highlighted by the Biden administration, includes companies such as Adobe, Microsoft, and OpenAI, and focuses on the responsible use of data.
  • These efforts are part of a broader campaign against image-based sexual abuse, which disproportionately targets vulnerable groups such as women and children.
Insights by Ground AI

Bias Distribution

  • 80% of the sources lean Left
