Tech companies commit to fighting harmful AI sexual imagery by curbing nudity from datasets
- Several leading artificial intelligence companies have pledged to remove nude images from their training datasets to curb harmful sexual deepfake imagery, according to a White House announcement.
- The commitment includes companies such as Adobe, Microsoft, and OpenAI, and focuses on the responsible sourcing of training data, as highlighted by the Biden administration.
- These efforts are part of a broader campaign against image-based sexual abuse, which disproportionately targets women and children.
Coverage Details
- 15 news sources: 12 leaning Left, 3 Center, 0 leaning Right
- Bias distribution: 80% Left, 20% Center