UK Study: 25% Find Nonconsensual Deepfakes Acceptable
A police-commissioned survey found 25% of respondents accept or are neutral about creating or sharing non-consensual sexual deepfakes, highlighting risks to women and girls.
- Crest Advisory, which led the survey, found that one in four people felt creating or sharing non‑consensual sexual deepfakes was acceptable or were neutral about it, and 67% said they had seen or might have seen a deepfake, in a survey of 1,700 people in England and Wales published on November 24, 2025.
- The research followed evidence that prevalence has surged: video deepfakes rose 1,780% between 2019 and 2024, while only 14% of respondents in England and Wales were aware of the Data Act.
- Researchers noted demographic and behavioural links, with creators skewing towards men under 45 who watch pornography; 5% of respondents reported creating deepfakes, and one known prolific creator had posted over 1,800 videos.
- Police said they commissioned the study to guide next steps and are now piloting image hashing and support projects with the Revenge Porn Helpline and NCVPP to protect victims and improve investigations.
- Ninety‑two per cent of respondents agreed such deepfakes cause harm, underpinning calls for more research and education, as Crest Advisory and campaigners such as Cally‑Jane Beech urge technology companies to act while addressing misogyny.
Insights by Ground AI
10 Articles
New Research Highlights Public Attitudes Towards Non-Consensual Sexual Deepfakes
A new study commissioned by the Office of the Police Chief Scientific Adviser and conducted by Crest Advisory reveals concerning public attitudes towards non-consensual sexual deepfakes, a growing form of technology-enabled abuse.
Coverage Details
- Total News Sources: 10
- Leaning Left: 2 · Leaning Right: 1 · Center: 0
- Bias Distribution: 67% Left, 33% Right