Meta Sues Developers of 'Nudify' Apps for Running Ads on Its Platforms
- On June 12, 2025, Meta announced it had filed a lawsuit in Hong Kong against Joy Timeline HK Limited for running ads promoting CrushAI, an AI app that creates nonconsensual nude images.
- The lawsuit follows investigations that revealed Joy Timeline repeatedly violated Meta's policies by circumventing ad review processes and running thousands of harmful AI nudify ads.
- Since early 2025, Meta's expert teams have disrupted four distinct networks of accounts promoting AI nudify services and developed technology to detect these ads even when they do not contain explicit nudity.
- Reports found over 8,000 CrushAI-related ads ran in the first two weeks of 2025, mostly on Facebook and Instagram, with Meta sharing over 3,800 unique URLs with other tech companies.
- Meta stated this legal action shows its commitment to combating platform abuse and pledged to continue using legal and technical measures to protect its community from AI misuse.
102 Articles
Behind the Blog: Advertising and Aircraft
This is Behind the Blog, where we share our behind-the-scenes thoughts about how a few of our top stories of the week came together. This week, we discuss advertising, protests, and aircraft. EMANUEL: On Thursday Meta announced that it has filed a lawsuit in Hong Kong against Joy Timeline HK Limited, the company that operates a popular nudify app called Crush that we have covered previously. Meta’s position is that it hasn’t been able to prevent …
Good morning! AI apps that create fake nude photos have infiltrated Meta's platforms. Now the tech giant is taking legal action.
A new law is cracking down on AI 'revenge porn.' Police say it closes a legal loophole
COLORADO SPRINGS, Colo. (KRDO) - A new Colorado law hopes to mitigate the use of artificial intelligence to create sexually explicit images of real people without their consent. At the beginning of June, Governor Jared Polis signed SB25-288 into law. The new law aims to expand Colorado's laws on nonconsensual pornography, or "revenge porn," to include AI-generated and altered imagery. "As technology improves, it is only a matter of time before …
AI: New law targets fake sexual image distribution
(COLORADO) — A new state law aims to protect people from the distribution of sexually explicit, digitally created, or altered fake images of themselves. The Colorado Springs Police Department (CSPD) said before this new law, they had no way to charge people for this crime. CSPD added that in 2024, its agency investigated a case in which someone created similar images. However, that person was not found in violation of any previous state laws. Off…
Coverage Details
Bias Distribution
- 71% of the sources are Center