'Tool for grifters': AI deepfakes push bogus sexual cures
- AI-generated videos on TikTok feature a shirtless man promoting an unproven supplement that he claims enlarges male genitalia, using an oversized carrot as a euphemism.
- The rise of generative AI has enabled cheap mass production of deceptive videos that use fake celebrity endorsements and euphemisms to evade content-moderation policies on explicit language.
- Researchers, including Alexios Mantzarlis, note a surge of AI doctor avatars and deepfakes impersonating celebrities that promote dubious sexual remedies with millions of views.
- Zohaib Ahmed and Abbie Richards describe AI as a 'useful tool for grifters' producing misleading ads quickly, while Mantzarlis calls impersonation videos 'particularly pernicious' for online trust.
- The proliferation of AI-generated sexual supplement ads poses unique moderation challenges: removal becomes a game of whack-a-mole, and consumers remain exposed to potentially harmful products.
59 Articles
'Tool For Grifters': AI Deepfakes Push Consumers Into Buying Dubious Products
Rapid advances in artificial intelligence have fueled what researchers call an AI dystopia, a deception-filled online universe designed to manipulate unsuspecting users into buying dubious products.

'Tool for grifters': AI deepfakes push bogus sexual cures
Holding an oversized carrot, a brawny, shirtless man promotes a supplement he claims can enlarge male genitalia -- one of countless AI-generated videos on TikTok peddling unproven sexual treatments.
AI Deepfakes push consumers to buy dubious products
Holding an oversized carrot, a muscular shirtless man promotes a supplement that he claims can enlarge the male sexual organ -- one of countless AI-generated videos on TikTok pushing unproven sexual treatments. The rise of generative AI has made such videos easy to produce and financially lucrative, requiring little human supervision, often […]
Deepfakes, Scams, and the Age of Paranoia - WorldNL Magazine
“What’s funny is, the low-fi approach works,” says Daniel Goldman, a blockchain software engineer and former startup founder. Goldman says he began changing his own behavior after he heard a prominent figure in the crypto world had been convincingly deepfaked on a video call. “It put the fear of god in me,” he says. Afterwards, he warned his family and friends that even if they hear what they believe is his voice or see him on a video call askin…
Coverage Details
Bias Distribution
- 48% of the sources are Center