
Here's How ChatGPT Was Tricked Into Revealing Windows Product Keys

JUL 10 – Researchers exploited ChatGPT's safety flaws by framing requests as guessing games, revealing valid Windows keys including one linked to Wells Fargo, showing AI guardrails remain vulnerable.

Summary by TechSpot
As explained by 0DIN GenAI Bug Bounty Technical Product Manager Marco Figueroa, the jailbreak works by leveraging the game mechanics of large language models such as GPT-4o....

8 Articles


A free product key for Windows 11, or any other version, can be obtained from OpenAI's artificial intelligence with a simple three-word trick.

Madrid, Spain

A new jailbreak method allows ChatGPT to produce valid Windows activation keys, reinforcing concerns about the security of generative AIs after the Copilot case.


Bias Distribution

  • 67% of the sources are Center

GBHackers On Security broke the news on Thursday, July 10, 2025.

