NBC News: Google Says Attackers Used 100,000+ Prompts to Try to Clone AI Chatbot Gemini
Google reports that more than 100,000 prompts were used in attempts to extract Gemini's model logic and clone its capabilities, calling the activity intellectual property theft by commercial and research actors.
5 Articles
AI and its chatbots are built to help users and give them the answers they need. However, this also opens the door to behavior that can distort how they operate. That is what...
What Happens When You Ask a Chatbot to Reveal Itself?
Google's flagship AI is getting peppered with questions, not all of them innocent. In a new report, the company says its Gemini chatbot has been targeted by "commercially motivated" actors trying to reverse-engineer it—by asking it exactly how it works. The tactic, known as model extraction, involves bombarding a...
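For readers unfamiliar with the term: at its simplest, model extraction means harvesting a target model's outputs at scale and using them as training data for a copycat. The sketch below is purely illustrative and assumes a hypothetical `query_model` helper standing in for whatever API an attacker targets; it is not based on any technical detail in Google's report.

```python
import json


def query_model(prompt: str) -> str:
    """Hypothetical stand-in for a call to the target chatbot's API."""
    raise NotImplementedError("replace with a real API call")


def harvest(prompts: list[str], out_path: str = "distill_set.jsonl") -> None:
    """Collect prompt/response pairs that could later be used to fine-tune a copycat model."""
    with open(out_path, "w", encoding="utf-8") as f:
        for prompt in prompts:
            response = query_model(prompt)  # one of the many repeated queries
            f.write(json.dumps({"prompt": prompt, "response": response}) + "\n")


# An extraction campaign is essentially this loop run at enormous scale --
# Google's report describes one that issued more than 100,000 prompts.
```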
Google Says People Are Copying Its AI Without Its Permission, Much Like It Scraped Everybody's Data Without Asking to Create Its AI in the First Place
Google has relied on a tremendous amount of material without permission to train its Gemini AI models. The company, alongside many of its competitors in the AI space, has been indiscriminately scraping the internet for content, without compensating rightsholders, racking up many copyright infringement lawsuits along the way. But when it comes to its own tech being copied, Google has no problem pointing fingers. This week, the company accused “co…
NBC News: Google says attackers used 100,000+ prompts to try to clone AI chatbot Gemini
“Google says its flagship artificial intelligence chatbot, Gemini, has been inundated by ‘commercially motivated’ actors who are trying to clone it by repeatedly prompting it, sometimes with thousands of different queries — including one campaign that prompted Gemini more than 100,000 times.”
Coverage Details
Bias Distribution
- 67% of the sources lean Left