
OpenAI Turns to Google’s AI Chips to Power Its Products, The Information Reports

SAN FRANCISCO, UNITED STATES, JUN 27 – OpenAI has started renting Google Cloud's Tensor Processing Units to reduce AI inference costs and diversify hardware beyond Nvidia GPUs, marking a strategic shift in compute resources.

  • On Friday, OpenAI began renting Google's Tensor Processing Units (TPUs) via Google Cloud to power ChatGPT, marking its first significant use of non-Nvidia chips.
  • The shift is intended to reduce inference costs and meet soaring demand amid rising Nvidia GPU prices and supply shortages.
  • More broadly, OpenAI's adoption of Google TPUs could challenge Nvidia's dominance by positioning TPUs as a more affordable AI hardware alternative, fostering greater competition in the AI chip industry.
Insights by Ground AI

27 Articles

Lean Right

So far, Nvidia and Microsoft have been the partners of choice for ChatGPT maker OpenAI. But now, according to insider reports, Sam Altman's company is turning to Google.

·Vienna, Austria

Bias Distribution

  • 50% of the sources are Center


Techmeme broke the news in California, United States on Friday, June 27, 2025.