OpenAI says it has no plan to use Google's in-house chip
UNITED STATES, JUN 30 – OpenAI continues to rely on Nvidia and AMD chips while developing its own AI processor, aiming to reduce third-party dependence, with the custom chip expected to reach tape-out by year-end.
- In early July 2025, two days after reports suggested it might adopt Google's in-house AI chips, OpenAI clarified it has no plans to deploy the TPUs at scale, though it has been testing them.
- OpenAI cited the scale and infrastructure work required to adopt new hardware broadly, amid rising AI hardware costs.
- OpenAI relies mainly on Nvidia GPUs and AMD chips, alongside an ongoing Google Cloud partnership and continued use of CoreWeave servers for computing power.
- OpenAI’s custom chip remains on track for a 2025 tape-out, a step toward reducing its reliance on third-party hardware amid growing demand for AI chips.
- The in-house chip effort is intended to secure supply independence as surging AI hardware demand continues to lift Nvidia's valuation.
18 Articles
OpenAI says no plans to use Google's AI chips at scale
OpenAI said it has no active plans to use Google's in-house chip to power its products, two days after several news outlets reported the AI lab was turning to its competitor's AI chips to meet growing demand.

OpenAI says it has no plan to use Google's in-house chip
OpenAI said it has no active plans to use Google's in-house chip to power its products, two days after Reuters and other news outlets reported on the AI lab's move to turn to its competitor's artificial intelligence chips to meet growing demand. A spokesperson for OpenAI said on Sunday that while the AI lab is in early testing with some of Google's TPUs, it has no plans to deploy them at scale …
OpenAI Confirms It Won’t Scale Use of Google’s In-House Chips
OpenAI has clarified that, despite initial trials, it does not currently plan to deploy Google's Tensor Processing Units (TPUs) at scale for its AI services. The announcement follows reports suggesting the company was exploring TPUs to support growing compute needs. A company spokesperson said that OpenAI is in early testing with some of Google's chips but has no active intention of rolling them out broadly. For now, the company continues to rely …


With OpenAI, there are no allegiances - just compute at all costs
Google's TPUs might not be on Altman's menu just yet, but he's never been all that picky about hardware. No longer bound to Microsoft's infrastructure, OpenAI is looking to expand its network of compute providers to the likes of Oracle, CoreWeave, and apparently even rival model builder Google. …
Coverage Details
Bias Distribution
- 57% of the sources are Center