Google Commits $750M to Agentic AI, Unveils New Chips
Google said the new chips are designed to cut training cycles from months to weeks and deliver 80% better inference price-performance.
- On Wednesday, Google unveiled its eighth-generation Tensor Processing Units at Cloud Next in Las Vegas, splitting the product line into the TPU 8t for training and the TPU 8i for inference.
- Google Senior Vice President and Chief Technologist for AI and Infrastructure Amin Vahdat said the specialized chips address the rise of AI agents, which require distinct hardware for efficient training and serving.
- The training-focused TPU 8t delivers 124% more performance per watt than prior generations, while the TPU 8i improves that metric by 117% using high-bandwidth memory to overcome the "memory wall."
- Both accelerators will become generally available later this year as instances on Google Cloud Platform, offering 2.8 times the training performance of the previous Ironwood TPU at the same price.
- Moving beyond x86 processors, Google is utilizing its homegrown Arm-based Axion CPUs for TPU hosts, joining rivals Amazon, Microsoft, and Nvidia in developing custom silicon to reduce dependence on external suppliers.
59 Articles
Wall Street Pro Thinks Google’s AI Chip Edge Is Getting Harder to Ignore
Quick Read Alphabet (GOOG) unveiled its eighth-generation TPUs (tensor processing units), the TPU 8t and 8i, which offer 80% memory improvement and meaningful cost savings compared to prior generations. As AI workloads shift from training to inference, Google's TPUs are gaining traction with efficiency-focused AI firms. The anal…
On Wednesday, Google unveiled two new chips for artificial intelligence (AI): one to train powerful new generative AI models, the other for fast, economical everyday inference, demand for which could explode with the rapid global deployment of autonomous AI agents.
San Francisco.- The Google Cloud division of Alphabet Inc. presented the latest generation of its tensor processing unit (TPU), an in-house chip designed to optimize and streamline AI computing services. The new line will be available in two versions, the company announced at its Google Cloud Next event, where it also presented a $750 million fund to boost enterprise adoption of AI and showed tools for building AI agents. …
Coverage Details
Bias Distribution
- 55% of the sources are Center