Microsoft rolls out next generation of its AI chips, takes aim at Nvidia's software
Maia 200 is 30% more cost-efficient than competitors and designed to run large AI models faster in Azure data centers, reducing Microsoft's Nvidia dependence.
- On Jan 26, Microsoft unveiled Maia 200 in San Francisco and is deploying it this week in its Iowa data center, describing it as 'the most efficient inference system Microsoft has ever deployed'.
- Tech giants are designing their own chips to cut reliance on NVIDIA, and Microsoft built Maia 200 to compete with Amazon Web Services and Google while addressing surging demand from generative AI developers.
- Built on Taiwan Semiconductor Manufacturing Co.'s 3-nanometer process, Maia 200 contains over 100 billion transistors, delivers over 10 PFLOPS and around 5 PFLOPS, and links four chips per server with up to 6,144 chips wired together.
83 Articles
Tech giant Microsoft on Monday presented Maia 200, the second generation of its artificial intelligence (AI) chip, with which it seeks to reduce its dependence on Nvidia and compete against Google and Amazon in the cloud.
Microsoft built a 750W AI chip to challenge Nvidia's dominance, claims 3x performance gains over Amazon
Microsoft recently announced Maia 200, a new AI accelerator specifically designed for inference workloads. According to Redmond, Maia 200 can deliver "dramatic" improvements for AI applications and is already deployed in select US data centers on the Azure platform.
Live Stock Market News: S&P 500 Moves Higher Before Big Tech Earnings
Nvidia Holds Ground Despite Microsoft Chip Competition
Nvidia traded essentially flat over the past week at $186.47 despite Microsoft unveiling its proprietary Maia 200 AI chip. The semiconductor leader's resilience stems from strong institutional buying, including a 43.1% stake increase by FengHe Fund Management, and continued dominance in AI infrastructure. Samsung's upcoming HBM4…
Does This New Chip Threaten Nvidia?
Key Points
- Microsoft says Maia 200 will serve multiple models, including the latest GPT models from OpenAI.
- Nvidia's data center business is still expanding quickly, even as hyperscalers invest in custom silicon.
- Over time, in-house inference chips could pressure pricing, raising the bar for Nvidia to stay ahead of the curve.
Microsoft (NASDAQ: MSFT) recently introduced a new in-house AI (artificial intelligenc…
Coverage Details
Bias Distribution
- 57% of the sources are Center