Cerebras Systems, Amazon Strike Deal to Offer Cerebras AI Chips on Amazon's Cloud
The new AWS service will link Cerebras and Amazon Trainium3 chips to accelerate AI inference and aims to outperform Nvidia in price-performance, Amazon said.
- On March 13, Amazon.com and Cerebras Systems announced a deal to combine AWS Trainium3 chips with Cerebras CS-3 systems in AWS data centers to speed chatbots and coding tools.
- Using inference disaggregation, the partners will split workloads between Trainium3 chips and hardware from Cerebras, a $23.1 billion startup, to boost speed and efficiency.
- Through Bedrock, AWS will offer the new combined inference service; Amazon said it cannot yet compare it directly to Nvidia but expects better value, with the new speeds available to customers in the next couple of months.
- Markets reacted as Amazon shares slipped 0.98% after AWS Vice President David Brown said, "The result will be inference that's an order of magnitude faster and higher performance than what's available today," and Cerebras CEO Andrew Feldman added that the partnership will bring the fastest inference to customers worldwide.
- Analysts note that Nvidia is expected to unveil a GPU-Groq pairing next week, while AWS plans to add support for Amazon Nova later this year; Cerebras' earlier $10 billion deal with OpenAI highlights its scale.
13 Articles
Amazon's AWS Partners With Cerebras Systems To Deliver Faster AI Inference For LLMs - Amazon.com (NASDAQ:AMZN)
Amazon Web Services (NASDAQ: AMZN) and Cerebras Systems announced a collaboration on March 13 to deliver ultra-fast AI inference on Amazon Bedrock by combining AWS Trainium and Cerebras CS-3 hardware in a first-of-its-kind cloud deployment.
Cerebras Systems, Amazon strike deal to offer AI chips on AWS cloud
Amazon Web Services and Cerebras Systems have joined forces to combine their computing chips in a new service aimed at speeding up AI applications such as chatbots and coding tools. Cerebras chips will be integrated into Amazon data centres, where they will work alongside Amazon's Trainium3 AI chips to enhance AI inference capabilities.
[Digital Daily Reporter Lee Sang-il] Amazon Web Services (AWS) is preparing a new AI inference service in collaboration with the AI chip startup Cerebras Systems. The strategy is to improve the execution efficiency of AI software by combining AWS's self-developed Trainium processors with Cerebras chips. AWS announced plans to launch the new service in the second half of 2026 in partnership with Cerebras. The two companies have been preparing …
Amazon taps Cerebras wafer-scale chips to turbocharge AI models on AWS
Amazon Web Services said Friday it will put processors from Cerebras inside its data centers under a multiyear partnership focused on AI inference. The deal gives Amazon a new way to speed up how AI models answer prompts, write code, and handle live user requests. AWS said it will use Cerebras technology, including the Wafer-Scale Engine, for inference tasks. The companies did not share the financial terms. The setup is planned for Amazon Bedroc…
Amazon, Cerebras Partner to bring advanced AI chips to AWS Cloud
Amazon.com and Cerebras Systems have announced a deal to combine their computing chips for a new AI service on Amazon Web Services (AWS). Cerebras, valued at $23.1 billion, is a startup creating alternative AI chips and previously signed a $10 billion deal with OpenAI. The big picture: The Cerebras chips will be installed in AWS data centers and connected with Amazon’s custom Trainium3 AI chips using Amazon’s networking technology. The collabor…
Coverage Details
Bias Distribution
- 60% of the sources are Center