Zyphra Releases ZAYA1-8B, a Reasoning Model Trained on AMD and Optimized for Maximum Intelligence Density per Parameter
17 Articles
Meet ZAYA1-8B, a super efficient open reasoning model trained on AMD Instinct MI300 GPUs
Even as leading AI providers like OpenAI and Anthropic battle over the compute to train and release ever larger, more powerful models, other labs are going in a different direction — pursuing the development of smaller, more efficient models and often open sourcing them. The latest worth paying attention to comes from the lesser-known Palo Alto startup Zyphra, which this week released its new reasoning, mixture-of-experts (MoE) language model, Z…
Zyphra Releases ZAYA1-8B, a Reasoning Model trained on AMD and Optimized for Maximum Intelligence Density per Parameter
ZAYA1-8B delivers reasoning, mathematics, and coding performance competitive with models many times larger, achieving high intelligence density with under one billion active parameters trained on full-stack AMD infrastructure.
Zyphra Releases ZAYA1-8B: A Reasoning MoE Trained on AMD Hardware That Punches Far Above Its Weight Class
Zyphra AI has released ZAYA1-8B, a small Mixture of Experts (MoE) language model with 760 million active parameters and 8.4 billion total parameters. Trained end-to-end on AMD hardware, the model outperforms open-weight models many times its size on math and coding benchmarks, and is now available under an Apache 2.0 license on Hugging Face and as a serverless endpoint on Zyphra Cloud. With under 1 billion active parameters, ZAYA1-8B achieves sc…
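The gap between 760 million active and 8.4 billion total parameters comes from the Mixture-of-Experts design: each token is routed to only a few experts, so most expert weights sit idle on any given forward pass. The toy layer below sketches that idea with top-k routing; the dimensions, expert count, and routing scheme are illustrative only and do not reflect ZAYA1-8B's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy MoE layer: route each token to the top_k of n_experts experts,
# so only a fraction of the expert parameters is "active" per token.
# All sizes here are made up for illustration.
d_model, n_experts, top_k = 16, 8, 2
experts = rng.standard_normal((n_experts, d_model, d_model))  # per-expert weights
router = rng.standard_normal((d_model, n_experts))            # routing weights

def moe_forward(x):
    """x: (tokens, d_model) -> (tokens, d_model) via top_k experts per token."""
    logits = x @ router
    top = np.argsort(logits, axis=-1)[:, -top_k:]        # chosen expert indices
    sel = np.take_along_axis(logits, top, axis=-1)
    gates = np.exp(sel - sel.max(-1, keepdims=True))     # softmax over chosen
    gates /= gates.sum(-1, keepdims=True)
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        for g, e in zip(gates[t], top[t]):
            out[t] += g * (x[t] @ experts[e])
    return out

x = rng.standard_normal((4, d_model))
y = moe_forward(x)
print(y.shape)              # (4, 16)
print(top_k / n_experts)    # fraction of expert parameters active per token: 0.25
```

In a real MoE, this routing is why "active parameters" (here 2 of 8 experts, 25%) can be a small fraction of total parameters, which is the efficiency claim behind ZAYA1-8B's sub-1B active footprint.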
Zyphra presented ZAYA1-8B, a new open MoE reasoning model trained on AMD infrastructure that, according to the company, competes with much larger systems on mathematics, code, and complex tasks thanks to a combination of architecture, reasoning-oriented pre-training, and a test-time compute method called Markovian RSA. *** ZAYA1-8B uses fewer than 1B active parameters and, according to Zyphra, exceeds several larger open models in…
Coverage Details
Bias Distribution
- 100% of the sources are Center