Ant Group Open-Sources Ling-2.6-Flash Model with Multiple Precision Options
4 Articles
April 29, 2026 — Ant Group’s BaiLing large model team has officially open-sourced Ling-2.6-Flash, offering multiple precision formats—including BF16, FP8, and INT4—to give developers flexibility across hardware environments, inference costs, and deployment needs. Ling-2.6-Flash is an instruction-tuned model with 104 billion total parameters and 7.4 billion activated parameters. Two weeks ago, it was quietly released on OpenRouter under the anony…
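The practical difference between the released precision formats is largely weight-memory footprint. As a rough sketch (my own arithmetic, not from the announcement), the 104-billion-parameter weight storage at each precision can be estimated as follows; this ignores activations, KV cache, and quantization metadata overhead:

```python
# Back-of-envelope weight-memory estimate for a 104B-parameter model
# at the precisions Ling-2.6-Flash reportedly ships in (BF16, FP8, INT4).
# Sketch only: ignores activations, KV cache, and quantization overhead.

BYTES_PER_PARAM = {"BF16": 2.0, "FP8": 1.0, "INT4": 0.5}

def weight_gib(total_params: float, precision: str) -> float:
    """Approximate weight storage in GiB for a given precision."""
    return total_params * BYTES_PER_PARAM[precision] / 2**30

TOTAL = 104e9  # total parameters reported for Ling-2.6-Flash
for precision in BYTES_PER_PARAM:
    print(f"{precision}: ~{weight_gib(TOTAL, precision):.0f} GiB")
# BF16 comes out near 194 GiB, FP8 near 97 GiB, INT4 near 48 GiB,
# which is why the lower-precision variants matter for single-node serving.
```

Note that per-token compute scales with the 7.4 billion activated parameters (the model is a mixture-of-experts design), so the quantization choice mainly trades off memory and bandwidth rather than activated compute.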
Ant Group open-sources Ling-2.6-flash, targeting agent workflows
Ant Group has open-sourced Ling-2.6-flash, offering multiple quantized versions to meet diverse hardware environments and enterprise deployment needs. The post Ant Group open-sources Ling-2.6-flash, targeting agent workflows appeared first on CnTechPost.
Read the original article at the following link: DeepSeek V4: The open AI model that competes with Claude and GPT at lower cost. DeepSeek-V4 once again puts pressure on the market for advanced models, because its case does not rest on performance alone. The family arrives with open weights, support for a 1-million-token context window, and a pricing structure that stays well below high-end proprietary models. The comparison is especially …
With DeepSeek-V4, the Chinese developer is launching an extremely low-cost AI model priced at only a fraction of GPT-5.5. On complex logical reasoning, however, the system still trails the frontrunners.