Alibaba Admits Qwen3's Hybrid-Thinking Mode Was Dumb
ZHEJIANG, CHINA, JUL 30 – Alibaba's Qwen3-Coder, a 480-billion-parameter model built on a Mixture-of-Experts approach, is raising security concerns in the West because China's national intelligence law can require companies to grant the government access to data.
4 Articles
Alibaba, Zhipu roll out new AI models amid heated open-source race
Alibaba Group Holding and Zhipu AI have launched new open-source models as China’s rivalry with the US in artificial intelligence heats up. On Tuesday, Alibaba released Wan2.2, which it claimed was the industry’s “first open-source large video generation models incorporating the Mixture-of-Experts (MoE) architecture”. Alibaba owns the South China Morning Post. MoE is a machine-learning approach that divides an AI model into separate sub-networks…
Alibaba and Zhipu Launch Big Models; Focus on Coding and Finance
Major tech players Alibaba and Zhipu are making waves by unveiling their latest large-scale AI models, signaling a significant push into areas such as programming and finance. These releases highlight the growing competition among top tech firms to lead in artificial intelligence innovation. Alibaba has showcased its newest AI model, emphasizing its capabilities in complex programming tasks and data analysis, aiming to enhance software…
Alibaba’s AI coding tool raises security concerns in the West
Alibaba has released a new AI coding model called Qwen3-Coder, built to handle complex software tasks using a large open-source model. The tool is part of Alibaba’s Qwen3 family and is being promoted as the company’s most advanced coding agent to date. The model uses a Mixture of Experts (MoE) approach, activating 35 billion parameters out of a total 480 billion and supporting up to 256,000…
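The MoE design mentioned above means only a small slice of the model's parameters runs for any given token: a gating network scores a pool of expert sub-networks and dispatches each input to just the top few. A minimal sketch of that top-k routing (all names and shapes here are illustrative, not Qwen3-Coder's actual architecture):

```python
import numpy as np

def moe_forward(x, experts, gate_w, top_k=2):
    """Route input x to the top_k highest-scoring experts.

    x: (d,) input vector; experts: list of (d, d) matrices standing in
    for full expert networks; gate_w: (d, n_experts) gating weights.
    Only top_k experts execute, so most parameters stay inactive
    for any single token -- the source of MoE's efficiency.
    """
    scores = x @ gate_w                        # one score per expert
    chosen = np.argsort(scores)[-top_k:]       # indices of top_k experts
    weights = np.exp(scores[chosen])
    weights /= weights.sum()                   # softmax over chosen experts
    # Combine only the selected experts' outputs, weighted by the gate
    return sum(w * (x @ experts[i]) for w, i in zip(weights, chosen))

rng = np.random.default_rng(0)
d, n_experts = 8, 4
experts = [rng.standard_normal((d, d)) for _ in range(n_experts)]
gate_w = rng.standard_normal((d, n_experts))
x = rng.standard_normal(d)
y = moe_forward(x, experts, gate_w, top_k=2)
print(y.shape)
```

At Qwen3-Coder's reported scale, the same idea is what lets a 480-billion-parameter model activate only about 35 billion parameters per token.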
Coverage Details
Bias Distribution
- 100% of the sources are Center