Huawei Cloud Unit Claims AI Model Breakthrough with New Training Method
6 Articles
Huawei's cloud division said its Pangu large language model achieved a breakthrough in training architecture with a new "Mixture of Grouped Experts" (MoGE) technique that it says outperforms competing methods in efficiency and resource allocation.
In terms of smart, sporty electronic devices, Huawei's are unparalleled. Its operating system is constantly improving, and its updated sensors and algorithms measure more than 60 health and fitness parameters. The latest addition to the family is the wearable Huawei Watch 5, which enters directly into the segment [...] The post The intelligent revolution appeared first on Forbes España.
César Funes, vice president of public affairs for Latin America at Huawei, said in an interview with DPL News that the company had foreseen the arrival of Artificial Intelligence (AI) in networks at least five or six years ago, anticipating that it could be used to automate operations under the concept of the Autonomous Driving Network. Accordingly, the company focuses on two aspects: improving the efficiency of resource use…
Huawei continues to advance the development of artificial intelligence in El Salvador with the presentation of customized digital avatars capable of operating as virtual assistants in environments such as universities, airports, shops and libraries. The technology company demonstrated how these digital agents, powered by Huawei Cloud, can adapt to specific needs [...] The post Huawei presents in El Salvador virtual assistants with …
Huawei adopts hybrid approach in bid to outperform rival LLMs
A Huawei AI team claimed in a research paper that its latest large language model (LLM) uses a new hybrid technique, powered by its in-house designed AI chips, to improve on the training method first used by DeepSeek. The team developed what it calls a mixture of grouped experts (MoGE), a hybrid approach intended to overcome limitations of the mixture of experts (MoE) technique. The latter was adopted by DeepSeek to develop models using lower-cost chips…
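The core idea reported for MoGE is that experts are partitioned into groups and each token activates an equal number of experts per group, so the computational load is balanced across device groups by construction rather than by an auxiliary loss. The paper's actual routing is not reproduced here; the following is a minimal illustrative sketch, assuming a hypothetical per-token router that does top-k selection within each group (function name, shapes, and parameters are all assumptions for illustration):

```python
import numpy as np

def grouped_topk_route(scores, num_groups, k_per_group):
    """Illustrative grouped top-k routing (not Huawei's implementation).

    Experts are split into equal-sized groups; each token activates
    exactly k_per_group experts in every group, so every group carries
    the same load by construction.

    scores: (num_experts,) router logits for a single token.
    Returns the global indices of the selected experts, sorted.
    """
    num_experts = scores.shape[0]
    group_size = num_experts // num_groups
    selected = []
    for g in range(num_groups):
        # Slice out this group's logits.
        group = scores[g * group_size:(g + 1) * group_size]
        # Top-k within the group only.
        top = np.argsort(group)[-k_per_group:]
        # Map local indices back to global expert ids.
        selected.extend(int(i) + g * group_size for i in top)
    return sorted(selected)

# Example: 8 experts in 4 groups, 1 expert activated per group.
scores = np.array([0.1, 0.9, 0.3, 0.2, 0.8, 0.4, 0.05, 0.6])
print(grouped_topk_route(scores, num_groups=4, k_per_group=1))  # → [1, 2, 4, 7]
```

Contrast this with plain top-k MoE, where the same token could route all of its active experts to one group, leaving other devices idle; that load imbalance is the limitation the grouped scheme is reported to address.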
Coverage Details
Bias Distribution
- 100% of the sources are Center