Published 1 month ago

Starting the Era of 1-bit LLMs - With Microsoft Research

Summary by Ground News
The work was done by researchers at Microsoft Research and the Chinese Academy of Sciences. They trained a BitNet b1.58 model on 2T (2 trillion) tokens following the data recipe of StableLM-3B. The model not only used up to 7 times less memory but was also up to 4 times faster in latency.
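The "1.58-bit" name comes from restricting every weight to one of three values {-1, 0, 1}, which takes log2(3) ≈ 1.58 bits to store instead of the 16 bits of an FP16 weight; this is where the memory and latency savings originate. The BitNet b1.58 paper quantizes weights with an absmean scheme (scale by the mean absolute weight, then round and clip). Below is a minimal NumPy sketch of that idea; the function name and example matrix are illustrative, not from the paper:

```python
import numpy as np

def absmean_ternary_quantize(W, eps=1e-6):
    """Quantize a weight matrix to {-1, 0, 1} via absmean scaling.

    A sketch of the BitNet b1.58 scheme: divide by the mean absolute
    weight (gamma), then round to the nearest integer and clip to [-1, 1].
    """
    gamma = np.abs(W).mean()
    return np.clip(np.round(W / (gamma + eps)), -1, 1)

# Illustrative full-precision weights (hypothetical values).
W = np.array([[0.9, -0.05, -1.2],
              [0.3, -0.7,  0.02]])
Wq = absmean_ternary_quantize(W)
print(Wq)  # every entry is now -1, 0, or 1
```

Because the quantized weights are ternary, matrix multiplication reduces to additions and subtractions (no floating-point multiplies), which is the source of the reported latency gains.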
