Kimi's K2 Open-Source Language Model Supports Dynamic Resource Availability and New Optimizer
Kimi released K2, a Mixture-of-Experts large language model with 32 billion activated parameters and 1.04 trillion total parameters, trained on 15.5 trillion tokens. The release introduces MuonClip, a new optimizer that extends the Muon optimizer with a QK-clip technique designed to address training instability; the team reports "zero loss spike" during pre-training.
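The core idea behind QK-clip, as described, is to cap the growth of attention logits: after an optimizer step, if a head's largest pre-softmax logit exceeds a threshold, the query and key projection weights are rescaled so the logits shrink back under that threshold. The sketch below is illustrative only, not Kimi's actual implementation; the function name `qk_clip`, the threshold value, and the even split of the rescaling factor between the query and key weights are assumptions for demonstration.

```python
import numpy as np

def qk_clip(W_q, W_k, X, tau=100.0):
    """Illustrative sketch of a QK-clip-style rescaling (not Kimi's code).

    If the largest attention logit for this head exceeds tau, shrink
    W_q and W_k so the logits fall back to the threshold, since logits
    scale linearly in each of the two weight matrices.
    """
    d = W_q.shape[1]                       # per-head dimension
    Q, K = X @ W_q, X @ W_k                # query/key projections
    logits = (Q @ K.T) / np.sqrt(d)        # scaled dot-product logits
    s_max = np.abs(logits).max()           # largest pre-softmax logit
    if s_max > tau:
        gamma = tau / s_max                # target shrink factor for logits
        W_q = W_q * np.sqrt(gamma)         # split the factor evenly between
        W_k = W_k * np.sqrt(gamma)         # the query and key weights
    return W_q, W_k
```

Because the logits are bilinear in `W_q` and `W_k`, multiplying each by `sqrt(gamma)` scales every logit by exactly `gamma`, bringing the maximum down to `tau` without changing the relative attention pattern within the head.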