
VAETKI 112B-A10B (MoE)

Hugging Face · January 06, 2026 · NC-AI-consortium-VAETKI/VAETKI

VAETKI is a large multilingual language model released by the NC-AI consortium (a collaboration across 13 organizations). The headline spec is a Mixture-of-Experts setup: ~112B total parameters with ~10B active per token, which is a common tradeoff for getting “big model” capacity without paying full dense-model compute at inference time.
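To make the active-parameter idea concrete, here is a minimal top-k routing sketch in PyTorch. It is illustrative only, not VAETKI’s actual layer: with 8 experts and top_k=2, each token touches a quarter of the expert parameters, which is the same mechanism that lets a ~112B-parameter model run only ~10B per token.

```python
# Minimal sketch of top-k Mixture-of-Experts routing (illustrative only,
# not VAETKI's actual architecture). With many experts but a small top_k,
# only a fraction of total parameters runs for any given token.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyMoELayer(nn.Module):
    def __init__(self, d_model=64, d_ff=256, n_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, n_experts)  # token -> expert logits
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x):  # x: (tokens, d_model)
        weights, idx = self.router(x).topk(self.top_k, dim=-1)  # pick k experts per token
        weights = F.softmax(weights, dim=-1)                    # normalize the k gate weights
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, slot] == e                        # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[mask, slot].unsqueeze(-1) * expert(x[mask])
        return out

layer = TinyMoELayer()
print(layer(torch.randn(4, 64)).shape)  # torch.Size([4, 64])
```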

It’s also unusually explicit about being built for both research and real deployment. The model is published under the MIT license, supports Korean, English, Chinese, and Japanese, and advertises a 32k context window. The README also describes a “thinking mode” vs. “non-thinking mode” behavior, with non-thinking recommended for tool-agent style tasks. If you’re evaluating open LLMs for agent scaffolding or multilingual apps, it’s worth a look simply because it ships with clearer architecture details than most newly uploaded repos.
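If the repo follows current transformers conventions, toggling between the two modes likely happens through the chat template. The sketch below is an assumption, not confirmed usage: the repo id comes from the byline above, and the `enable_thinking` kwarg mirrors how other recent releases expose this switch; the README is the authority on the actual interface.

```python
# Hypothetical usage sketch: the `enable_thinking` kwarg is an assumption
# based on conventions in other recent releases, not confirmed by VAETKI's
# README. Verify against the model card before relying on it.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "NC-AI-consortium-VAETKI/VAETKI"  # from the listing above
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id, torch_dtype="auto", device_map="auto")

messages = [{"role": "user", "content": "Summarize MoE routing in one sentence."}]
# Extra kwargs to apply_chat_template are passed through to the Jinja template;
# some models use a flag like this to toggle thinking behavior.
inputs = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    enable_thinking=False,  # non-thinking mode, which the README pairs with tool-agent tasks
    return_tensors="pt",
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0, inputs.shape[-1]:], skip_special_tokens=True))
```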

Quick stats from the listing feed: pipeline: text-generation · 41 likes · 2614 downloads.
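Those numbers go stale quickly; the `huggingface_hub` client can pull live stats for the same repo (repo id taken from the byline above):

```python
# Fetch current stats for the repo instead of trusting a cached listing.
from huggingface_hub import HfApi

info = HfApi().model_info("NC-AI-consortium-VAETKI/VAETKI")
print(info.pipeline_tag, info.likes, info.downloads)
```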

View on Hugging Face

Source listing: https://huggingface.co/models?sort=modified