Hunyuan-A13B

Powerful MoE model, lightweight & open


Hunyuan-A13B is Tencent's new open-source MoE model (13B active parameters). It delivers top-tier performance with low computational cost, supports a 256K context window, and includes a thinking mode.


Zac Zuo

Hi everyone!

Tencent has a new addition to their Hunyuan family, an open-source model called Hunyuan-A13B. It's aimed at solving a big problem: the high cost of deploying powerful AI.

It’s a Mixture-of-Experts (MoE) model with 13B active parameters (out of roughly 80B total), which makes it very efficient at inference: only a fraction of the network runs for each token. You can run it on a single mid-range GPU, which is great news for individual developers and smaller teams. And it still keeps up with many bigger models, plus it has a 256K context window and a thinking mode.
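To see why the 13B-active figure matters, here is a back-of-the-envelope sketch. It assumes the commonly cited ~80B-total / 13B-active split and the standard "~2 × active parameters" rule of thumb for forward-pass FLOPs per token, both approximations rather than official benchmarks:

```python
# Rough compute comparison: MoE (13B active) vs. a dense model of the same total size.
# Assumptions: ~80B total parameters, and the common approximation that a
# forward pass costs about 2 FLOPs per active parameter per token.

def flops_per_token(active_params: float) -> float:
    """Approximate forward-pass FLOPs per generated token (~2 x active params)."""
    return 2 * active_params

moe_active = 13e9    # Hunyuan-A13B: 13B parameters active per token
dense_total = 80e9   # a hypothetical dense model at the same total size

print(f"MoE   : {flops_per_token(moe_active) / 1e9:.0f} GFLOPs/token")
print(f"Dense : {flops_per_token(dense_total) / 1e9:.0f} GFLOPs/token")
print(f"Compute saving: ~{dense_total / moe_active:.1f}x per token")
```

Note this only covers compute: all experts still have to sit in memory, which is why quantized weights matter for fitting the model on a single GPU.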

With the release of A13B, the Hunyuan LLM lineup is now very clear: Hunyuan-TurboS for speed (fast thinking), Hunyuan-T1 for depth (slow thinking), and now Hunyuan-A13B for hybrid reasoning, released as open source.

It's cool to see Tencent pushing hard on LLMs, especially since they're already so good at multimodal stuff (their Hunyuan3D is phenomenal!). More competition like this is always a good thing for builders and developers.