Qwen3-235B-A22B-Thinking-2507 is a powerful open-weight Mixture-of-Experts (MoE) model (235B total parameters, 22B active per token) built for deep reasoning. It achieves state-of-the-art results on agentic tasks, supports a 256K context window, and is available on Hugging Face and via API.
Qwen3 is the newest family of open-weight LLMs from Alibaba, spanning dense models from 0.6B parameters up to a 235B MoE. It features a switchable "Thinking Mode" that trades response speed for deeper reasoning, performs strongly on code and math, and is multilingual.