Poe API
Kimi-K2-Instruct
Kimi K2 is a state-of-the-art mixture-of-experts (MoE) language model with 32 billion activated parameters and 1 trillion total parameters. Trained with the Muon optimizer, Kimi K2 achieves exceptional performance across frontier knowledge, reasoning, and coding tasks while being meticulously optimized for agentic capabilities. Uses the latest September 5th, 2025 snapshot. The updated version has improved coding abilities, agentic tool use, and a longer (256K) context window.
Powered by a server managed by @fireworksai.
Build with Kimi-K2-Instruct using the Poe API
Start by creating an API key, which works with any bot on Poe.
See the full documentation for comprehensive guidance on getting started.
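Once you have an API key, a call to the bot can be sketched as below. This is a minimal example assuming Poe's OpenAI-compatible chat-completions endpoint (`https://api.poe.com/v1/chat/completions`) and schema, and a `POE_API_KEY` environment variable; consult the Poe documentation for the authoritative endpoint and request format.

```python
import json
import os
import urllib.request

# Assumed endpoint for Poe's OpenAI-compatible API; verify against the docs.
POE_API_URL = "https://api.poe.com/v1/chat/completions"


def build_request(prompt: str, model: str = "Kimi-K2-Instruct") -> dict:
    """Build a chat-completion payload (schema assumed OpenAI-compatible)."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }


def ask_kimi(prompt: str) -> str:
    """Send a prompt to Kimi-K2-Instruct and return the reply text."""
    payload = build_request(prompt)
    req = urllib.request.Request(
        POE_API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            # API key created in your Poe account settings.
            "Authorization": f"Bearer {os.environ['POE_API_KEY']}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # Response shape assumed to follow the OpenAI chat-completions format.
    return body["choices"][0]["message"]["content"]
```

The same payload shape should work for any bot on Poe by changing the `model` field.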
More from Fireworks AI
GLM-5.1-FW
GLM-5-FWAI
Kimi-K2.5-FW
GLM-4.7-FW
Kimi-K2-Thinking-FW
MiniMax-M2-FW
OpenAI-GPT-OSS-20B
OpenAI-GPT-OSS-120B
Kimi-K2-Instruct
DeepSeek-V3.2-FW