DeepSeek-V3.1
DeepSeek-V3.1 is post-trained on top of DeepSeek-V3.1-Base, which is built upon the original V3 base checkpoint through a two-phase long-context extension approach, following the methodology outlined in the original DeepSeek-V3 report. We have expanded our dataset by collecting additional long documents and substantially extending both training phases: the 32K extension phase has been increased 10-fold to 630B tokens, while the 128K extension phase has been extended 3.3x to 209B tokens. Additionally, DeepSeek-V3.1 is trained using the UE8M0 FP8 scale data format to ensure compatibility with microscaling data formats.
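To make the scale format concrete, here is a minimal illustrative sketch of UE8M0 encoding, assuming it matches the E8M0 scale element of the OCP Microscaling (MX) spec: 8 unsigned exponent bits, no mantissa, no sign, so each code represents a power of two. The function names and the clamping choice are ours, not DeepSeek's.

```python
import math

# Assumed UE8M0 layout (per the OCP MX E8M0 scale format): code e
# represents the power of two 2**(e - 127); code 255 is reserved for NaN.
BIAS = 127

def encode_ue8m0(scale: float) -> int:
    """Round a positive scale to the nearest representable power of two."""
    if scale <= 0:
        raise ValueError("UE8M0 encodes positive scales only")
    e = round(math.log2(scale)) + BIAS
    return max(0, min(254, e))  # clamp into the finite code range

def decode_ue8m0(code: int) -> float:
    """Recover the power-of-two scale from an 8-bit code."""
    return 2.0 ** (code - BIAS)
```

Because scales are restricted to powers of two, applying or removing a scale is an exponent adjustment rather than a full multiply, which is what makes the format cheap to handle in low-precision kernels.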
Powered by a server managed by @fireworksai.
Build with DeepSeek-V3.1 using the Poe API
Start by creating an API key, which works with any bot on Poe.
See the full documentation for comprehensive guidance on getting started.
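Once you have an API key, a request can be sketched as below. This assumes Poe exposes an OpenAI-compatible chat-completions endpoint at `https://api.poe.com/v1`; the URL, payload shape, and the `POE_API_KEY` environment variable are assumptions to verify against the official documentation.

```python
import json
import os
import urllib.request

# Assumed endpoint; confirm against the Poe API documentation.
POE_API_URL = "https://api.poe.com/v1/chat/completions"

def build_request(prompt: str, api_key: str) -> urllib.request.Request:
    """Build a chat-completion request addressed to the DeepSeek-V3.1 bot."""
    payload = {
        "model": "DeepSeek-V3.1",
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        POE_API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )

if __name__ == "__main__":
    api_key = os.environ.get("POE_API_KEY")
    if api_key:
        with urllib.request.urlopen(build_request("Hello!", api_key)) as resp:
            body = json.loads(resp.read())
            print(body["choices"][0]["message"]["content"])
```

The same key works for every bot on Poe, so switching models is just a matter of changing the `model` field.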
More from Fireworks AI
- Qwen3.5-397B-FW (new)
- Minimax-M2.7-FW (new)
- Gemma-4-31B-FW (new)
- GLM-5.1-FW
- GLM-5-FWAI
- Kimi-K2.5-FW
- GLM-4.7-FW
- Kimi-K2-Thinking-FW
- MiniMax-M2-FW
- OpenAI-GPT-OSS-20B