MiniMax M2.7

MiniMax M2.7 offers stronger reasoning and coding performance for complex tasks.

Model ID: minimax-m2.7
Status: Stable
Context window: 204,800 tokens
Input: from $0.30/M tokens
Output: from $1.20/M tokens
Capabilities: Streaming, Tools, Reasoning, JSON Output


All Providers for MiniMax M2.7

The LLM Gateway routes each request to the best provider that can handle your prompt size and parameters.
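As a minimal sketch of calling the model through the gateway, the snippet below builds an OpenAI-compatible chat-completion request body using the model ID from this listing. The endpoint URL is a placeholder assumption, and the exact parameter names your gateway accepts may differ; the `stream` and `response_format` fields correspond to the Streaming and JSON Output capabilities advertised above.

```python
import json

# Hypothetical endpoint; substitute your gateway's actual base URL.
BASE_URL = "https://your-gateway.example/v1/chat/completions"  # assumption

# OpenAI-compatible request body. "minimax-m2.7" is the model ID
# from the listing; streaming and JSON output match the model's
# advertised capabilities.
payload = {
    "model": "minimax-m2.7",
    "messages": [
        {"role": "user", "content": "Summarize this function's behavior."},
    ],
    "stream": True,                               # Streaming
    "response_format": {"type": "json_object"},   # JSON Output
    "max_tokens": 1024,
}

print(json.dumps(payload, indent=2))
```

Send this body as JSON (with your gateway API key in the `Authorization` header) using any HTTP client; the gateway then selects a provider on your behalf.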

Provider | Context | Input          | Cached         | Output
MiniMax  | 204.8k  | $0.30/M tokens | $0.06/M tokens | $1.20/M tokens
NovitaAI | 204.8k  | $0.30/M tokens | $0.06/M tokens | $1.20/M tokens
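To make the per-million-token rates concrete, here is a small cost sketch at the listed prices ($0.30/M input, $0.06/M cached, $1.20/M output). It assumes cached tokens are billed at the cached rate instead of the input rate, which is the usual convention; check your provider's billing docs for the exact rules.

```python
# Listed rates in USD per million tokens.
RATES = {"input": 0.30, "cached": 0.06, "output": 1.20}

def cost_usd(input_tokens: int, cached_tokens: int, output_tokens: int) -> float:
    # Cached tokens replace input tokens at the cheaper cached rate (assumption).
    return (
        input_tokens * RATES["input"]
        + cached_tokens * RATES["cached"]
        + output_tokens * RATES["output"]
    ) / 1_000_000

# Example: 100k fresh input, 50k cached, 20k output.
print(f"${cost_usd(100_000, 50_000, 20_000):.4f}")  # → $0.0570
```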