MiniMax M2

MiniMax M2 model with reasoning and tool support.

Model ID: minimax-m2
Status: Stable
Context: 196,608 tokens
Input: from $0.17/M tokens (30% off)
Output: from $0.70/M tokens (30% off)
Capabilities: Streaming, Tools, Reasoning, JSON Output

All Providers for MiniMax M2

LLM Gateway routes each request to the best provider that can handle your prompt size and parameters.

MiniMax
Context: 196.6k
Input: $0.20/M tokens
Cached input: $0.03/M tokens
Output: $1.00/M tokens

CanopyWave (30% off)
Deactivated since Jan 1, 2026
Context: 196.6k
Input: $0.25 -> $0.175/M tokens
Cached input: not listed
Output: $1.00 -> $0.70/M tokens
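As a rough illustration of how per-million-token pricing works, the sketch below estimates the cost of a single request from the rates listed above. The `request_cost` helper is hypothetical (not part of any gateway SDK), and actual billing rules, rounding, and cache accounting may differ by provider.

```python
def request_cost(input_tokens: int, output_tokens: int,
                 input_rate: float, output_rate: float,
                 cached_tokens: int = 0, cached_rate: float = 0.0) -> float:
    """Estimate request cost in USD from per-million-token rates.

    Cached input tokens are billed at the (cheaper) cached rate;
    the remaining input tokens at the full input rate.
    """
    return ((input_tokens - cached_tokens) * input_rate
            + cached_tokens * cached_rate
            + output_tokens * output_rate) / 1_000_000

# MiniMax provider rates from the listing above:
# input $0.20/M, cached $0.03/M, output $1.00/M
cost = request_cost(100_000, 20_000, input_rate=0.20, output_rate=1.00,
                    cached_tokens=50_000, cached_rate=0.03)
print(f"${cost:.4f}")  # -> $0.0315

# CanopyWave's discounted rates are 30% off list price:
# 0.25 * 0.7 = 0.175 (input), 1.00 * 0.7 = 0.70 (output)
```

Note the cached-input rate matters: in the example above, caching half the prompt cuts the input portion of the bill from $0.0200 to $0.0115.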