Ministral 3B

Smallest Ministral model for efficient edge deployment.

Model ID: ministral-3b-2512 (STABLE)
Context window: 131,072 tokens
Pricing: starting at $0.10/M input tokens, $0.10/M output tokens
Capabilities: Streaming, Vision, JSON Output
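Since input and output are both billed at the same starting rate of $0.10 per million tokens, per-request cost is simple arithmetic. A minimal sketch (the helper below is illustrative, not part of any SDK):

```python
PRICE_PER_M_TOKENS = 0.10  # USD per million tokens (same starting rate for input and output)

def estimate_cost_usd(input_tokens: int, output_tokens: int) -> float:
    """Estimated USD cost of one request at the listed starting price."""
    return (input_tokens + output_tokens) * PRICE_PER_M_TOKENS / 1_000_000

# Example: a full 131,072-token context plus a 1,024-token completion.
print(f"${estimate_cost_usd(131_072, 1_024):.4f}")  # prints $0.0132
```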


All Providers for Ministral 3B

LLM Gateway routes each request to the providers best able to handle your prompt size and parameters.
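As a sketch of what a request routed through the gateway might look like, assuming an OpenAI-style chat-completions request body (the field names and endpoint shape are assumptions, not confirmed by this page; check the gateway's API docs):

```python
import json

# Hypothetical request body for ministral-3b-2512; the gateway selects the provider.
payload = {
    "model": "ministral-3b-2512",
    "messages": [
        {"role": "user", "content": "Return the device status as JSON."},
    ],
    "response_format": {"type": "json_object"},  # listed capability: JSON Output
    "stream": False,  # set True to use the listed Streaming capability
    "max_tokens": 256,
}

print(json.dumps(payload)[:40])
```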

Provider     Context   Input            Cached   Output
Mistral AI   131.1k    $0.10/M tokens   —        $0.10/M tokens