Ministral 14B

The largest Ministral model, optimized for edge and local deployment.

ministral-14b-2512
Status: Stable
Context: 262,144 tokens
Input: starting at $0.20/M tokens
Output: starting at $0.20/M tokens
Features: Streaming, Vision, JSON Output
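The listed rates are flat per-million-token prices, so the cost of a request is simple arithmetic. A minimal sketch, using the $0.20/M input and output rates from this page (the function name and defaults are illustrative, not part of any gateway SDK):

```python
# Hypothetical helper: estimate USD cost of one request at the listed rates.
# Rates are expressed in USD per million tokens, as on this page.
def estimate_cost(input_tokens: int, output_tokens: int,
                  input_rate: float = 0.20, output_rate: float = 0.20) -> float:
    return (input_tokens * input_rate + output_tokens * output_rate) / 1_000_000

# Example: a 100k-token prompt with a 10k-token completion.
print(round(estimate_cost(100_000, 10_000), 6))  # 0.022
```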


All Providers for Ministral 14B

LLM Gateway routes each request to the best provider that can handle your prompt size and parameters.
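A request addressed to this model is routed by model id. As a sketch, assuming the gateway exposes the common OpenAI-compatible chat-completions request shape (the field names other than the model id are that convention's, not confirmed by this page), a body enabling the Streaming and JSON Output features listed above might look like:

```python
import json

# Only "ministral-14b-2512" comes from this page; the surrounding request
# shape is the widely used chat-completions convention, assumed here.
payload = {
    "model": "ministral-14b-2512",
    "messages": [
        {"role": "user",
         "content": "Reply with a JSON object naming three edge-deployment use cases."}
    ],
    "stream": True,                               # Streaming feature
    "response_format": {"type": "json_object"},   # JSON Output feature
}

body = json.dumps(payload)  # serialized request body to POST to the gateway
```

The endpoint URL and authentication header are gateway-specific and omitted here.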

Mistral AI
  Context: 262.1k
  Input:  $0.20/M tokens
  Cached: /M tokens
  Output: $0.20/M tokens