Custom Model

Custom model endpoint with user-provided base URL.

Model ID: custom
Status: Stable
Context window: 0 tokens
Pricing: Free input tokens, free output tokens

Capabilities: Streaming, Vision, Tools, JSON Output


All Providers for Custom Model

LLM Gateway routes each request to the best provider able to handle your prompt size and parameters.
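Since the custom model is addressed through a user-provided base URL, a request to it can be sketched as below. This is a minimal illustration, not the gateway's documented API: the endpoint path (`chat/completions`), the payload shape, and the example base URL are all assumptions modeled on common OpenAI-compatible gateways; only the model ID `custom` and the streaming capability come from this page.

```python
# Hypothetical helper: build a chat-completion request for the "custom"
# model behind a user-provided base URL. The path and payload shape are
# assumptions (OpenAI-compatible style), not a documented contract.
from urllib.parse import urljoin

def build_request(base_url: str, prompt: str) -> dict:
    """Return the URL and JSON body for a chat request to a custom endpoint."""
    return {
        # Normalize the trailing slash so urljoin appends rather than replaces.
        "url": urljoin(base_url.rstrip("/") + "/", "chat/completions"),
        "json": {
            "model": "custom",  # the model ID from this page
            "stream": True,     # streaming is a listed capability
            "messages": [{"role": "user", "content": prompt}],
        },
    }

req = build_request("https://my-endpoint.example.com/v1", "Hello")
print(req["url"])  # https://my-endpoint.example.com/v1/chat/completions
```

The returned dict can be passed to any HTTP client; the gateway itself decides which provider ultimately serves the request.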

LLM Gateway
Context: —
Input: — /M tokens
Cached input: — /M tokens
Output: — /M tokens