The gateway accepts three ways to reference a model in the model field of a request. Which to use depends on what your virtual key (VK) is bound to, and whether you want a redirect layer in the middle.
- Bare: "model": "gpt-5-mini". Preferred for OpenAI-SDK drop-in and single-provider VKs.
- Prefixed: "model": "openai/gpt-5-mini". Disambiguator for multi-provider VKs.
- Alias: "model": "coding-small". Per-VK redirect to any provider/model. See Model Aliases.

Bare model (preferred)
- Your VK is bound to one provider.
- You’re migrating an existing OpenAI/Anthropic SDK codebase and want zero code changes.
- You’re writing documentation or examples for broad audiences.
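For example, an existing OpenAI-style request body needs no changes under a single-provider VK; only the endpoint the client points at moves to the gateway. A minimal sketch (the message content is illustrative):

```python
# An OpenAI-style chat body; under a single-provider VK the bare model name
# from the existing codebase is kept verbatim: zero code changes.
body = {
    "model": "gpt-5-mini",  # bare form, resolved by the VK's single provider
    "messages": [{"role": "user", "content": "hello"}],
}

# On a multi-provider VK the only edit is prefixing the model string:
multi_body = {**body, "model": "openai/gpt-5-mini"}
```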
Prefixed model (provider/model)
- Your VK is bound to multiple providers AND more than one offers the same model name (e.g. both OpenAI and Azure OpenAI expose gpt-5-mini).
- You want readers of your code to see at a glance which provider a call is targeting.
- You’re porting from a gateway (litellm, OpenRouter) that required this form.
Recognized provider prefixes: openai, anthropic, azure, bedrock, vertex, gemini, groq, grok, ollama, openrouter, plus any custom OpenAI-compatible provider name you configured.
Alias

A per-VK name that redirects to any provider/model pair. Aliases decouple client code from provider specifics: you can repoint coding-small from OpenAI to Bedrock in a single VK edit without touching any caller.
See Model Aliases for the full story, including the :fallback suffix for cross-provider remap and the GET /v1/models discovery endpoint.
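That decoupling can be sketched as a plain mapping edit. The table shape below is illustrative, not the gateway's storage format, and the Bedrock model id is hypothetical:

```python
# Per-VK alias table: client-facing name -> provider/model target.
aliases = {"coding-small": "openai/gpt-5-mini"}

def target(alias_map: dict, name: str) -> str:
    """Look up the provider/model pair an alias currently points at."""
    return alias_map[name]

# Repointing is a single server-side edit; every caller keeps sending
# "coding-small" unchanged. (Model id below is hypothetical.)
aliases["coding-small"] = "bedrock/claude-haiku-4-5-20251001"
```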
How resolution works
The gateway resolves the model field in constant time on the hot path: no try-the-bare-form-then-fall-back-to-the-prefixed-form retry, no error branch per request. Whichever form you send, one deterministic O(1) resolution step picks the right route.
Concretely: on every request the resolver inspects the raw model string once. If it contains a /, the prefix is peeled off and checked against the VK’s bound providers; the bare segment then identifies the upstream model. If there’s no /, the bare string is looked up in the VK’s alias map (if present) and then against the VK’s bound-provider models. The outgoing request body’s model field is rewritten to the bare form before dispatch, so the raw-body passthrough never ships openai/gpt-5-mini to OpenAI (upstreams reject a prefixed name as an invalid model ID).
| Request model | Resolved route |
|---|---|
| gpt-5-mini | openai + gpt-5-mini |
| openai/gpt-5-mini | openai + gpt-5-mini (prefix stripped before upstream) |
| claude-haiku-4-5-20251001 | anthropic + claude-haiku-4-5-20251001 |
| anthropic/claude-haiku-4-5-20251001 | anthropic + claude-haiku-4-5-20251001 (prefix stripped before upstream) |
| coding-small | openai + gpt-5-mini (via alias) |
| anything else | 400 bad_request with an enriched error (below) |
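The resolution rules above can be sketched in a few lines of Python. The data shapes here are illustrative assumptions, not the gateway's actual internals; in particular, bare_index stands in for whatever the gateway precomputes at VK save time so the bare-name lookup is a single dict hit:

```python
# Illustrative VK state: bound providers, alias map, and a precomputed
# bare-name index (collisions are rejected at save time, so it is unambiguous).
vk = {
    "providers": {
        "openai": {"gpt-5-mini"},
        "anthropic": {"claude-haiku-4-5-20251001"},
    },
    "aliases": {"coding-small": ("openai", "gpt-5-mini")},
    "bare_index": {
        "gpt-5-mini": "openai",
        "claude-haiku-4-5-20251001": "anthropic",
    },
}

def resolve(vk: dict, model: str) -> tuple:
    """One deterministic pass over the raw model string: no retry loop."""
    if "/" in model:
        provider, bare = model.split("/", 1)        # peel the prefix once
        if bare in vk["providers"].get(provider, ()):
            return provider, bare                   # upstream sees the bare form
    else:
        if model in vk["aliases"]:                  # alias map first
            return vk["aliases"][model]
        if model in vk["bare_index"]:               # then bound-provider models
            return vk["bare_index"][model], model
    raise LookupError(f"unknown model: {model}")    # surfaces as 400 bad_request
```

Every path is a constant number of dict and set lookups, which is why accepting both forms costs nothing on the hot path.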
Ambiguity — rejected at save time
If a VK is bound to multiple providers that expose the same model name (e.g. both OpenAI and Azure OpenAI offer gpt-5-mini), the VK cannot be saved without a disambiguating alias. The create/update dialog surfaces an inline error:

gpt-5-mini is provided by multiple bound providers on this VK (openai, azure). Define a model alias that pins the bare name to one provider, or remove one of the duplicate providers.

Rejecting at save time, not at request time, preserves the runtime contract: every bare name on an active VK resolves unambiguously in a single O(1) lookup. The hot path never has to branch on collision. Prefixed forms on the same VK always resolve unambiguously, so "model": "openai/gpt-5-mini" and "model": "azure/gpt-5-mini" always work alongside whatever alias you defined for the bare form.
Model not found
If the model string matches no map key, the gateway returns a 400 bad_request with an enriched error.
Decision tree
Multi-provider VK, unique model names
Use bare. The map resolves each unique name to the right provider — no ambiguity.
Multi-provider VK, overlapping model names
Use prefixed.
openai/gpt-5-mini vs azure/gpt-5-mini disambiguates. Or define aliases that pin each name to a specific provider.

You want a stable client-facing name that can be repointed
Use an alias. Edit the VK once, zero caller changes. See Model Aliases.
Cross-provider fallback (primary dies, secondary takes over)
Alias with :fallback suffix. "coding-small": "openai/gpt-5-mini" + "coding-small:fallback": "anthropic/claude-haiku-4-5-20251001". See Model Aliases → :fallback suffix.

Why not just pick one convention?
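The :fallback pairing can be sketched as follows. The alias-map shape is illustrative, and primary_healthy stands in for the gateway's real failover signal (see Model Aliases for the actual behavior):

```python
ALIASES = {
    "coding-small": "openai/gpt-5-mini",
    "coding-small:fallback": "anthropic/claude-haiku-4-5-20251001",
}

def pick_route(aliases: dict, name: str, primary_healthy: bool) -> str:
    """Use the primary alias target; remap cross-provider when it fails."""
    if primary_healthy:
        return aliases[name]
    # Fall back to the :fallback entry if one exists, else stay on primary.
    return aliases.get(name + ":fallback", aliases[name])
```

Callers keep sending "coding-small" in both cases; the remap happens entirely inside the gateway.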
Most gateways in the ecosystem (litellm, OpenRouter, Portkey) use provider/model. Most SDK codebases use the bare name. Requiring either one would create friction for the other population.
Accepting both — with a single O(1) resolution step, not a retry loop — lets you write whichever is natural for your code and the gateway figures it out at zero hot-path cost.