You’re already calling OpenAI, Anthropic, Bedrock, etc. from your application. You want to add governance — budgets, fallback, policy rules, per-engineer attribution — without rewriting everything. This cookbook shows the minimum-effort migration.
Before
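A sketch of the typical pre-migration call, assuming the official OpenAI Python SDK and its standard env vars:

```python
# Direct-to-provider: the SDK reads OPENAI_API_KEY and talks to api.openai.com.
from openai import OpenAI

client = OpenAI()

resp = client.chat.completions.create(
    model="gpt-5-mini",
    messages=[{"role": "user", "content": "Hello"}],
)
print(resp.choices[0].message.content)
```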
After
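The same code after migration — the call site doesn’t change; only the env vars from Step 3 point it at the gateway. The gateway URL shown in the comment is illustrative, not the documented endpoint:

```python
# Identical code. The SDK also reads OPENAI_BASE_URL, so after Step 3:
#   OPENAI_BASE_URL=https://gateway.langwatch.ai/v1   <- assumed URL; copy yours from the UI
#   OPENAI_API_KEY=$LANGWATCH_VK                      <- now a LangWatch virtual key
from openai import OpenAI

client = OpenAI()

resp = client.chat.completions.create(
    model="gpt-5-mini",
    messages=[{"role": "user", "content": "Hello"}],
)
print(resp.choices[0].message.content)
```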
Pre-migration checklist
- LangWatch project provisioned (you already have one for tracing — reuse it).
- API token with the gatewayProviders:create + virtualKeys:create scopes. Mint one in the LangWatch UI → Settings → API Tokens.
- Access to the upstream provider accounts (you’ll rebind their existing keys inside LangWatch).
Step 1 — Register the providers you already use
For each provider (OpenAI / Anthropic / Bedrock / Azure / Vertex / Gemini), create a gateway provider binding that wraps the same upstream key you’re already using (sketch below). Record the returned gpc_* ids — you need them in the next step.
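A hypothetical CLI shape — the subcommand and flags below are illustrative, so check `langwatch --help` for the real ones before running:

```bash
# One binding per provider; repeat for anthropic, bedrock, etc.
langwatch gateway-providers create \
  --provider openai \
  --model-provider mp_your_provider_row \
  --name openai-prod
# Output includes a gpc_* id — save it for Step 2.
```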
If you’ve never configured providers in LangWatch before, do it first in the UI (Settings → Model Providers). The CLI above needs the mp_* id of an already-configured provider row.
Step 2 — Mint your first virtual key
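Bind the VK to the gpc_* id from Step 1. The doc’s CLI has a virtual-keys subcommand; the flag names below are assumptions — `langwatch virtual-keys create --help` has the real ones:

```bash
langwatch virtual-keys create \
  --name alice-dev \
  --gateway-provider gpc_from_step_1 \
  --models-allowed gpt-5-mini
# The key is printed exactly once; store it as LANGWATCH_VK in your secret manager.
```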
Step 3 — Flip the env vars in your app
Dev/staging first:
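A sketch of the flip, assuming your app uses the official SDKs’ standard env vars. The gateway URL is illustrative — copy the real one from the LangWatch UI:

```bash
# OpenAI SDK
export OPENAI_BASE_URL="https://gateway.langwatch.ai/v1"    # assumed URL
export OPENAI_API_KEY="$LANGWATCH_VK"

# Anthropic SDK, if you use it
export ANTHROPIC_BASE_URL="https://gateway.langwatch.ai/v1" # assumed URL
export ANTHROPIC_API_KEY="$LANGWATCH_VK"
```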
Verify the migration is live
Every gateway response carries X-LangWatch-Request-Id, X-LangWatch-Trace-Id, and X-LangWatch-Span-Id headers. Paste the request id into the LangWatch search bar — the full trace is already there.
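One way to surface those headers, assuming an OpenAI-compatible chat endpoint at the gateway URL above; `-si` prints the response headers:

```bash
curl -si "$OPENAI_BASE_URL/chat/completions" \
  -H "Authorization: Bearer $LANGWATCH_VK" \
  -H "Content-Type: application/json" \
  -d '{"model":"gpt-5-mini","messages":[{"role":"user","content":"ping"}]}' \
  | grep -i '^x-langwatch'
```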
Step 4 — Add your first policy
The whole point of the migration. Pick the policy that maps to a real pain you have today.
Hard cap on engineering spend
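A sketch of a spend cap on a VK. The field names are illustrative, not the documented schema — map them onto whatever the VK create/edit payload actually accepts:

```json
{
  "budget": {
    "limit_usd": 200,
    "period": "monthly",
    "on_exceeded": "block"
  }
}
```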
Automatic failover when OpenAI is flaky
Edit the VK via the UI (Gateway → Virtual Keys → edit) once the VK edit drawer is live; until then, pass the model_aliases JSON on create:
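model_aliases is this doc’s own knob; the nested shape below is an assumption — verify it against the create payload reference:

```json
{
  "model_aliases": {
    "gpt-5": { "fallbacks": ["gpt-5-mini"] }
  }
}
```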
When the primary model errors, the gateway transparently retries the request against gpt-5-mini.
Block destructive tools on engineer VKs
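A hypothetical rule shape — the field name is illustrative; only the 403 tool_not_allowed behavior described below is from this doc:

```json
{
  "tools_blocked": ["shell.exec"]
}
```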
Tool calls like shell.exec("rm -rf ~") get a 403 tool_not_allowed before they reach the model — the model never generates the destructive call path in the first place.
Step 5 — Wire trace propagation
If you were already using LangWatch SDKs for tracing, propagate the trace id into the gateway so requests nest under your existing trace (no double cost attribution):
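A sketch assuming the LangWatch Python SDK’s @langwatch.trace() decorator. Treat get_current_trace() / trace_id, and the use of X-LangWatch-Trace-Id as a request header, as assumptions to verify against the SDK integration page:

```python
import os

import langwatch
from openai import OpenAI


@langwatch.trace()  # opens a LangWatch trace for this unit of work
def answer(question: str) -> str:
    trace = langwatch.get_current_trace()  # assumed accessor — verify in SDK docs
    client = OpenAI(
        base_url=os.environ["OPENAI_BASE_URL"],
        api_key=os.environ["LANGWATCH_VK"],
        # Assumed: the gateway accepts the trace id as a request header,
        # so its spans nest under the SDK trace instead of starting a new one.
        default_headers={"X-LangWatch-Trace-Id": trace.trace_id},
    )
    resp = client.chat.completions.create(
        model="gpt-5-mini",
        messages=[{"role": "user", "content": question}],
    )
    return resp.choices[0].message.content
```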
Step 6 — Rotate the original upstream keys
After a week of running through the gateway, the original upstream keys (sk-proj-..., sk-ant-...) should no longer be used by any application. Rotate them in the provider console — this is the only time you need to touch upstream again. The gateway still has access via its own encrypted copies of the old keys (fetched from the LangWatch control plane) and will continue to serve traffic uninterrupted.
Troubleshooting
| Symptom | Cause | Fix |
|---|---|---|
| `401 invalid_api_key` on every request | VK env var not loaded, or VK revoked | `echo $LANGWATCH_VK` to verify; check the LangWatch UI for VK status |
| `403 model_not_allowed` on a model you used before | The VK’s `models_allowed` list is set restrictively | Edit the VK to add the model, or remove the allowlist |
| Calls succeed but no trace in the LangWatch UI | Project-id mismatch between the VK and where you’re looking | `langwatch virtual-keys get <id>` → check `project_id` matches the project you’re viewing |
| Anthropic-side calls return `400 must provide max_tokens` | You removed `max_tokens` because OpenAI doesn’t require it | Anthropic requires it; keep it set |
| Response latency jumped by ~50 ms | Fallback is running on every call (the primary is down) | Check `X-LangWatch-Fallback-Count` — if > 0, fix the primary |
What NOT to change
- Request bodies: the gateway’s whole point is byte-for-byte passthrough. Don’t rewrite your payloads.
- SDKs: the official OpenAI / Anthropic SDKs work unchanged. You don’t need the LangWatch SDK for gateway integration (only for trace propagation).
- Streaming handlers: SSE passthrough is byte-identical after the first chunk. Your streaming code should keep working.
Rollback plan
If the gateway misbehaves, flip the env vars back:
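Assuming you haven’t yet rotated the upstream keys (Step 6), the rollback is the Step 3 flip in reverse — the env var name on the right is a placeholder for wherever you kept the originals:

```bash
export OPENAI_BASE_URL="https://api.openai.com/v1"
export OPENAI_API_KEY="$OPENAI_UPSTREAM_KEY"   # your original sk-proj-... key
# ...and the Anthropic equivalents, then redeploy.
```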
See also
- Quickstart — green-field setup without migration.
- CI smoke test cookbook — automate gateway health verification.
- SDK integration — trace propagation recipes.