LangWatch, whatever your stack
LangWatch is designed to be the most open and flexible platform for LLM observability, integrating with all the major LLM providers, frameworks, and tools; see the full list of integrations below. LangWatch is built on OpenTelemetry: use our Python SDK, TypeScript SDK, or Go SDK to log traces to LangWatch, or send traces directly to our OpenTelemetry endpoint from any language.
MCPs
LangWatch MCP
Automatically instrument your code with LangWatch tracing, create and manage prompts, set up evaluations, debug production issues, and more.
SDKs
LangWatch provides SDKs for several programming languages.
Python SDK
Complete Python SDK with automatic instrumentation for popular frameworks
TypeScript SDK
Full-featured TypeScript/JavaScript SDK with type safety
Go SDK
High-performance Go SDK for server-side applications
OpenTelemetry
Native OpenTelemetry integration for any language
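To make the "any language" path concrete, here is a minimal sketch of what an OTLP/HTTP trace export to LangWatch looks like, using only the Python standard library. The endpoint URL and Bearer-token header are assumptions for illustration; in practice you would configure your language's OpenTelemetry SDK with an OTLP/HTTP exporter rather than hand-building the payload.

```python
# Sketch: one OTLP JSON span export request to LangWatch's assumed
# OpenTelemetry endpoint. Built but not sent, so it runs offline.
import json
import os
import urllib.request

LANGWATCH_ENDPOINT = "https://app.langwatch.ai/api/otel/v1/traces"  # assumed path

def build_otlp_request(api_key: str, payload: dict) -> urllib.request.Request:
    """Build an OTLP/HTTP JSON export request (not sent here)."""
    return urllib.request.Request(
        LANGWATCH_ENDPOINT,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",  # assumed auth scheme
        },
        method="POST",
    )

# Minimal OTLP trace payload with a single span; field names follow the
# OTLP JSON encoding (resourceSpans -> scopeSpans -> spans).
payload = {
    "resourceSpans": [{
        "resource": {"attributes": [{"key": "service.name",
                                     "value": {"stringValue": "my-llm-app"}}]},
        "scopeSpans": [{"spans": [{
            "name": "llm-call",
            "traceId": "0af7651916cd43dd8448eb211c80319c",
            "spanId": "b7ad6b7169203331",
            "kind": 1,
            "startTimeUnixNano": "1700000000000000000",
            "endTimeUnixNano": "1700000001000000000",
        }]}],
    }]
}

req = build_otlp_request(os.environ.get("LANGWATCH_API_KEY", ""), payload)
# urllib.request.urlopen(req)  # uncomment to actually send
```

With an OpenTelemetry SDK, the same thing reduces to pointing the OTLP exporter's endpoint and headers at LangWatch and letting auto-instrumentation create the spans.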
Frameworks
Use LangWatch to effortlessly integrate with popular AI frameworks.
LangChain
LangGraph
Vercel AI SDK
LiteLLM
OpenAI Agents
Pydantic AI
Mastra
DSPy
LlamaIndex
Haystack
Strands Agents
Agno
CrewAI
AutoGen
Semantic Kernel
Spring AI
PromptFlow
Google ADK
Model Providers
Use LangWatch to effortlessly integrate with popular AI model providers.
OpenAI
Anthropic Claude
Azure OpenAI
Azure AI
Vertex AI
Gemini
AWS Bedrock
Groq
Grok (xAI)
Ollama
OpenRouter
No-Code Platforms
No-code agent builders and tools.
OpenClaw
n8n
Flowise
Langflow
Other Official LangWatch Integrations
LangWatch provides several official integrations with other tools and services.
REST API
Direct API integration for custom applications
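As a sketch of direct API integration, the snippet below builds a trace-submission request with only the Python standard library. The `/api/collector` path, `X-Auth-Token` header, and body shape are assumptions for illustration; consult the REST API reference for the exact schema.

```python
# Sketch: build (but do not send) a request to LangWatch's assumed
# trace-collection REST endpoint.
import json
import os
import urllib.request

def collector_request(api_key: str, trace: dict) -> urllib.request.Request:
    """Build a trace-submission request for the assumed collector endpoint."""
    return urllib.request.Request(
        "https://app.langwatch.ai/api/collector",  # assumed endpoint
        data=json.dumps(trace).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "X-Auth-Token": api_key,  # assumed auth header
        },
        method="POST",
    )

# Hypothetical trace body with one LLM span.
trace = {
    "trace_id": "trace-123",
    "spans": [{
        "type": "llm",
        "span_id": "span-456",
        "input": {"type": "text", "value": "What is LangWatch?"},
        "output": {"type": "text", "value": "An LLM observability platform."},
    }],
}

req = collector_request(os.environ.get("LANGWATCH_API_KEY", ""), trace)
# urllib.request.urlopen(req)  # uncomment to send
```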
MCP
Model Context Protocol integration