Integrate LangWatch into your Go application to start observing your LLM interactions. This guide covers the setup and basic usage of the LangWatch Go SDK, which is built on top of OpenTelemetry to provide powerful, vendor-neutral tracing.
Pro tip: want to get started even faster? Copy our llms.txt and ask an AI to do the integration for you.
Prerequisites
Before you begin, ensure you have:
- Go 1.19 or later installed on your system
- A LangWatch account at app.langwatch.ai
- An OpenAI API key (or other LLM provider key)
- Basic familiarity with Go and OpenTelemetry concepts
If you’re new to OpenTelemetry, don’t worry! The LangWatch SDK handles most of the complexity for you. You only need to understand the basic concepts of traces and spans.
Setup
Get started in just a few minutes by installing the SDK and instrumenting your application.
Get your LangWatch API Key
Sign up at app.langwatch.ai and find your API key in your project settings. Set it as an environment variable:

```bash
export LANGWATCH_API_KEY="your-langwatch-api-key"
export OPENAI_API_KEY="your-openai-api-key"
```

You can verify the key is set by running `echo $LANGWATCH_API_KEY`.
Install SDK Packages
Add the required dependencies to your Go module:

```bash
go get github.com/langwatch/langwatch/sdk-go
go get github.com/langwatch/langwatch/sdk-go/instrumentation/openai
```

Verify the installation by running `go mod tidy` and checking that all dependencies resolve.
Configure the LangWatch Exporter
Set up the LangWatch exporter in your application initialization:

```go
func setupLangWatch(ctx context.Context) func(context.Context) {
    // Create the LangWatch exporter; it reads LANGWATCH_API_KEY from the environment.
    exporter, err := langwatch.NewDefaultExporter(ctx)
    if err != nil {
        log.Fatalf("failed to create LangWatch exporter: %v", err)
    }

    tp := sdktrace.NewTracerProvider(sdktrace.WithBatcher(exporter))
    otel.SetTracerProvider(tp)

    return func(ctx context.Context) {
        if err := tp.Shutdown(ctx); err != nil {
            log.Printf("Error shutting down tracer provider: %v", err)
        }
    }
}
```
Critical: always call the shutdown function! The `defer shutdown(ctx)` call is essential: without it, traces still buffered in memory are lost when your application exits. The shutdown function flushes all pending traces to LangWatch and releases resources. Never skip this step, especially in short-lived applications such as CLI tools or serverless functions.
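In short-lived programs, you may also want to bound how long that final flush can take. Here is a minimal sketch, assuming the `setupLangWatch` function from the previous step; the five-second timeout is an arbitrary example value, not an SDK requirement:

```go
import (
    "context"
    "time"
)

func main() {
    ctx := context.Background()
    shutdown := setupLangWatch(ctx) // defined in the setup step above

    // Give the final flush a bounded deadline so a slow or unreachable
    // collector cannot block process exit indefinitely.
    defer func() {
        flushCtx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
        defer cancel()
        shutdown(flushCtx)
    }()

    // ... your application logic ...
}
```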
Instrument Your OpenAI Client
Add the LangWatch middleware to your OpenAI client:

```go
client := openai.NewClient(
    oaioption.WithAPIKey(os.Getenv("OPENAI_API_KEY")),
    oaioption.WithMiddleware(otelopenai.Middleware("my-app",
        otelopenai.WithCaptureInput(),
        otelopenai.WithCaptureOutput(),
    )),
)
```
The middleware automatically captures all OpenAI API calls, including streaming responses, token usage, and model information.
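Streaming requires no extra instrumentation on your side. As a sketch, assuming openai-go's standard streaming API and the instrumented `client` and `ctx` from this step:

```go
// Streaming through the instrumented client: the middleware records the
// response as it arrives.
stream := client.Chat.Completions.NewStreaming(ctx, openai.ChatCompletionNewParams{
    Model: openai.ChatModelGPT4oMini,
    Messages: []openai.ChatCompletionMessageParamUnion{
        openai.UserMessage("Stream me a haiku."),
    },
})
defer stream.Close()

for stream.Next() {
    chunk := stream.Current()
    if len(chunk.Choices) > 0 {
        fmt.Print(chunk.Choices[0].Delta.Content)
    }
}
if err := stream.Err(); err != nil {
    log.Fatalf("stream failed: %v", err)
}
```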
Create Your First Trace
Start a root span to capture your LLM interaction:

```go
tracer := langwatch.Tracer("my-app")
ctx, span := tracer.Start(ctx, "ChatWithUser")
defer span.End()

// Your OpenAI call remains unchanged!
response, err := client.Chat.Completions.New(ctx, openai.ChatCompletionNewParams{
    Model: openai.ChatModelGPT4oMini,
    Messages: []openai.ChatCompletionMessageParamUnion{
        openai.SystemMessage("You are a helpful assistant."),
        openai.UserMessage("Hello, OpenAI!"),
    },
})
```
View your traces at app.langwatch.ai within seconds of making your first API call.
That’s it! 🎉 You’ve successfully integrated LangWatch into your Go application. Traces will now be sent to your LangWatch project, providing you with immediate visibility into your LLM interactions.
Complete Working Example
Here’s a minimal working example that combines all the setup steps:
```go
package main

import (
    "context"
    "log"
    "os"

    langwatch "github.com/langwatch/langwatch/sdk-go"
    otelopenai "github.com/langwatch/langwatch/sdk-go/instrumentation/openai"
    "github.com/openai/openai-go"
    oaioption "github.com/openai/openai-go/option"
    "go.opentelemetry.io/otel"
    sdktrace "go.opentelemetry.io/otel/sdk/trace"
)

func main() {
    ctx := context.Background()

    // 1. Set up LangWatch tracing (just once in your app).
    shutdown := setupLangWatch(ctx)
    defer shutdown(ctx) // Critical: ensures traces are flushed before exit.

    // 2. Add the middleware to your OpenAI client.
    client := openai.NewClient(
        oaioption.WithAPIKey(os.Getenv("OPENAI_API_KEY")),
        oaioption.WithMiddleware(otelopenai.Middleware("my-app",
            otelopenai.WithCaptureInput(),
            otelopenai.WithCaptureOutput(),
        )),
    )

    // 3. Create a root span for your operation.
    tracer := langwatch.Tracer("my-app")
    ctx, span := tracer.Start(ctx, "ChatWithUser")
    defer span.End()

    // Your OpenAI call remains unchanged!
    response, err := client.Chat.Completions.New(ctx, openai.ChatCompletionNewParams{
        Model: openai.ChatModelGPT4oMini,
        Messages: []openai.ChatCompletionMessageParamUnion{
            openai.SystemMessage("You are a helpful assistant."),
            openai.UserMessage("Hello, OpenAI!"),
        },
    })
    if err != nil {
        log.Fatalf("Chat completion failed: %v", err)
    }
    log.Printf("Response: %s", response.Choices[0].Message.Content)
}

func setupLangWatch(ctx context.Context) func(context.Context) {
    // NewDefaultExporter reads LANGWATCH_API_KEY from the environment
    // and automatically excludes HTTP request spans.
    exporter, err := langwatch.NewDefaultExporter(ctx)
    if err != nil {
        log.Fatalf("failed to create LangWatch exporter: %v", err)
    }

    tp := sdktrace.NewTracerProvider(sdktrace.WithBatcher(exporter))
    otel.SetTracerProvider(tp)

    return func(ctx context.Context) {
        // This flushes all pending traces; never skip it!
        if err := tp.Shutdown(ctx); err != nil {
            log.Printf("Error shutting down tracer provider: %v", err)
        }
    }
}
```
Core Concepts
The Go SDK is designed to feel familiar to anyone who has used OpenTelemetry. It provides a thin wrapper to simplify LangWatch-specific functionality.
- Each message that triggers your LLM pipeline is captured, end to end, as a Trace.
- A Trace contains multiple Spans, the individual steps inside your pipeline.
- Traces can be grouped into a conversation by assigning a common `thread_id`.
- Use a user ID to track individual users' interactions.
- Apply labels for custom categorization and filtering (see the sketch below).
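Here is a minimal sketch of attaching this metadata to a root span. `SetThreadID` appears again under Creating a Trace below; the user-ID and label attribute keys are illustrative placeholders, not confirmed SDK conventions, so check the SDK reference for the exact setters your version provides:

```go
ctx, span := tracer.Start(ctx, "HandleUserMessage")
defer span.End()

// Group this trace into a conversation thread.
span.SetThreadID("conversation-123")

// Illustrative placeholders only: attach user and label metadata as
// custom attributes. The exact keys may differ; see the SDK reference.
span.SetAttributes(
    attribute.String("langwatch.user.id", "user-456"),
    attribute.String("langwatch.labels", "beta,billing"),
)
```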
Creating a Trace
A trace represents a single, end-to-end task, like handling a user request. You create a trace by starting a “root” span. All other spans created within its context will be nested under it.
Before creating traces, ensure you have configured the SDK as shown in the Setup Guide.
For complete API documentation of the Tracer() and LangWatchSpan methods, see the Core SDK section in the reference.
```go
import (
    "context"

    langwatch "github.com/langwatch/langwatch/sdk-go"
)

func handleMessage(ctx context.Context, userMessage string) {
    // 1. Get a tracer.
    tracer := langwatch.Tracer("my-app")

    // 2. Start the root span for the trace.
    ctx, span := tracer.Start(ctx, "HandleUserMessage")
    defer span.End() // Important: always end the span.

    // 3. (Optional) Add metadata.
    span.SetThreadID("conversation-123")
    span.RecordInputString(userMessage)

    // ... Your business logic ...

    // 4. (Optional) Record the final output.
    span.RecordOutputString("This was the AI's response.")
}
```
Creating Nested Spans
To instrument specific parts of your pipeline (like a RAG query or a tool call), create nested spans within an active trace.
```go
import (
    "context"

    langwatch "github.com/langwatch/langwatch/sdk-go"
    "go.opentelemetry.io/otel/attribute"
)

func retrieveDocuments(ctx context.Context, query string) {
    // Assumes a trace has already been started in the parent context `ctx`.
    tracer := langwatch.Tracer("retrieval-logic")
    _, span := tracer.Start(ctx, "RetrieveDocumentsFromVectorDB")
    defer span.End()

    span.SetType(langwatch.SpanTypeRetrieval)
    span.RecordInputString(query)

    // ... logic to retrieve documents ...

    // You can add custom attributes to any span.
    span.SetAttributes(attribute.String("db.vendor", "pinecone"))
}
```
The context `ctx` is crucial. It carries the active span information, ensuring that `tracer.Start()` correctly creates a nested span instead of a new trace.
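For example, wiring the two functions from this page together only requires passing the same `ctx` down the call chain:

```go
func handleMessage(ctx context.Context, userMessage string) {
    tracer := langwatch.Tracer("my-app")
    ctx, span := tracer.Start(ctx, "HandleUserMessage") // root span
    defer span.End()

    // Because retrieveDocuments receives this ctx, its span becomes a
    // child of HandleUserMessage instead of starting a new trace.
    retrieveDocuments(ctx, userMessage)
}
```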
Integrations
LangWatch integrates with OpenAI and any other OpenAI-compatible provider.
See the dedicated guides for more details; a base-URL sketch follows the list:
- Anthropic - Claude models via OpenAI-compatible API
- OpenAI - GPT models and OpenAI API
- Azure OpenAI - Azure-hosted OpenAI models
- Groq - High-speed inference with Groq’s OpenAI-compatible endpoint
- Grok (xAI) - Trace xAI’s Grok models with the OpenAI-compatible middleware
- Google Gemini - Google’s Gemini models
- Ollama - Local model inference
- OpenRouter - Multi-provider model access
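Because these providers speak the OpenAI wire protocol, the same middleware covers all of them; only the base URL changes. A sketch for a local Ollama server, where the URL and placeholder key reflect a default local install:

```go
client := openai.NewClient(
    // Point the client at any OpenAI-compatible endpoint.
    oaioption.WithBaseURL("http://localhost:11434/v1"), // default local Ollama
    oaioption.WithAPIKey("ollama"), // Ollama ignores the key, but the client expects one
    oaioption.WithMiddleware(otelopenai.Middleware("my-app",
        otelopenai.WithCaptureInput(),
        otelopenai.WithCaptureOutput(),
    )),
)
```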
Environment Variables
The SDK respects these environment variables for configuration:
| Variable | Description |
|---|---|
| `LANGWATCH_API_KEY` | Required. Your LangWatch project API key. |
| `LANGWATCH_ENDPOINT` | The LangWatch collector endpoint. Defaults to `https://app.langwatch.ai`. |
Filtering Spans
The LangWatch SDK provides powerful filtering capabilities to control which spans are exported. This is useful for reducing noise, excluding sensitive data, or focusing on specific instrumentation.
Using Preset Filters
The SDK includes convenient preset filters for common use cases:
```go
// NewDefaultExporter automatically excludes HTTP request spans.
exporter, err := langwatch.NewDefaultExporter(ctx)

// Or use NewExporter with explicit filters.
exporter, err = langwatch.NewExporter(ctx,
    langwatch.WithFilters(
        langwatch.ExcludeHTTPRequests(), // Exclude GET, POST, etc. spans
        langwatch.LangWatchOnly(),       // Keep only LangWatch instrumentation
    ),
)
```
Custom Filtering
Create custom filters using Include() or Exclude() with matching criteria:
```go
exporter, err := langwatch.NewExporter(ctx,
    langwatch.WithFilters(
        // Only include spans from specific scopes.
        langwatch.Include(langwatch.Criteria{
            ScopeName: []langwatch.Matcher{
                langwatch.StartsWith("github.com/langwatch/"),
                langwatch.StartsWith("my-app/"),
            },
        }),
        // Exclude database spans.
        langwatch.Exclude(langwatch.Criteria{
            SpanName: []langwatch.Matcher{
                langwatch.StartsWith("database."),
            },
        }),
    ),
)
```
Matcher Types
The SDK provides several matcher types:
| Matcher | Description |
|---|---|
| `Equals(s)` | Exact string match |
| `EqualsIgnoreCase(s)` | Case-insensitive exact match |
| `StartsWith(prefix)` | Prefix match |
| `StartsWithIgnoreCase(prefix)` | Case-insensitive prefix match |
| `MatchRegex(re)` | Regular expression match |
| `MustMatchRegex(pattern)` | Regex match (panics on an invalid pattern) |
Filters use AND semantics when combined: a span must pass every filter to be exported. Within a single `Criteria`, matchers use OR semantics: a span matches if any matcher in the list matches, as the sketch below illustrates.
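To make these semantics concrete, here is a sketch combining the calls shown above:

```go
exporter, err := langwatch.NewExporter(ctx,
    langwatch.WithFilters(
        // Within this Criteria, the two matchers are OR'd: either
        // scope prefix is enough to satisfy the Include filter.
        langwatch.Include(langwatch.Criteria{
            ScopeName: []langwatch.Matcher{
                langwatch.StartsWith("my-app/"),
                langwatch.StartsWith("github.com/langwatch/"),
            },
        }),
        langwatch.Exclude(langwatch.Criteria{
            SpanName: []langwatch.Matcher{langwatch.StartsWith("database.")},
        }),
    ),
)
// A span from scope "my-app/http" named "database.query" satisfies the
// Include filter but is caught by the Exclude filter; since the filters
// are AND'd together, the span is not exported.
```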
For detailed filter API documentation and more examples, see the Filtering section in the API reference.
Features
- 🔗 Seamless OpenTelemetry integration - Works with your existing OTel setup
- 🚀 OpenAI instrumentation - Automatic tracing for OpenAI API calls
- 🌐 Multi-provider support - OpenAI, Anthropic, Azure, local models, and more
- 📊 Rich LLM telemetry - Capture inputs, outputs, token usage, and model information
- 🔍 Specialized span types - LLM, Chain, Tool, Agent, RAG, and more
- 🧵 Thread support - Group related LLM interactions together
- 📝 Custom input/output recording - Fine-grained control over what’s captured
- 🔄 Streaming support - Real-time capture of streaming responses
Since LangWatch is built on OpenTelemetry, it also supports any library or framework that integrates with OpenTelemetry.
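For example, spans created through the plain OpenTelemetry API end up in LangWatch too, because the setup step registered the tracer provider globally. The `normalizeInput` helper below is purely illustrative:

```go
import (
    "context"
    "strings"

    "go.opentelemetry.io/otel"
)

func normalizeInput(ctx context.Context, input string) string {
    // A plain OTel tracer: no LangWatch-specific API involved, yet the
    // span is exported through the globally registered tracer provider.
    _, span := otel.Tracer("my-app/preprocessing").Start(ctx, "NormalizeInput")
    defer span.End()

    return strings.TrimSpace(input)
}
```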
Examples
Explore real working examples to learn different patterns and use cases:
| Example | What It Shows | Description |
|---|---|---|
| Simple | Basic OpenAI instrumentation | Simple chat completion with tracing |
| Custom Input/Output | Recording custom data | Fine-grained control over captured data |
| Streaming | Streaming completions | Real-time capture of streaming responses |
| Threads | Grouping conversations | Managing multi-turn conversations |
| RAG | Retrieval patterns | Document retrieval and context tracking |
For comprehensive code examples including RAG pipelines, error handling, and best practices, see the Complete Example section in the API reference.
Troubleshooting
Common Issues
No traces appearing in the LangWatch dashboard:
- Verify your LANGWATCH_API_KEY is set correctly
- Check that you're calling the shutdown function to flush traces
- Ensure your application is actually making OpenAI API calls
- As a sanity check, try the stdout exporter sketched below to confirm spans are being created at all
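If traces still don't appear, you can temporarily swap the LangWatch exporter for OpenTelemetry's stdout exporter (`go.opentelemetry.io/otel/exporters/stdout/stdouttrace`): if nothing is printed, spans are never being created in the first place. The `setupDebugTracing` helper name below is illustrative:

```go
import (
    "go.opentelemetry.io/otel"
    "go.opentelemetry.io/otel/exporters/stdout/stdouttrace"
    sdktrace "go.opentelemetry.io/otel/sdk/trace"
)

// Temporary debugging setup: print every span to stdout instead of
// sending it to LangWatch.
func setupDebugTracing() (*sdktrace.TracerProvider, error) {
    exporter, err := stdouttrace.New(stdouttrace.WithPrettyPrint())
    if err != nil {
        return nil, err
    }
    tp := sdktrace.NewTracerProvider(sdktrace.WithBatcher(exporter))
    otel.SetTracerProvider(tp)
    return tp, nil
}
```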
Import errors:
- Run `go mod tidy` to ensure all dependencies are properly resolved
- Verify you're using Go 1.19 or later
OpenTelemetry configuration errors:
- Check that the LangWatch endpoint URL is correct
- Verify your API key has the correct permissions
Getting Help
If you’re still having issues:
Next Steps