LangWatch integrates with Mastra through OpenTelemetry to capture traces from your Mastra agents automatically.
Installation
```bash
npm i langwatch @mastra/core @ai-sdk/openai @mastra/observability @mastra/otel-exporter @mastra/libsql
```
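The exporter configured below reads your API key from the environment. Before running, set `LANGWATCH_API_KEY` in your shell (or a `.env` file); the variable name must match the one referenced in the configuration. The key value shown here is a placeholder:

```shell
# Set the LangWatch API key used in the exporter's Authorization header.
# Replace the placeholder with your actual key from the LangWatch dashboard.
export LANGWATCH_API_KEY="your-api-key"
```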
Usage
Configure your Mastra instance with an OpenTelemetry exporter pointing to LangWatch:
```typescript
import { Agent } from "@mastra/core/agent";
import { Mastra } from "@mastra/core";
import { openai } from "@ai-sdk/openai";
import { Observability } from "@mastra/observability";
import { OtelExporter } from "@mastra/otel-exporter";
import { LibSQLStore } from "@mastra/libsql";

export const mastra = new Mastra({
  agents: {
    assistant: new Agent({
      name: "assistant",
      instructions: "You are a helpful assistant.",
      model: openai("gpt-5-mini"),
    }),
  },
  storage: new LibSQLStore({ id: "mastra-storage", url: "file:./mastra.db" }),
  observability: new Observability({
    configs: {
      langwatch: {
        serviceName: "<project_name>",
        exporters: [
          new OtelExporter({
            provider: {
              custom: {
                endpoint: "https://app.langwatch.ai/api/otel/v1/traces",
                headers: { Authorization: `Bearer ${process.env.LANGWATCH_API_KEY}` },
              },
            },
          }),
        ],
      },
    },
  }),
});
```
Mastra automatically sends traces to LangWatch through the OpenTelemetry exporter. All agent interactions, tool calls, and workflow executions will be captured.
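To verify the setup, invoke the agent once and check the LangWatch dashboard for the resulting trace. A minimal sketch, assuming the `mastra` instance above is exported from `./mastra` and that `OPENAI_API_KEY` and `LANGWATCH_API_KEY` are set in the environment:

```typescript
import { mastra } from "./mastra";

// Retrieve the agent registered under the "assistant" key and run one
// generation; the underlying LLM call is exported as a trace to LangWatch.
const agent = mastra.getAgent("assistant");
const result = await agent.generate("What is the capital of France?");
console.log(result.text);
```

The trace should appear in your LangWatch project shortly after the call completes.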