LangWatch integrates with LangChain to provide detailed observability into your chains, agents, LLM calls, and tool usage.

Installation

npm i langwatch @langchain/openai @langchain/core

Usage

By default, the LangWatch SDK reads your API key from the LANGWATCH_API_KEY environment variable.
Use the LangWatchCallbackHandler to capture LangChain events as spans within your trace.
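For example, in a POSIX shell you might export the variable before starting your application (the key value below is a placeholder; use the one from your LangWatch project settings):

```shell
# The SDK picks this up automatically at startup; no code changes needed.
export LANGWATCH_API_KEY="your-api-key-here"
```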
import { setupObservability } from "langwatch/observability/node";
import { LangWatchCallbackHandler } from "langwatch/observability/instrumentation/langchain";
import { ChatOpenAI } from "@langchain/openai";
import { HumanMessage } from "@langchain/core/messages";

// Initialize LangWatch observability once, at application startup.
setupObservability({ serviceName: "<project_name>" });

async function main(message: string): Promise<string> {
  // Bind the LangWatch callback handler so every call on this model is traced.
  const chatModel = new ChatOpenAI({ model: "gpt-5" }).withConfig({
    callbacks: [new LangWatchCallbackHandler()],
  });

  const result = await chatModel.invoke([new HumanMessage(message)]);
  return result.content as string;
}

console.log(await main("Hey, tell me a joke"));
The LangWatchCallbackHandler captures LangChain events and converts them into detailed LangWatch spans. Pass the callback handler to your LangChain components via the callbacks option in withConfig().
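If you prefer not to bind the handler up front, LangChain also accepts callbacks per invocation via the config argument of invoke(). A minimal sketch, assuming the same packages and environment variables as the example above (not verified against every LangChain version):

```typescript
import { LangWatchCallbackHandler } from "langwatch/observability/instrumentation/langchain";
import { ChatOpenAI } from "@langchain/openai";
import { HumanMessage } from "@langchain/core/messages";

const chatModel = new ChatOpenAI({ model: "gpt-5" });

// Pass callbacks in the per-call config instead of binding them with withConfig().
// Useful when only some invocations should be traced.
const result = await chatModel.invoke([new HumanMessage("Hey, tell me a joke")], {
  callbacks: [new LangWatchCallbackHandler()],
});
console.log(result.content);
```

Both approaches produce the same spans; withConfig() is simply more convenient when every call through the model should be traced.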