

LangWatch integrates with OpenAI to automatically capture detailed information about your LLM calls.

Installation

pip install langwatch openai

Usage

By default, LangWatch reads your API key from the LANGWATCH_API_KEY environment variable.
Call autotrack_openai_calls() to automatically capture every OpenAI call made with a specific client instance for the duration of a trace.
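For example, you might export the key in your shell before running your script (the value below is a placeholder, not a real key):

```shell
# Make the LangWatch API key available to the current shell session.
# Replace the placeholder with the key from your LangWatch dashboard.
export LANGWATCH_API_KEY="your-api-key-here"
```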
import langwatch
from openai import OpenAI

langwatch.setup()  # reads LANGWATCH_API_KEY from the environment by default
client = OpenAI()


@langwatch.trace(name="OpenAI Chat Completion")
def get_openai_chat_response(user_prompt: str):
    # Capture all calls made with this client instance for the duration of the trace
    langwatch.get_current_trace().autotrack_openai_calls(client)

    response = client.chat.completions.create(
        model="gpt-5",
        messages=[{"role": "user", "content": user_prompt}],
    )
    return response.choices[0].message.content


if __name__ == "__main__":
    user_query = "Tell me a joke"
    answer = get_openai_chat_response(user_query)

    print(f"User: {user_query}")
    print(f"AI: {answer}")
The @langwatch.trace() decorator creates a parent trace for the function call, and autotrack_openai_calls() enables automatic tracking of every call made with the specified client instance while that trace is active.