# Observability & Tracing

> Monitor, debug, and optimize your LLM applications with comprehensive observability and tracing capabilities

See what's happening inside your LLM applications. LangWatch tracks every interaction, helps you debug issues, and shows you how your AI systems actually work in production.

<Frame>
  <img className="block" src="https://mintcdn.com/langwatch/CYVXWbKFuujxTm5z/images/llm-observability/overview.webp?fit=max&auto=format&n=CYVXWbKFuujxTm5z&q=85&s=7a4b821ffba7245468439d747524247a" alt="LangWatch Observability Dashboard" width="3534" height="1815" data-path="images/llm-observability/overview.webp" />
</Frame>

## Core Features

<CardGroup cols={2}>
  <Card title="Real-time Tracing" description="Watch every LLM call and tool usage as it happens, with full context." icon="chart-network" href="/concepts" horizontal arrow />

  <Card title="User Events" description="See how users actually interact with your AI - thumbs up, selections, custom events." icon="users" href="/user-events/overview" horizontal arrow />

  <Card title="Cost Tracking" description="Know exactly how much each model call costs you, down to the token." icon="dollar-sign" href="/integration/python/tutorials/tracking-llm-costs" horizontal arrow />

  <Card title="Monitor Performance" description="Spot slow calls and bottlenecks before your users complain." icon="gauge" horizontal />

  <Card title="Automations & Alerts" description="Get notified when things go wrong, or when costs spike unexpectedly." icon="bell" href="/features/automations" horizontal arrow />

  <Card title="Embedded Analytics" description="Drop dashboards right into your app so your team can see what's happening." icon="chart-bar" href="/features/embedded-analytics" horizontal arrow />
</CardGroup>
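Per-token cost tracking boils down to multiplying each call's token counts by the model's rates. As a rough illustration (the pricing table below is hypothetical, not LangWatch's actual rate data), the calculation looks like this:

```python
# Hypothetical per-token pricing in USD per 1M tokens -- illustrative values
# only, not LangWatch's real pricing tables.
PRICING = {
    "example-model": {"input": 0.15, "output": 0.60},
}

def estimate_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Estimate the cost of one LLM call from its token usage."""
    rates = PRICING[model]
    return (input_tokens * rates["input"] + output_tokens * rates["output"]) / 1_000_000

# A call with 1,200 input tokens and 300 output tokens:
cost = estimate_cost("example-model", input_tokens=1200, output_tokens=300)
print(f"${cost:.6f}")  # prints $0.000360
```

LangWatch does this accounting for you across every traced call, so totals roll up per model, per user, and per feature without manual bookkeeping.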

## How it works

Add a few lines to your code and LangWatch starts tracking everything:

1. **Add the SDK** - Drop in a few lines of code to your existing app
2. **LangWatch captures everything** - Every LLM call, tool usage, and user interaction is recorded automatically
3. **See it live** - Watch what's happening in real-time through the dashboard
4. **Debug easily** - Click into any trace to see exactly what went wrong
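Conceptually, the SDKs work by wrapping your LLM calls so that inputs, outputs, and timings are captured as traces. This toy decorator sketches the idea in plain Python — it is not the LangWatch API, and it stores traces in a local list where the real SDK would send them to the dashboard:

```python
import functools
import time

# In the real SDK, traces are sent to LangWatch; here we keep them locally.
TRACES = []

def trace(fn):
    """Toy stand-in for an SDK trace decorator: records name, timing, and I/O."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = fn(*args, **kwargs)
        TRACES.append({
            "name": fn.__name__,
            "duration_ms": (time.perf_counter() - start) * 1000,
            "input": {"args": args, "kwargs": kwargs},
            "output": result,
        })
        return result
    return wrapper

@trace
def answer(question: str) -> str:
    # Placeholder for a real LLM call.
    return f"Echo: {question}"

answer("What is tracing?")
print(TRACES[0]["name"])  # prints answer
```

Because the wrapper sees both the call's arguments and its return value, every traced call carries full context — which is what lets you click into a trace and replay exactly what happened.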

## Get started

Pick your language and start tracking:

<CardGroup cols={2}>
  <Card title="Python SDK" icon="python" href="/integration/python/guide" horizontal arrow />

  <Card title="TypeScript SDK" icon="code" href="/integration/typescript/guide" horizontal arrow />

  <Card title="Go SDK" icon="golang" href="/integration/go/guide" horizontal arrow />

  <Card title="View All Integrations" icon="plug" href="/integration/overview" horizontal arrow />
</CardGroup>
