Instrument your LLM applications with OpenTelemetry-based tracing. Capture every call to LLMs, retrievers, agents, and tools automatically.

Quick Start

from arize.otel import register
from openinference.instrumentation.openai import OpenAIInstrumentor

# Set up OpenTelemetry and point traces at your Arize project
tracer_provider = register(
    space_id="your-space-id",
    api_key="your-api-key",
    project_name="my-llm-app",
)

# Instrument your LLM library
OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)

# Your LLM calls are now automatically traced
Supported frameworks: OpenAI, LangChain, LlamaIndex, Anthropic, Bedrock, and more through OpenInference instrumentors. Learn more: Tracing Documentation