.instrument(), and every call is traced — no per-call code changes.
You can start from the Arize AX UI — when you create a new tracing project, the setup wizard walks you through choosing your integration and gives you the code to copy:
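The copied snippet typically looks like the sketch below, using the OpenAI auto-instrumentor as an example; the space ID, API key, and project name are placeholders you replace with your own values:

```python
# pip install arize-otel openinference-instrumentation-openai openai

from arize.otel import register
from openinference.instrumentation.openai import OpenAIInstrumentor

# Point an OpenTelemetry tracer provider at Arize (placeholder credentials).
tracer_provider = register(
    space_id="YOUR_SPACE_ID",
    api_key="YOUR_API_KEY",
    project_name="my-tracing-project",
)

# One-time setup: after this, every OpenAI client call is traced automatically.
OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)
```

From here, ordinary OpenAI calls emit OpenInference spans with no further code changes.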

Set Up with Skills or Code
Three steps to instrument with your AI coding agent:

1. Install the skill
2. Set up authentication
3. Instrument your app

Works with Cursor, Claude Code, Codex, and more. The skill analyzes your stack, picks the right OpenInference package, wires it in, and tells you exactly how to verify traces are flowing.

Supported Integrations
Arize has 30+ native integrations across LLM providers, Python and TypeScript agent frameworks, and Java. The most common ones:
- OpenAI
- Anthropic
- LangChain
- LangGraph
- LlamaIndex
- CrewAI
- Mastra
- Vercel AI SDK
See all 30+ integrations (LLM providers, agent frameworks, Python, TypeScript, Java)
Learn More
- What auto captures — auto-instrumentors set OpenInference semantic conventions automatically: model name, messages, token counts, inputs, outputs.
- Group traces into conversations — add `session.id` and `user.id` to follow multi-turn interactions. See Set up sessions.
- Enrich traces with custom data — attach metadata, tags, and custom attributes to auto-instrumented spans. See Customize your traces.
- Control what’s captured — hide sensitive inputs, suppress tracing for specific calls, or truncate images with `TraceConfig`. See Mask and redact data.
- Augment what auto didn’t capture — add manual spans for anything auto-instrumentation missed. See Manual instrumentation or Combine auto + manual.