Tracing

Tracing the execution of LLM applications

Tracing is a powerful tool for understanding how your LLM application works. It helps you track down issues like application latency, runtime exceptions, incorrect prompts, poor retrieval, and more.

Key features

  1. Set up tracing with just a few lines of code

  2. Trace manually to customize your tracing setup (see the sketch after this list)

  3. Add metadata to tie all your data together

  4. Query traces to find problematic traces and spans
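For example, a manually created span for an LLM call might look like the following sketch. It uses the standard OpenTelemetry Python API; the span name, attribute keys, and metadata values are illustrative assumptions rather than exact OpenInference semantic conventions.

```python
# Minimal sketch of manual tracing (features 2 and 3) with the standard
# OpenTelemetry Python API. Attribute keys and values are illustrative
# placeholders, not exact OpenInference semantic conventions.
from opentelemetry import trace

tracer = trace.get_tracer(__name__)

def answer_question(question: str) -> str:
    # Wrap the operation in a span so latency and exceptions are captured.
    with tracer.start_as_current_span("answer_question") as span:
        span.set_attribute("input.value", question)             # assumed attribute key
        span.set_attribute("metadata", '{"user_id": "u-123"}')  # tie spans to your own data
        answer = "..."  # call your LLM or retriever here
        span.set_attribute("output.value", answer)               # assumed attribute key
        return answer
```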

Arize is built on top of the open source packages OpenTelemetry and OpenInference. We have native support for the OpenTelemetry protocol, which is vendor-agnostic, open source, and highly performant, with batch processing that can handle billions of traces and spans. OpenInference standardizes trace and span data across models, frameworks, tool calls, prompts, retrievers, and more.

It takes only a few lines of code to set up tracing in Arize with our auto-instrumentation, and the setup is flexible enough to let you add your own metadata and customize your spans.
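For example, a typical auto-instrumented setup might look like the following sketch, which assumes the arize-otel register helper and the OpenInference OpenAI instrumentor; the space ID, API key, and project name are placeholders, and you would swap in the instrumentor for whichever framework you use.

```python
# Minimal setup sketch, assuming the arize-otel register helper and the
# OpenInference OpenAI instrumentor; the credentials below are placeholders.
from arize.otel import register
from openinference.instrumentation.openai import OpenAIInstrumentor

# Point an OpenTelemetry tracer provider at Arize.
tracer_provider = register(
    space_id="YOUR_SPACE_ID",
    api_key="YOUR_API_KEY",
    project_name="my-llm-app",
)

# Auto-instrument OpenAI calls so LLM spans are created and exported automatically.
OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)
```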

