Trace manually

A tutorial covering manually instrumented OpenAI LLM calls, tracing various attributes.
This tutorial notebook walks through tracing OpenAI calls with manual instrumentation, with a specific focus on function calls. It also shows how spans can be loaded into the prompt playground for iteration in the UI.
Learn more about LLM instrumentation

Arize is built on OpenTelemetry (OTEL) as the foundation for LLM tracing. The platform natively supports collecting traces generated via OpenInference automatic instrumentation. Because Arize supports OpenTelemetry, you also have the option to perform manual instrumentation, with no LLM framework required.

Setup tracing

Send data to a specific project

Get the current span context and tracer

Trace prompt templates & variables

Add attributes, metadata and tags

Add events, exceptions and status

Sessions

Configure OTEL tracer

Trace inputs and outputs

Log Outputs

AI Powered Search & Filter

Export Traces

Send Traces from Phoenix -> Arize
