Guardrails AI Tracing

Instrument LLM applications using the Guardrails AI framework

Set up Guardrails AI with a DatasetEmbeddingsGuard

This guide helps you set up instrumentation for your Guardrails AI application using OpenInference to send traces to Arize AX.

API Key Setup

Before running your application, ensure your Arize credentials are set as environment variables (see the sketch below).

Sign up for a free Arize account. You can find your Arize Space ID and API Key in your Arize account settings.
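
A minimal sketch in Python. The variable names ARIZE_SPACE_ID, ARIZE_API_KEY, and OPENAI_API_KEY are illustrative choices (not mandated by the source); whatever names you use here are read again in the tracing setup step below.

```python
import os
from getpass import getpass

# Credentials from your Arize account settings (names here are illustrative).
os.environ["ARIZE_SPACE_ID"] = getpass("Arize Space ID: ")
os.environ["ARIZE_API_KEY"] = getpass("Arize API key: ")

# Guardrails AI calls an LLM under the hood; this sketch assumes OpenAI.
os.environ["OPENAI_API_KEY"] = getpass("OpenAI API key: ")
```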

Install

Install the necessary packages for Guardrails AI, the OpenInference instrumentor, Arize OTel, and OpenTelemetry:
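
A typical install command; pin versions as appropriate for your environment:

```bash
pip install guardrails-ai openinference-instrumentation-guardrails arize-otel opentelemetry-sdk opentelemetry-exporter-otlp openai
```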

Setup Tracing

Connect to Arize AX using arize.otel.register and apply the GuardrailsInstrumentor.
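
A minimal sketch, assuming the environment variables from the API key step and an illustrative project name:

```python
import os

from arize.otel import register
from openinference.instrumentation.guardrails import GuardrailsInstrumentor

# Register an OpenTelemetry tracer provider that exports traces to Arize AX.
tracer_provider = register(
    space_id=os.environ["ARIZE_SPACE_ID"],
    api_key=os.environ["ARIZE_API_KEY"],
    project_name="guardrails-tracing",  # illustrative project name
)

# Instrument Guardrails AI so guard executions and LLM calls are traced.
GuardrailsInstrumentor().instrument(tracer_provider=tracer_provider)
```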

Run Guardrails Example

Now you can run your Guardrails AI code as usual. The instrumentor will capture relevant trace data.
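
A minimal sketch using the TwoWords validator from the Guardrails Hub as a stand-in for a DatasetEmbeddingsGuard. It assumes the validator has been installed (`guardrails hub install hub://guardrails/two_words`), and the guard call signature may differ across guardrails-ai versions.

```python
import openai
from guardrails import Guard
from guardrails.hub import TwoWords  # assumes this validator is installed from the Guardrails Hub

# Build a guard with a single validator; swap in your own guard
# (e.g. a dataset-embeddings guard) as needed.
guard = Guard().use(TwoWords())

# Calling the guard invokes the LLM and runs the validator; the instrumentor
# records spans for the guard, its validators, and the underlying LLM call.
response = guard(
    llm_api=openai.chat.completions.create,
    prompt="What is another name for America?",
    model="gpt-3.5-turbo",
    max_tokens=1024,
)
print(response.validated_output)
```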

Observe in Arize AX

After running your Guardrails AI application, traces will be sent to your Arize project. You can log in to Arize AX to:

  • Visualize the execution flow of your guards, including their steps and validators.

  • Inspect the inputs and outputs of the underlying LLM calls.

  • Analyze latency and errors.

  • Evaluate the effectiveness of your guards.
