LlamaIndex

Tutorial on instrumenting a LlamaIndex application and sending traces to Arize

Arize has first-class support for LlamaIndex applications. After instrumentation, you will have a full trace of every part of your LLM application, including input, embeddings, retrieval, functions, and output messages.

We follow a standardized format for how trace data should be structured using OpenInference, our open-source package based on OpenTelemetry. Setup here uses arize-otel, a lightweight convenience package that configures OpenTelemetry and sends traces to Arize.

Use the code blocks below to get started with our LlamaIndexInstrumentor.

The instrumentor supports LlamaIndex's latest instrumentation paradigm.

To get started, pip install the following.

pip install llama-index openinference-instrumentation-llama-index arize-otel

The following code snippet showcases how to automatically instrument your LLM application.

# Import open-telemetry dependencies
from arize.otel import register

# Set up OTel via our convenience function
tracer_provider = register(
    space_id="your-space-id",  # found on the space settings page in the app
    api_key="your-api-key",  # found on the space settings page in the app
    project_name="your-project-name",  # name this whatever you would like
)

# Import the automatic instrumentor from OpenInference
from openinference.instrumentation.llama_index import LlamaIndexInstrumentor

# Finish automatic instrumentation
LlamaIndexInstrumentor().instrument(tracer_provider=tracer_provider)

Now start asking questions to your LLM app and watch the traces being collected by Arize.
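Once instrumentation is in place, any LlamaIndex workload will emit traces automatically. The sketch below runs a simple vector-store query to generate a first trace; the ./data directory, the example question, and the OPENAI_API_KEY environment variable are illustrative assumptions, not requirements of the instrumentor.

# A minimal sketch: build an index over local documents and query it.
# Assumes OPENAI_API_KEY is set and a ./data directory containing documents.
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

# Load and index the documents; the embedding calls are traced automatically
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)

# Query the index; retrieval and LLM spans appear in the resulting trace
query_engine = index.as_query_engine()
response = query_engine.query("What do these documents cover?")
print(response)

Each step of the pipeline, including embedding, retrieval, and the LLM call, shows up as a span within a single trace in Arize. For a more detailed demonstration, check out our Colab tutorial.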
