Haystack Tracing
Instrument LLM applications built with Haystack
Arize AX provides auto-instrumentation for Haystack applications, allowing you to trace and observe your Haystack Pipelines.
API Key Setup
Before running your application, ensure you have the following environment variables set:
export ARIZE_SPACE_ID="YOUR_ARIZE_SPACE_ID"
export ARIZE_API_KEY="YOUR_ARIZE_API_KEY"
export OPENAI_API_KEY="YOUR_OPENAI_API_KEY" # Needed for the OpenAIGenerator example
# Add other LLM provider API keys if used by your Haystack components

You can find your Arize Space ID and API Key in your Arize account settings.
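If you prefer to configure credentials in code (for example, in a notebook), a minimal sketch; the placeholder values are assumptions to be replaced with your own keys:

```python
import os

# Set the same variables programmatically instead of exporting them in the shell.
# Do this before calling register() or constructing any Haystack components.
os.environ["ARIZE_SPACE_ID"] = "YOUR_ARIZE_SPACE_ID"
os.environ["ARIZE_API_KEY"] = "YOUR_ARIZE_API_KEY"
os.environ["OPENAI_API_KEY"] = "YOUR_OPENAI_API_KEY"
```

Avoid hardcoding real keys in committed code; load them from a secrets manager or a local `.env` file instead.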
Install
Install Haystack, the OpenInference instrumentor for Haystack, and the Arize OTel package, which pulls in the supporting OpenTelemetry dependencies:
pip install haystack-ai openinference-instrumentation-haystack arize-otel

Setup Tracing
Connect to Arize AX using arize.otel.register and apply the HaystackInstrumentor.
import os

from arize.otel import register
from openinference.instrumentation.haystack import HaystackInstrumentor

# Set up OpenTelemetry via Arize's convenience function
tracer_provider = register(
    space_id=os.getenv("ARIZE_SPACE_ID"),  # or pass your Space ID directly
    api_key=os.getenv("ARIZE_API_KEY"),    # or pass your API Key directly
    project_name="my-haystack-app",        # choose a project name
)

# Instrument Haystack
HaystackInstrumentor().instrument(tracer_provider=tracer_provider)
print("Haystack instrumented for Arize.")

Run Haystack Example
Here's how to set up and run a simple Haystack Pipeline. The instrumentor will capture traces from this pipeline.
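A minimal sketch of such a pipeline, assuming Haystack 2.x and an `OPENAI_API_KEY` in the environment; the prompt template, model name, and component names here are illustrative, not prescribed by the instrumentor:

```python
from haystack import Pipeline
from haystack.components.builders import PromptBuilder
from haystack.components.generators import OpenAIGenerator

# Build a two-component pipeline: a prompt template feeding an LLM.
prompt_builder = PromptBuilder(template="Answer concisely: {{question}}")
generator = OpenAIGenerator(model="gpt-4o-mini")

pipeline = Pipeline()
pipeline.add_component("prompt_builder", prompt_builder)
pipeline.add_component("llm", generator)
pipeline.connect("prompt_builder.prompt", "llm.prompt")

# Each run produces a trace covering the pipeline and its components.
result = pipeline.run({"prompt_builder": {"question": "What is Haystack?"}})
print(result["llm"]["replies"][0])
```

Run this after the tracing setup above so the instrumentor is active before the pipeline executes; spans for `prompt_builder` and `llm` should then appear under your Arize project.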
Observe in Arize AX
After running your Haystack Pipeline, traces will be sent to your Arize project. Log in to Arize AX to:
Visualize the execution of your Haystack Pipeline, including each component.
Inspect the inputs, outputs, and parameters of each component.
Analyze latency and identify bottlenecks.
Trace errors and exceptions through the pipeline.
Resources
Haystack Examples on OpenInference GitHub (includes more complex examples like RAG)