Instructor Tracing
Launch Phoenix
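If you don't already have a Phoenix instance running, one option is to start one locally with the arize-phoenix package. A minimal sketch (assuming arize-phoenix is installed):

```python
import phoenix as px

# Start a local Phoenix instance in the background;
# the UI is served at http://localhost:6006 by default
px.launch_app()
```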
Install

```shell
pip install openinference-instrumentation-instructor instructor
```

Be sure you also install the OpenInference library for the underlying model provider you're using along with Instructor. For example, if you're calling OpenAI directly, you would also add:

```shell
pip install openinference-instrumentation-openai
```
Setup
Connect to your Phoenix instance using the register function.
```python
from phoenix.otel import register

# configure the Phoenix tracer
tracer_provider = register(
    project_name="my-llm-app",  # Default is 'default'
    auto_instrument=True,  # Auto-instrument your app based on installed OI dependencies
)
```

Run Instructor
From here you can use instructor as normal.
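As a minimal sketch of what that looks like with the OpenAI client (the response schema, model name, and prompt below are purely illustrative):

```python
import instructor
from openai import OpenAI
from pydantic import BaseModel


# Hypothetical response schema, for illustration only
class UserInfo(BaseModel):
    name: str
    age: int


# Patch the OpenAI client so responses are parsed into Pydantic models
client = instructor.from_openai(OpenAI())

user = client.chat.completions.create(
    model="gpt-4o-mini",
    response_model=UserInfo,
    messages=[{"role": "user", "content": "John Doe is 30 years old."}],
)
print(user.name, user.age)
```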
Observe
Now that you have tracing set up, all invocations of your underlying model (completions, chat completions, embeddings) and Instructor calls will be streamed to your running Phoenix instance for observability and evaluation.
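Beyond browsing traces in the Phoenix UI, you can also pull the captured spans into a DataFrame for analysis. A minimal sketch, assuming a local Phoenix instance at the default endpoint and the project name used in register above:

```python
import phoenix as px

# Fetch the recorded spans from the running Phoenix instance
spans_df = px.Client().get_spans_dataframe(project_name="my-llm-app")
print(spans_df.head())
```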