Trace prompt templates & variables

By instrumenting the prompt template, users can take full advantage of the Arize prompt playground. You don't need to deploy a new template version to see whether changes to the prompt text or variables have the intended effect. Instead, you can experiment with these changes in the playground UI.

We provide a using_prompt_template context manager (example below) to add a prompt template to the current OpenTelemetry Context. OpenInference auto-instrumentors read this Context and record the prompt template fields as span attributes, following the OpenInference semantic conventions. The interface expects the following:

| Param | Type | Example |
| --- | --- | --- |
| template | str | "Please describe the weather forecast for {city} on {date}" |
| version | str | "v1.0" |
| variables | Dict[str, str] | {"city": "Johannesburg", "date": "July 11"} |
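Note that using_prompt_template records the template and variables on the span but does not render the prompt for you; you still expand the template yourself when building the request, as the tutorial below does. With Python's built-in str.format, that expansion looks like:

```python
template = "Please describe the weather forecast for {city} on {date}"
variables = {"city": "Johannesburg", "date": "July 11"}

# Expand the template with the variables before sending the
# rendered text to the model as the user message.
prompt = template.format(**variables)
# → "Please describe the weather forecast for Johannesburg on July 11"
```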

Refer to the code below for a working example:

The tutorial below adds tracing to a prompt template and its variables and logs the traces to the Arize platform.
pip install -qq opentelemetry-api opentelemetry-sdk openinference-semantic-conventions openinference-instrumentation-openai opentelemetry-exporter-otlp arize-otel openai
import os
from getpass import getpass

from arize.otel import register
from openai import OpenAI
from openinference.instrumentation import using_prompt_template
from openinference.instrumentation.openai import OpenAIInstrumentor

os.environ["OPENAI_API_KEY"] = getpass("Enter your OpenAI API key: ")

# Setup OTel via our convenience function
tracer_provider = register(
    space_id = "your-space-id", # in app space settings page
    api_key = "your-api-key", # in app space settings page
    project_name = "your-project-name", # name this to whatever you would like
)
OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)

# Setup OpenAI
client = OpenAI()

prompt_template = "Please describe the weather forecast for {city} on {date}"
prompt_template_variables = {"city": "Johannesburg", "date": "July 11"}
with using_prompt_template(
    template=prompt_template,
    variables=prompt_template_variables,
    version="v1.0",
):
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {
                "role": "user",
                "content": prompt_template.format(**prompt_template_variables),
            },
        ],
    )
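For reference, the fields passed to the context manager end up on the resulting LLM span roughly as follows. This is a sketch, not instrumentor output: the llm.prompt_template.* attribute names are assumed from the OpenInference semantic conventions, and the variables dict is assumed to be serialized to JSON.

```python
import json

template = "Please describe the weather forecast for {city} on {date}"
variables = {"city": "Johannesburg", "date": "July 11"}

# Assumed sketch of the span attributes the auto-instrumentor records
# for the fields supplied to using_prompt_template.
span_attributes = {
    "llm.prompt_template.template": template,
    "llm.prompt_template.version": "v1.0",
    "llm.prompt_template.variables": json.dumps(variables),
}
```

These are the attributes the Arize platform reads to populate the prompt playground with your template and variables.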
