Add prompt templates & variables
By instrumenting prompt templates and variables, users can experiment with prompt changes in the Arize Prompt Playground. You can directly modify the prompt, variables, and model in the Playground UI to explore how different configurations impact the output. From here, you can test your prompt on a dataset and run evals to see how your changes perform overall.
We provide a using_prompt_template context manager (example below) to add a prompt template to the current OpenTelemetry Context. OpenInference auto-instrumentors will read this Context and pass the prompt template fields as span attributes, following the OpenInference semantic conventions. The interface expects the following:
| Field | Type | Example |
| --- | --- | --- |
| `template` | `str` | `"Please describe the best activity for me to do in {city} on {date}"` |
| `version` | `str` | `"v1.0"` |
| `variables` | `Dict[str, str]` | `{"city": "Johannesburg", "date": "July 11"}` |
```shell
pip install openinference-semantic-conventions openinference-instrumentation-openai arize-otel openai
```

```python
from openai import OpenAI
from openinference.instrumentation import using_prompt_template
from openinference.instrumentation.openai import OpenAIInstrumentor
from arize.otel import register

# Set up OTel via our convenience function.
tracer_provider = register(
    space_id="",      # your Arize Space ID
    api_key="",       # your Arize API Key
    project_name="",  # your Arize project name
)

# Instrument the OpenAI client so calls are traced automatically.
OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)

client = OpenAI()

prompt_template = "Please describe the best activity for me to do in {city} on {date}"
prompt_template_variables = {"city": "Johannesburg", "date": "July 11"}

with using_prompt_template(
    template=prompt_template,
    variables=prompt_template_variables,
    version="v1.0",
):
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {
                "role": "user",
                "content": prompt_template.format(**prompt_template_variables),
            },
        ],
    )
```

For TypeScript, we provide a `setPromptTemplate` function which allows you to set a template, version, and variables on context. Use this utility in conjunction with `context.with` to set the active context. OpenInference auto-instrumentations will then pick up these attributes and add them to any spans created within the `context.with` callback. The components of a prompt template are:
| Field | Type | Example |
| --- | --- | --- |
| `template` | `string` | `"Please describe the best activity for me to do in {{city}}"` |
| `version` | `string` | `"v1.0"` |
| `variables` | `Record<string, unknown>` | `{ city: "Johannesburg" }` |
```shell
npm install --save @arizeai/openinference-core @opentelemetry/api
```

All of these fields are optional. Applying variables to a template typically happens before the call to an LLM, so the resolved values may not be picked up by auto-instrumentation. Setting them explicitly on context ensures you can see the template and variables while troubleshooting.
```typescript
import { context } from "@opentelemetry/api";
import { setPromptTemplate } from "@arizeai/openinference-core";

context.with(
  setPromptTemplate(context.active(), {
    template: "Please describe the best activity for me to do in {{city}}",
    variables: { city: "Johannesburg" },
    version: "v1.0",
  }),
  () => {
    // Calls within this block will generate spans with the attributes:
    // "llm.prompt_template.template" = "Please describe the best activity for me to do in {{city}}"
    // "llm.prompt_template.version" = "v1.0"
    // "llm.prompt_template.variables" = '{ "city": "Johannesburg" }'
  }
);
```