Add attributes, metadata and tags
You can manually track additional application details by adding attributes, metadata, or tags to your spans. This helps capture actions or context not covered by standard frameworks or LLM clients.
Add attributes to a span
Attributes let you attach key/value pairs to a span so it carries more information about the operation it's tracking.
Notice that the attributes in the example below share the operation. prefix. When adding custom attributes, it's best practice to vendor your attributes (e.g. mycompany.) so that your attributes do not clash with semantic conventions.
from opentelemetry import trace
# Add these lines inside the span or function currently being traced
current_span = trace.get_current_span()
current_span.set_attribute("operation.value", 1)
current_span.set_attribute("operation.name", "Saying hello!")
current_span.set_attribute("operation.other-stuff", [1, 2, 3])
You can add attributes to spans in JavaScript in multiple ways:
tracer.startActiveSpan(
'app.new-span',
{ attributes: { attribute1: 'value1' } },
(span) => {
// do some work...
span.end();
},
);
function chat(message: string, user: User) {
return tracer.startActiveSpan(`chat:${message}`, (span: Span) => {
const result = `response to: ${message}`; // placeholder for real chat logic
// Add an attribute to the span
span.setAttribute('mycompany.userid', user.id);
span.end();
return result;
});
}
// You can set any custom attribute you want
singleAttrSpan.setAttribute("custom_attr", "custom attribute here");
// close the span
singleAttrSpan.end();
Add attributes tied to semantic conventions
Semantic conventions provide a structured schema for common LLM application attributes: well-known names for items like messages, prompt templates, metadata, and more. We've built a set of semantic conventions as part of the OpenInference package.
Setting attributes is crucial for understanding the flow of data and messages through your LLM application, which facilitates easier debugging and analysis. By setting attributes such as OUTPUT_VALUE and OUTPUT_MESSAGES, you can capture essential output details and interaction messages within the context of a span. This lets you record the response and store the messages exchanged by components in a structured format, which Arize uses to help you debug your application.
To use OpenInference Semantic Attributes in Python, ensure you have the semantic conventions package:
pip install openinference-semantic-conventions
Then run the following to set semantic attributes:
from openinference.semconv.trace import MessageAttributes, SpanAttributes
# Add these lines inside the span or function currently being traced
span.set_attribute(SpanAttributes.OUTPUT_VALUE, response)
# Ex: This shows up under `output_messages` tab for an LLM span
span.set_attribute(
f"{SpanAttributes.LLM_OUTPUT_MESSAGES}.0.{MessageAttributes.MESSAGE_ROLE}",
"assistant",
)
span.set_attribute(
f"{SpanAttributes.LLM_OUTPUT_MESSAGES}.0.{MessageAttributes.MESSAGE_CONTENT}",
response,
)
First, add both semantic conventions packages as dependencies to your application:
npm install --save @opentelemetry/semantic-conventions @arizeai/openinference-semantic-conventions
Add the following to the top of your application file:
import { SemanticConventions } from '@arizeai/openinference-semantic-conventions';
Finally, you can update your file to include semantic attributes:
const doWork = () => {
tracer.startActiveSpan('app.doWork', (span) => {
span.setAttribute(SemanticConventions.INPUT_VALUE, 'work input');
// Do some work...
span.end();
});
};
For instance, in the chat example from the previous section, we may want to create a span to capture some information about our request before we call out to OpenAI, which is auto-instrumented using the OpenInference OpenAI package.
/*app.ts*/
import { SpanStatusCode, trace } from '@opentelemetry/api';
import express, { Express } from 'express';
import { OpenAI } from "openai";
import {
MimeType,
OpenInferenceSpanKind,
SemanticConventions,
} from "@arizeai/openinference-semantic-conventions";
const tracer = trace.getTracer('llm-server', '0.1.0');
const PORT: number = parseInt(process.env.PORT || '8080');
const app: Express = express();
const openai = new OpenAI({
apiKey: process.env.OPENAI_API_KEY,
});
app.get('/chat', (req, res) => {
const message = req.query.message
// Start a chain span, this will be the parent of all the work done in this route
// including the spans created by the OpenAI auto instrumentation package
tracer.startActiveSpan("chat chain", async (span) => {
span.setAttributes({
[SemanticConventions.OPENINFERENCE_SPAN_KIND]:
OpenInferenceSpanKind.CHAIN,
[SemanticConventions.INPUT_VALUE]: message,
[SemanticConventions.INPUT_MIME_TYPE]: MimeType.TEXT,
// Metadata can be used to store user defined values
[SemanticConventions.METADATA]: JSON.stringify({
"userId": req.query.userId,
"conversationId": req.query.conversationId
})
});
// Will be picked up by auto instrumentation
const chatCompletion = await openai.chat.completions.create({
messages: [{ role: "user", content: message }],
model: "gpt-3.5-turbo",
});
const response = chatCompletion.choices[0].message;
span.setAttributes({
[SemanticConventions.OUTPUT_VALUE]: response.content,
[SemanticConventions.OUTPUT_MIME_TYPE]: MimeType.TEXT,
[`${SemanticConventions.LLM_OUTPUT_MESSAGES}.0.${SemanticConventions.MESSAGE_CONTENT}`]:
response.content,
[`${SemanticConventions.LLM_OUTPUT_MESSAGES}.0.${SemanticConventions.MESSAGE_ROLE}`]:
response.role,
});
span.setStatus({ code: SpanStatusCode.OK });
// End the span
span.end();
res.send(response);
})
});
app.listen(PORT, () => {
console.log(`Listening for requests on http://localhost:${PORT}`);
});
This example demonstrates how to use a CHAIN span to wrap our LLM span, allowing additional application data to be tracked. This data can then be analyzed in Arize. For more complex applications, different strategies might be required. Refer to the previous section for detailed guidance on creating and nesting spans effectively.
// Use OpenInference semantic conventions to set reserved attributes
singleAttrSpan.setAttribute("openinference.span.kind", "CHAIN");
singleAttrSpan.setAttribute("input.value", input);
singleAttrSpan.setAttribute("output.value", output);
Add attributes to multiple spans at once
You can set attributes once on the OpenTelemetry Context, and our tracing integrations will attempt to pass these attributes to all other spans underneath the parent trace.
Supported Context Attributes include:
Metadata: Metadata associated with a span.
Tags: List of tags to give the span a category.
Session ID: Unique identifier for a session.
User ID: Unique identifier for a user.
Prompt Template:
Template: Used to generate prompts as Python f-strings.
Version: The version of the prompt template.
Variables: key-value pairs applied to the prompt template.
Here are the functions we support to add attributes to context.
pip install openinference-instrumentation
using_metadata
Context manager to add metadata to the current OpenTelemetry Context. OpenInference auto instrumentators will read this Context and pass the metadata as a span attribute, following the OpenInference semantic conventions. Its input, the metadata, must be a dictionary with string keys. This dictionary will be serialized to JSON when saved to the OTEL Context and remain a JSON string when sent as a span attribute.
from openinference.instrumentation import using_metadata
metadata = {
"key-1": value_1,
"key-2": value_2,
...
}
with using_metadata(metadata):
# Calls within this block will generate spans with the attributes:
# "metadata" = "{\"key-1\": value_1, \"key-2\": value_2, ... }" # JSON serialized
...
It can also be used as a decorator:
@using_metadata(metadata)
def call_fn(*args, **kwargs):
# Calls within this function will generate spans with the attributes:
# "metadata" = "{\"key-1\": value_1, \"key-2\": value_2, ... }" # JSON serialized
...
using_tags
Context manager to add tags to the current OpenTelemetry Context. OpenInference auto instrumentators will read this Context and pass the tags as a span attribute, following the OpenInference semantic conventions. Its input, the tag list, must be a list of strings.
from openinference.instrumentation import using_tags
tags = ["tag_1", "tag_2", ...]
with using_tags(tags):
# Calls within this block will generate spans with the attributes:
# "tag.tags" = "["tag_1","tag_2",...]"
...
It can also be used as a decorator:
@using_tags(tags)
def call_fn(*args, **kwargs):
# Calls within this function will generate spans with the attributes:
# "tag.tags" = "["tag_1","tag_2",...]"
...
using_prompt_template
Context manager to add a prompt template (including its version and variables) to the current OpenTelemetry Context. OpenInference auto instrumentators will read this Context and pass the prompt template fields as span attributes, following the OpenInference semantic conventions. Its inputs must be of the following types:
Template: non-empty string.
Version: non-empty string.
Variables: a dictionary with string keys. This dictionary will be serialized to JSON when saved to the OTEL Context and remain a JSON string when sent as a span attribute.
from openinference.instrumentation import using_prompt_template
prompt_template = "Please describe the weather forecast for {city} on {date}"
prompt_template_variables = {"city": "Johannesburg", "date":"July 11"}
with using_prompt_template(
template=prompt_template,
version="v1.0",
variables=prompt_template_variables,
):
# Calls within this block will generate spans with the attributes:
# "llm.prompt_template.template" = "Please describe the weather forecast for {city} on {date}"
# "llm.prompt_template.version" = "v1.0"
# "llm.prompt_template.variables" = "{\"city\": \"Johannesburg\", \"date\": \"July 11\"}" # JSON serialized
...
It can also be used as a decorator:
@using_prompt_template(
template=prompt_template,
version="v1.0",
variables=prompt_template_variables,
)
def call_fn(*args, **kwargs):
# Calls within this function will generate spans with the attributes:
# "llm.prompt_template.template" = "Please describe the weather forecast for {city} on {date}"
# "llm.prompt_template.version" = "v1.0"
# "llm.prompt_template.variables" = "{\"city\": \"Johannesburg\", \"date\": \"July 11\"}" # JSON serialized
...
using_attributes
Context manager to add attributes to the current OpenTelemetry Context. OpenInference auto instrumentators will read this Context and pass the attributes as span attributes, following the OpenInference semantic conventions. This is a convenient context manager to use if you find yourself using many of the previous ones in conjunction.
from openinference.instrumentation import using_attributes
tags = ["tag_1", "tag_2", ...]
metadata = {
"key-1": value_1,
"key-2": value_2,
...
}
prompt_template = "Please describe the weather forecast for {city} on {date}"
prompt_template_variables = {"city": "Johannesburg", "date":"July 11"}
prompt_template_version = "v1.0"
with using_attributes(
session_id="my-session-id",
user_id="my-user-id",
metadata=metadata,
tags=tags,
prompt_template=prompt_template,
prompt_template_version=prompt_template_version,
prompt_template_variables=prompt_template_variables,
):
# Calls within this block will generate spans with the attributes:
# "session.id" = "my-session-id"
# "user.id" = "my-user-id"
# "metadata" = "{\"key-1\": value_1, \"key-2\": value_2, ... }" # JSON serialized
# "tag.tags" = "["tag_1","tag_2",...]"
# "llm.prompt_template.template" = "Please describe the weather forecast for {city} on {date}"
# "llm.prompt_template.variables" = "{\"city\": \"Johannesburg\", \"date\": \"July 11\"}" # JSON serialized
# "llm.prompt_template.version" = "v1.0"
...
The previous example is equivalent to the following, making using_attributes a convenient tool for more complex settings.
with (
using_session("my-session-id"),
using_user("my-user-id"),
using_metadata(metadata),
using_tags(tags),
using_prompt_template(
template=prompt_template,
version=prompt_template_version,
variables=prompt_template_variables,
),
):
# Calls within this block will generate spans with the attributes:
# "session.id" = "my-session-id"
# "user.id" = "my-user-id"
# "metadata" = "{\"key-1\": value_1, \"key-2\": value_2, ... }" # JSON serialized
# "tag.tags" = "["tag_1","tag_2",...]"
# "llm.prompt_template.template" = "Please describe the weather forecast for {city} on {date}"
# "llm.prompt_template.variables" = "{\"city\": \"Johannesburg\", \"date\": \"July 11\"}" # JSON serialized
# "llm.prompt_template.version" = "v1.0"
...
It can also be used as a decorator:
@using_attributes(
session_id="my-session-id",
user_id="my-user-id",
metadata=metadata,
tags=tags,
prompt_template=prompt_template,
prompt_template_version=prompt_template_version,
prompt_template_variables=prompt_template_variables,
)
def call_fn(*args, **kwargs):
# Calls within this function will generate spans with the attributes:
# "session.id" = "my-session-id"
# "user.id" = "my-user-id"
# "metadata" = "{\"key-1\": value_1, \"key-2\": value_2, ... }" # JSON serialized
# "tag.tags" = "["tag_1","tag_2",...]"
# "llm.prompt_template.template" = "Please describe the weather forecast for {city} on {date}"
# "llm.prompt_template.variables" = "{\"city\": \"Johannesburg\", \"date\": \"July 11\"}" # JSON serialized
# "llm.prompt_template.version" = "v1.0"
...
get_attributes_from_context
Our OpenInference core instrumentation package offers a convenience function, get_attributes_from_context, to read the context attributes set above from OTEL context.
In the following example, we assume the following are set in the OTEL context:
tags = ["tag_1", "tag_2"]
metadata = {
"key-1": 1,
"key-2": "2",
}
prompt_template = "Please describe the weather forecast for {city} on {date}"
prompt_template_variables = {"city": "Johannesburg", "date":"July 11"}
prompt_template_version = "v1.0"
We then use get_attributes_from_context to extract them from the OTEL context. You can use it in your manual instrumentation to attach these attributes to your spans.
from openinference.instrumentation import get_attributes_from_context
span.set_attributes(dict(get_attributes_from_context()))
# The span will then have the following attributes attached:
# {
# 'session.id': 'my-session-id',
# 'user.id': 'my-user-id',
# 'metadata': '{"key-1": 1, "key-2": "2"}',
# 'tag.tags': ['tag_1', 'tag_2'],
# 'llm.prompt_template.template': 'Please describe the weather forecast for {city} on {date}',
# 'llm.prompt_template.version': 'v1.0',
# 'llm.prompt_template.variables': '{"city": "Johannesburg", "date": "July 11"}'
# }
npm install --save @arizeai/openinference-core @opentelemetry/api
You can use any of the utilities below in conjunction with context.with to set attributes on the active context. OpenInference auto instrumentations will then pick up these attributes and add them to any spans created within the context.with callback.
setMetadata
We provide a setMetadata function which allows you to set a metadata attribute on context. Metadata attributes will be serialized to a JSON string when stored on context and will be propagated to spans in the same way.
import { context } from "@opentelemetry/api"
import { setMetadata } from "@arizeai/openinference-core"
context.with(
setMetadata(context.active(), { key1: "value1", key2: "value2" }),
() => {
// Calls within this block will generate spans with the attributes:
// "metadata" = '{"key1": "value1", "key2": "value2"}'
}
)
setTags
We provide a setTags function which allows you to set a list of string tags on context. Tags, like metadata, will be serialized to a JSON string when stored on context and will be propagated to spans in the same way.
import { context } from "@opentelemetry/api"
import { setTags } from "@arizeai/openinference-core"
context.with(
setTags(context.active(), ["value1", "value2"]),
() => {
// Calls within this block will generate spans with the attributes:
// "tag.tags" = '["value1", "value2"]'
}
)
setPromptTemplate
We provide a setPromptTemplate function which allows you to set a template, version, and variables on context. The components of a prompt template are:
template - a string with templated variables, e.g. "hello {{name}}"
variables - an object with variable names and their values, e.g. { name: "world" }
version - a string version of the template, e.g. v1.0
All of these are optional. Application of variables to a template will typically happen before the call to an LLM and may not be picked up by auto instrumentation. Adding it here ensures you can see the template and variables while troubleshooting.
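Applying variables to a template before the LLM call is usually plain string substitution. The sketch below uses a hypothetical applyTemplate helper (not part of the OpenInference API) to illustrate what the recorded template and variables let you reconstruct later:

```typescript
// Hypothetical helper (not part of the OpenInference API): substitute
// {{variable}} placeholders in a template with their values.
function applyTemplate(template: string, variables: Record<string, string>): string {
  // Replace each {{name}} with its value; unknown variables become "".
  return template.replace(/\{\{(\w+)\}\}/g, (_match, key: string) => variables[key] ?? "");
}

const prompt = applyTemplate("hello {{name}}", { name: "world" });
// prompt is "hello world"
```

Recording the template, version, and variables on context via setPromptTemplate lets you recover this substitution step while troubleshooting, even though auto instrumentation only sees the final prompt string.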
import { context } from "@opentelemetry/api"
import { setPromptTemplate } from "@arizeai/openinference-core"
context.with(
setPromptTemplate(
context.active(),
{
template: "hello {{name}}",
variables: { name: "world" },
version: "v1.0"
}
),
() => {
// Calls within this block will generate spans with the attributes:
// "llm.prompt_template.template" = "hello {{name}}"
// "llm.prompt_template.version" = "v1.0"
// "llm.prompt_template.variables" = '{ "name": "world" }'
}
)
setAttributes
We provide a setAttributes function which allows you to add a set of attributes to context. Attributes set on context using setAttributes must be valid span attribute values.
import { context } from "@opentelemetry/api"
import { setAttributes } from "@arizeai/openinference-core"
context.with(
setAttributes(context.active(), { myAttribute: "test" }),
() => {
// Calls within this block will generate spans with the attributes:
// "myAttribute" = "test"
}
)
You can also use multiple setters at the same time to propagate multiple attributes to the spans below. Since each setter function returns a new context, they can be used together as follows.
import { context } from "@opentelemetry/api"
import { setAttributes, setSession } from "@arizeai/openinference-core"
context.with(
setAttributes(
setSession(context.active(), { sessionId: "session-id"}),
{ myAttribute: "test" }
),
() => {
// Calls within this block will generate spans with the attributes:
// "myAttribute" = "test"
// "session.id" = "session-id"
}
)
You can also use setAttributes in conjunction with the OpenInference Semantic Conventions to set OpenInference attributes manually.
import { context } from "@opentelemetry/api"
import { setAttributes } from "@arizeai/openinference-core"
import { SemanticConventions } from "@arizeai/openinference-semantic-conventions";
context.with(
setAttributes(
context.active(),
{ [SemanticConventions.SESSION_ID]: "session-id" }
),
() => {
// Calls within this block will generate spans with the attributes:
// "session.id" = "session-id"
}
)
getAttributesFromContext
We also provide a utility function, getAttributesFromContext, that allows you to pull all of the attributes off of a context. You can then use this to set them on your spans.
import { getAttributesFromContext } from "@arizeai/openinference-core";
import { context, trace } from "@opentelemetry/api"
const contextAttributes = getAttributesFromContext(context.active())
const tracer = trace.getTracer("example")
const span = tracer.startSpan("example span")
span.setAttributes(contextAttributes)
span.end();
This allows you to propagate context attributes to any manually created spans.
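Conceptually, these context utilities behave like a stack of attribute maps that each new span copies from at creation time. The following is a minimal TypeScript sketch of that mechanism (not the real OpenTelemetry implementation; withAttributes, activeAttributes, and startSpan are hypothetical stand-ins):

```typescript
// Minimal sketch of context attribute propagation: a stack of
// attribute maps that spans copy from when they are created.
type Attributes = Record<string, string>;

const contextStack: Attributes[] = [{}];

// Analogous to context.with(setMetadata(...), fn): run fn with
// extra attributes merged into the active context.
function withAttributes<T>(attrs: Attributes, fn: () => T): T {
  contextStack.push({ ...contextStack[contextStack.length - 1], ...attrs });
  try {
    return fn();
  } finally {
    contextStack.pop(); // attributes never leak past fn
  }
}

// Analogous to getAttributesFromContext(context.active()).
function activeAttributes(): Attributes {
  return { ...contextStack[contextStack.length - 1] };
}

// Stand-in span: auto instrumentations copy the active context
// attributes onto each span at creation time.
function startSpan(name: string): { name: string; attributes: Attributes } {
  return { name, attributes: activeAttributes() };
}

const span = withAttributes({ "session.id": "session-id" }, () => startSpan("chat"));
// span.attributes now carries "session.id"
const outside = startSpan("other");
// outside.attributes does not: the context was restored on exit
```

This is why nested setters compose: each one pushes a merged copy, so inner spans see the union of all active attributes.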