OpenLLMetry
OpenLLMetry integration for sending observability data to Arize with OpenInference semantic conventions
Overview
OpenLLMetry is an open-source observability package for LLM applications that provides automatic instrumentation for popular LLM frameworks and providers. This integration enables you to send OpenLLMetry traces to Arize using OpenInference semantic conventions.
Integration Type
Tracing Integration
Key Features
Automatic instrumentation for 20+ LLM providers and frameworks
Seamless conversion to OpenInference semantic conventions
Real-time trace collection and analysis in Arize
Support for complex LLM workflows and chains
Prerequisites
Arize account with Space ID and API Key
Python 3.8 or higher
OpenLLMetry and OpenTelemetry packages
Target LLM provider credentials (e.g., OpenAI API key)
Installation
Install the required packages:
pip install openllmetry arize-otel opentelemetry-api opentelemetry-sdk arize-toolkit
pip install opentelemetry-exporter-otlp-proto-grpc
For specific LLM providers, install their respective instrumentations:
# For OpenAI
pip install opentelemetry-instrumentation-openai
# For other providers, see OpenLLMetry documentation
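For example, Anthropic tracing (used in the Multi-Provider Setup section below) has its own instrumentation package, following the opentelemetry-instrumentation-&lt;provider&gt; naming pattern:
# For Anthropic
pip install opentelemetry-instrumentation-anthropic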
Quick Start
Download the OpenInference Span Processor
Download the script that converts Traceloop spans to OpenInference format:
Mac/Linux:
curl -O https://gist.githubusercontent.com/PriyanJindal/16576401bdd3b8caa872b27c6e97eef0/raw/4297acfd34318b351ccd5c8bb4e6519cbe414b7d/map_openll_to_openinference.py
Windows (PowerShell):
Invoke-WebRequest -Uri https://gist.githubusercontent.com/PriyanJindal/16576401bdd3b8caa872b27c6e97eef0/raw/4297acfd34318b351ccd5c8bb4e6519cbe414b7d/map_openll_to_openinference.py -OutFile map_openll_to_openinference.py
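To verify the download, you can check that the processor class imports cleanly from your working directory:
python -c "from map_openll_to_openinference import OpenLLMetryToOpenInferenceSpanProcessor"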
Basic Setup
import os
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter
from map_openll_to_openinference import OpenLLMetryToOpenInferenceSpanProcessor
from arize.otel import register
import grpc
# Set up Arize credentials
SPACE_ID = os.getenv("SPACE_ID")
API_KEY = os.getenv("API_KEY")
# Register Arize tracer provider
provider = register(
    space_id=SPACE_ID,
    api_key=API_KEY,
    project_name="openllmetry-integration",
    set_global_tracer_provider=True,
)

# Custom span processor for OpenLLMetry to OpenInference conversion
provider.add_span_processor(OpenLLMetryToOpenInferenceSpanProcessor())

# Add OTLP exporter for Arize
provider.add_span_processor(
    BatchSpanProcessor(
        OTLPSpanExporter(
            endpoint="otlp.arize.com:443",
            headers={
                "authorization": f"Bearer {API_KEY}",
                "api_key": API_KEY,
                "arize-space-id": SPACE_ID,
                "arize-interface": "python",
                "user-agent": "arize-python",
            },
            compression=grpc.Compression.Gzip,
        )
    )
)
The OpenLLMetryToOpenInferenceSpanProcessor is a custom span processor that maps OpenLLMetry trace attributes to OpenInference semantic conventions.
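For intuition, the conversion amounts to renaming span attributes before they are exported. The sketch below shows the idea only, not the downloaded implementation; the OpenLLMetry key names and the use of the span's internal _attributes mapping are assumptions:
from opentelemetry.sdk.trace import ReadableSpan, SpanProcessor

class AttributeRenamingSketch(SpanProcessor):
    # Hypothetical OpenLLMetry -> OpenInference key pairs; real keys vary by version
    KEY_MAP = {
        "gen_ai.request.model": "llm.model_name",
        "gen_ai.usage.prompt_tokens": "llm.token_count.prompt",
        "gen_ai.usage.completion_tokens": "llm.token_count.completion",
    }

    def on_end(self, span: ReadableSpan) -> None:
        attrs = getattr(span, "_attributes", None)  # SDK-internal mutable mapping
        if not attrs:
            return
        for src, dst in self.KEY_MAP.items():
            if src in attrs:
                attrs[dst] = attrs[src]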
Complete Example
Here's a complete working example with OpenAI:
import os

import grpc
import openai
from dotenv import load_dotenv
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter
from opentelemetry.instrumentation.openai import OpenAIInstrumentor
from arize.otel import register
from map_openll_to_openinference import OpenLLMetryToOpenInferenceSpanProcessor

# Load environment variables
load_dotenv()
SPACE_ID = os.getenv("SPACE_ID")
API_KEY = os.getenv("API_KEY")
OPENAI_API_KEY = os.getenv("OPENAI_API_KEY")

if __name__ == "__main__":
    # Register Arize tracer provider
    provider = register(
        space_id=SPACE_ID,
        api_key=API_KEY,
        project_name="tracing-haiku-tutorial",
        set_global_tracer_provider=True,
    )

    # Add OpenLLMetry to OpenInference conversion processor
    provider.add_span_processor(OpenLLMetryToOpenInferenceSpanProcessor())

    # Add OTLP exporter to send traces to Arize
    provider.add_span_processor(
        BatchSpanProcessor(
            OTLPSpanExporter(
                endpoint="otlp.arize.com:443",
                headers={
                    "authorization": f"Bearer {API_KEY}",
                    "api_key": API_KEY,
                    "arize-space-id": SPACE_ID,
                    "arize-interface": "python",
                    "user-agent": "arize-python",
                },
                compression=grpc.Compression.Gzip,
            )
        )
    )

    # Instrument OpenAI
    OpenAIInstrumentor().instrument(tracer_provider=provider)
    openai_client = openai.OpenAI(api_key=OPENAI_API_KEY)

    # Make a test request
    response = openai_client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": "Write a haiku."}],
        max_tokens=20,
    )
    print("\nAssistant:\n", response.choices[0].message.content)
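To run the example, save the script (for instance as main.py, a hypothetical filename) next to the downloaded map_openll_to_openinference.py; python-dotenv and openai are required by the example's imports:
pip install python-dotenv openai
python main.py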

Configuration Options
Environment Variables
Set up your environment variables for seamless configuration:
# .env file
SPACE_ID=your-arize-space-id
API_KEY=your-arize-api-key
OPENAI_API_KEY=your-openai-api-key
Supported LLM Providers
OpenLLMetry supports automatic instrumentation for:
LLM Providers: OpenAI, Anthropic, Azure OpenAI, Cohere, Replicate, Hugging Face, and more
Vector Databases: Pinecone, ChromaDB, Weaviate, Qdrant
Frameworks: LangChain, LlamaIndex, Haystack, CrewAI
Databases: Redis, SQL databases
For a complete list, see the OpenLLMetry documentation.
OpenInference Semantic Conventions
When traces are processed through the OpenInference converter, the following attributes are standardized:
Input/Output Attributes
input.mime_type: Set to "application/json"
input.value: JSON string of the prompt and parameters
output.value: LLM response content
output.mime_type: Response content type
LLM-Specific Attributes
llm.model_name: The model identifier
llm.provider: The LLM provider name
llm.token_count.prompt: Input token count
llm.token_count.completion: Output token count
openinference.span.kind: Set to "LLM"
Message Attributes
llm.input_messages: Array of input messages
llm.output_messages: Array of output messages
Message roles: system, user, assistant, function
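As a concrete illustration, a converted chat-completion span might carry attributes along these lines; the values are hypothetical, and message lists are flattened into indexed keys per the OpenInference conventions:
# Hypothetical attributes on a converted span, shown as a Python dict
{
    "openinference.span.kind": "LLM",
    "llm.model_name": "gpt-3.5-turbo",
    "llm.provider": "openai",
    "llm.token_count.prompt": 12,
    "llm.token_count.completion": 17,
    "llm.input_messages.0.message.role": "user",
    "llm.input_messages.0.message.content": "Write a haiku.",
    "llm.output_messages.0.message.role": "assistant",
}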
Troubleshooting
Common Issues
Missing Traces
If traces aren't appearing in Arize:
Verify your Space ID and API Key are correct
Check network connectivity to otlp.arize.com:443 (see the connectivity check after this list)
Ensure the OpenInference converter is properly configured
Enable debug logging to see trace export attempts
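A standalone connectivity check using the grpc package the exporter already depends on:
import grpc

# Open a TLS channel to the Arize OTLP endpoint and wait for it to become
# ready; .result() raises if the endpoint is unreachable within the timeout
channel = grpc.secure_channel("otlp.arize.com:443", grpc.ssl_channel_credentials())
grpc.channel_ready_future(channel).result(timeout=10)
print("otlp.arize.com:443 is reachable")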
Incorrect Span Format
If spans appear malformed:
Verify the OpenLLMetryToOpenInferenceSpanProcessor is added before the OTLP exporter
Check that all required OpenInference attributes are present
Validate the span processor order in your configuration (see the ordering sketch below)
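In code, the required order looks like this (exporter stands for the OTLPSpanExporter configured earlier):
# 1. Converter first: rewrites attributes as each span ends
provider.add_span_processor(OpenLLMetryToOpenInferenceSpanProcessor())
# 2. Exporter second: batches and ships the already-converted spans
provider.add_span_processor(BatchSpanProcessor(exporter))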
Performance Issues
For high-latency applications:
Use asynchronous span processing
Adjust batch size and timeout settings (see the tuning sketch below)
Consider sampling strategies for high-volume scenarios
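Batching is tuned through BatchSpanProcessor's constructor, and the SDK ships a TraceIdRatioBased sampler for keeping a fixed fraction of traces. A sketch with illustrative values (exporter is the OTLPSpanExporter configured earlier; whether arize.otel.register accepts a custom sampler should be checked against its documentation):
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.sdk.trace.sampling import TraceIdRatioBased

provider.add_span_processor(
    BatchSpanProcessor(
        exporter,
        max_queue_size=4096,         # spans buffered before new ones are dropped
        schedule_delay_millis=2000,  # flush interval in milliseconds
        max_export_batch_size=512,   # spans per export request
    )
)

# TraceIdRatioBased(0.1) keeps roughly 10% of traces; samplers are applied
# when the TracerProvider is constructed
sampler = TraceIdRatioBased(0.1)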
Debug Mode
Enable debug logging to troubleshoot issues:
import logging

# The OpenTelemetry SDK logs through Python's standard logging module, so
# this surfaces exporter activity and export failures
logging.basicConfig(level=logging.DEBUG)
logging.getLogger("opentelemetry").setLevel(logging.DEBUG)
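To confirm spans are being produced at all, you can also echo them to stdout (a local debugging aid; remove it in production):
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor

# Print every finished span locally, alongside the Arize export
provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))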
Multi-Provider Setup
Configure multiple LLM providers simultaneously:
from opentelemetry.instrumentation.anthropic import AnthropicInstrumentor
from opentelemetry.instrumentation.openai import OpenAIInstrumentor
# Instrument multiple providers
OpenAIInstrumentor().instrument(tracer_provider=provider)
AnthropicInstrumentor().instrument(tracer_provider=provider)
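Once instrumented, calls through either client flow to Arize via the same tracer provider. A brief usage sketch (the Anthropic model name is an illustrative assumption):
import anthropic
import openai

# Each instrumented SDK emits spans through the shared provider
openai.OpenAI().chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello"}],
)
anthropic.Anthropic().messages.create(
    model="claude-3-haiku-20240307",  # example model name
    max_tokens=20,
    messages=[{"role": "user", "content": "Hello"}],
)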