Traceloop SDK
Traceloop SDK integration for sending observability data to Arize with OpenInference semantic conventions
Overview
Traceloop SDK is a high-level observability SDK for LLM applications that provides automatic instrumentation with minimal setup. This integration enables you to send Traceloop traces to Arize using OpenInference semantic conventions through a simplified SDK approach.
Integration Type
Tracing Integration
Key Features
One-line initialization with Traceloop.init() (see the snippet after this list)
Automatic instrumentation for 20+ LLM providers and frameworks
Seamless conversion to OpenInference semantic conventions
Real-time trace collection and analysis in Arize
Support for complex LLM workflows and function calling
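For orientation, here is a minimal sketch of that one-line setup, assuming the SDK's default backend and a placeholder app_name; the Arize-specific exporter wiring is covered under Basic Setup below:
from traceloop.sdk import Traceloop

# Default initialization: auto-instruments supported LLM libraries and
# frameworks. "my-app" is a placeholder application name.
Traceloop.init(app_name="my-app")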
Prerequisites
Arize account with Space ID and API Key
Python 3.8 or higher
Traceloop SDK and OpenTelemetry packages
Target LLM provider credentials (e.g., OpenAI API key)
Installation
Install the required packages:
pip install traceloop-sdk arize-otel opentelemetry-exporter-otlp-proto-grpc arize-toolkit
For specific LLM providers, ensure you have their respective packages:
# For OpenAI
pip install openai
# For other providers, see Traceloop documentation
Quick Start
Download the OpenInference Span Processor
Download the script that converts Traceloop spans to OpenInference format:
Mac/Linux:
curl -O https://gist.githubusercontent.com/PriyanJindal/16576401bdd3b8caa872b27c6e97eef0/raw/4297acfd34318b351ccd5c8bb4e6519cbe414b7d/map_openll_to_openinference.py
Windows (PowerShell):
Invoke-WebRequest -Uri https://gist.githubusercontent.com/PriyanJindal/16576401bdd3b8caa872b27c6e97eef0/raw/4297acfd34318b351ccd5c8bb4e6519cbe414b7d/map_openll_to_openinference.py -OutFile map_openll_to_openinference.py
Basic Setup
import os
import grpc
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter
from arize.otel import register
from map_openll_to_openinference import OpenLLMetryToOpenInferenceSpanProcessor
from traceloop.sdk import Traceloop

# Set up Arize credentials
SPACE_ID = os.getenv("SPACE_ID")
API_KEY = os.getenv("API_KEY")

# Register the Arize tracer provider
trace_provider = register(
    space_id=SPACE_ID,
    api_key=API_KEY,
    project_name="traceloop-integration",
    set_global_tracer_provider=True,
)

# Add the OpenLLMetry-to-OpenInference conversion processor
trace_provider.add_span_processor(OpenLLMetryToOpenInferenceSpanProcessor())

# Create the Arize exporter
arize_exporter = OTLPSpanExporter(
    endpoint="otlp.arize.com:443",
    headers={
        "authorization": f"Bearer {API_KEY}",
        "api_key": API_KEY,
        "arize-space-id": SPACE_ID,
        "arize-interface": "python",
        "user-agent": "arize-python",
    },
    compression=grpc.Compression.Gzip,
)

# Initialize Traceloop with the Arize exporter
Traceloop.init(exporter=arize_exporter, disable_batch=True)
The OpenLLMetryToOpenInferenceSpanProcessor is a custom span processor that maps Traceloop trace attributes to OpenInference semantic conventions.
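Under the hood, such a processor is an ordinary OpenTelemetry SpanProcessor that rewrites span attributes before they reach the exporter. The following is a minimal sketch of that shape only; the ATTRIBUTE_MAP entries are an illustrative subset, not the actual mapping table in map_openll_to_openinference.py:
from opentelemetry.sdk.trace import SpanProcessor

class SketchedConversionProcessor(SpanProcessor):
    # Illustrative subset of an OpenLLMetry -> OpenInference mapping;
    # the real, complete table lives in the downloaded script.
    ATTRIBUTE_MAP = {
        "gen_ai.request.model": "llm.model_name",
        "gen_ai.usage.prompt_tokens": "llm.token_count.prompt",
        "gen_ai.usage.completion_tokens": "llm.token_count.completion",
    }

    def on_start(self, span, parent_context=None):
        pass  # nothing to do when a span starts

    def on_end(self, span):
        # Conversion processors typically rewrite the span's underlying
        # attribute dict just before the exporter serializes it.
        attrs = getattr(span, "_attributes", None)
        if not attrs:
            return
        for source, target in self.ATTRIBUTE_MAP.items():
            if source in attrs and target not in attrs:
                attrs[target] = attrs[source]

    def shutdown(self):
        pass

    def force_flush(self, timeout_millis=30000):
        return True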
Complete Example
Here's a complete working example with OpenAI function calling:
import os
import grpc
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter
from arize.otel import register
from map_openll_to_openinference import OpenLLMetryToOpenInferenceSpanProcessor
from traceloop.sdk import Traceloop
from openai import OpenAI
from dotenv import load_dotenv

load_dotenv()

# Configuration
SPACE_ID = os.getenv("SPACE_ID")
API_KEY = os.getenv("API_KEY")
OPENAI_API_KEY = os.getenv("OPENAI_API_KEY")

if __name__ == "__main__":
    # Set up Arize tracing
    trace_provider = register(
        space_id=SPACE_ID,
        api_key=API_KEY,
        project_name="traceloop-demo",
        set_global_tracer_provider=True,
    )

    # Add the OpenInference conversion processor
    trace_provider.add_span_processor(OpenLLMetryToOpenInferenceSpanProcessor())

    # Create the Arize exporter
    arize_exporter = OTLPSpanExporter(
        endpoint="otlp.arize.com:443",
        headers={
            "authorization": f"Bearer {API_KEY}",
            "api_key": API_KEY,
            "arize-space-id": SPACE_ID,
            "arize-interface": "python",
            "user-agent": "arize-python",
        },
        compression=grpc.Compression.Gzip,
    )

    # Initialize Traceloop
    Traceloop.init(exporter=arize_exporter, disable_batch=True)

    # Initialize the OpenAI client
    client = OpenAI(api_key=OPENAI_API_KEY)

    # Define function tools
    tools = [
        {
            "type": "function",
            "function": {
                "name": "get_weather",
                "description": "Returns today's weather conditions at a given city.",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "city": {"type": "string"},
                        "unit": {"type": "string", "enum": ["C", "F"]},
                    },
                    "required": ["city"],
                },
            },
        }
    ]

    # Make a request with function calling
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "user", "content": "What is the weather in Paris today? Please use the tool."}
        ],
        tools=tools,
        tool_choice="auto",
        max_tokens=32,
    )

    print("Assistant response:", response.choices[0].message.content)

    # Handle function calls if present
    if response.choices[0].message.tool_calls:
        print("Function calls detected:")
        for call in response.choices[0].message.tool_calls:
            print(f"  - {call.function.name}: {call.function.arguments}")
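When the script runs, the chat completion span, including the get_weather tool-call request, is exported to the traceloop-demo project in Arize, where it appears with the OpenInference attributes described below.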
Configuration Options
Environment Variables
Set up your environment variables for seamless configuration:
# .env file
SPACE_ID=your-arize-space-id
API_KEY=your-arize-api-key
OPENAI_API_KEY=your-openai-api-key
Supported LLM Providers
Traceloop SDK supports automatic instrumentation for:
LLM Providers: OpenAI, Anthropic, Azure OpenAI, Cohere, Replicate, Hugging Face, and more
Vector Databases: Pinecone, ChromaDB, Weaviate, Qdrant
Frameworks: LangChain, LlamaIndex, Haystack, CrewAI
Databases: Redis, SQL databases
For a complete list, see the Traceloop documentation.
OpenInference Semantic Conventions
When traces are processed through the OpenInference converter, the following attributes are standardized (a local verification sketch follows these lists):
Input/Output Attributes
input.mime_type: Set to "application/json"
input.value: JSON string of the prompt and parameters
output.value: LLM response content
output.mime_type: Response content type
LLM-Specific Attributes
llm.model_name: The model identifier
llm.provider: The LLM provider name
llm.token_count.prompt: Input token count
llm.token_count.completion: Output token count
openinference.span.kind: Set to "LLM"
Message Attributes
llm.input_messages: Array of input messages
llm.output_messages: Array of output messages
Message roles: system, user, assistant, function
Function Call Attributes
llm.input_messages.*.tool_calls: Function call requests
llm.output_messages.*.tool_calls: Function call responses
Function schemas and execution results
Troubleshooting
Common Issues
Missing Traces
If traces aren't appearing in Arize:
Verify your Space ID and API Key are correct
Check network connectivity to otlp.arize.com:443
Ensure the OpenInference converter is properly configured
Verify Traceloop initialization completed successfully (a local isolation check is sketched below)
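One quick way to isolate the failure is to print spans locally with the SDK's ConsoleSpanExporter, as in this sketch: if spans show up on stdout but never in Arize, the problem is credentials or connectivity rather than instrumentation.
from opentelemetry.sdk.trace.export import ConsoleSpanExporter
from traceloop.sdk import Traceloop

# Print spans to stdout instead of exporting them to Arize.
Traceloop.init(exporter=ConsoleSpanExporter(), disable_batch=True)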
Incorrect Span Format
If spans appear malformed:
Verify the OpenLLMetryToOpenInferenceSpanProcessor is added before Traceloop.init()
Check that all required OpenInference attributes are present
Validate the span processor order in your configuration
Function Calls Not Traced
If function calls aren't being traced:
Ensure you're using supported function calling patterns
Verify the tool definitions are properly formatted
Check that the model supports function calling
Debug Mode
Enable debug logging to troubleshoot issues:
import logging
logging.basicConfig(level=logging.DEBUG)
# Initialize Traceloop with logging enabled
Traceloop.init(exporter=arize_exporter, disable_logging=False)