Traceloop SDK

Traceloop SDK integration for sending observability data to Arize with OpenInference semantic conventions

Overview

Traceloop SDK is a high-level observability SDK for LLM applications that provides automatic instrumentation with minimal setup. This integration sends Traceloop traces to Arize: a span processor converts the OpenLLMetry spans that Traceloop emits into OpenInference semantic conventions, and an OTLP exporter ships them to Arize.

Integration Type

  • Tracing Integration

Prerequisites

  • Arize account with Space ID and API Key

  • Python 3.8 or higher

  • Traceloop SDK and OpenTelemetry packages

  • Target LLM provider credentials (e.g., OpenAI API key)
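
The examples below read these credentials from environment variables. If you keep them in a .env file loaded with python-dotenv, as the complete example does, it might look like this (the variable names match what the code reads; the values are placeholders):

SPACE_ID=your-arize-space-id
API_KEY=your-arize-api-key
OPENAI_API_KEY=your-openai-api-key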

Installation

Install the required packages:

pip install openinference-instrumentation-openllmetry opentelemetry-sdk traceloop-sdk arize-otel opentelemetry-exporter-otlp-proto-grpc

Basic Setup

import os
import grpc
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter
from arize.otel import register
from openinference.instrumentation.openllmetry import OpenInferenceSpanProcessor
from traceloop.sdk import Traceloop

# Set up Arize credentials
SPACE_ID = os.getenv("SPACE_ID")
API_KEY = os.getenv("API_KEY")

# Register Arize tracer provider
trace_provider = register(
    space_id=SPACE_ID,
    api_key=API_KEY,
    project_name="traceloop-integration",
    set_global_tracer_provider=True,
)

# Add the OpenLLMetry-to-OpenInference conversion processor
trace_provider.add_span_processor(OpenInferenceSpanProcessor())

# Create Arize exporter
arize_exporter = OTLPSpanExporter(
    endpoint="otlp.arize.com:443",
    headers={
        "authorization": f"Bearer {API_KEY}",
        "api_key": API_KEY,
        "arize-space-id": SPACE_ID,
        "arize-interface": "python",
        "user-agent": "arize-python",
    },
    compression=grpc.Compression.Gzip,
)

# Initialize Traceloop with Arize exporter
Traceloop.init(exporter=arize_exporter, disable_batch=True)
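
With Traceloop initialized, supported LLM clients such as the OpenAI SDK are instrumented automatically. Note that disable_batch=True exports each span as soon as it ends, which suits short scripts and notebooks; long-running services can omit it to rely on batched export. To group related calls under a single named trace, Traceloop also provides workflow decorators; a minimal sketch (the function and workflow names here are illustrative):

from openai import OpenAI
from traceloop.sdk.decorators import workflow

client = OpenAI()

@workflow(name="summarize")
def summarize(text: str) -> str:
    # The chat completion call below is traced automatically and
    # nested under the "summarize" workflow span.
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": f"Summarize: {text}"}],
    )
    return response.choices[0].message.content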

Complete Example

Here's a complete working example with OpenAI function calling:

import os
import json
import grpc
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter
from arize.otel import register
from openinference.instrumentation.openllmetry import OpenInferenceSpanProcessor
from traceloop.sdk import Traceloop
from openai import OpenAI
from dotenv import load_dotenv

load_dotenv()

# Configuration
SPACE_ID = os.getenv("SPACE_ID")
API_KEY = os.getenv("API_KEY")
OPENAI_API_KEY = os.getenv("OPENAI_API_KEY")

if __name__ == "__main__":
    # Setup Arize tracing
    trace_provider = register(
        space_id=SPACE_ID,
        api_key=API_KEY,
        project_name="traceloop-demo",
        set_global_tracer_provider=True,
    )

    # Add the OpenLLMetry-to-OpenInference conversion processor
    trace_provider.add_span_processor(OpenInferenceSpanProcessor())

    # Create Arize exporter
    arize_exporter = OTLPSpanExporter(
        endpoint="otlp.arize.com:443",
        headers={
            "authorization": f"Bearer {API_KEY}",
            "api_key": API_KEY,
            "arize-space-id": SPACE_ID,
            "arize-interface": "python",
            "user-agent": "arize-python",
        },
        compression=grpc.Compression.Gzip,
    )

    # Initialize Traceloop
    Traceloop.init(exporter=arize_exporter, disable_batch=True)

    # Initialize OpenAI client
    client = OpenAI(api_key=OPENAI_API_KEY)

    # Define function tools
    tools = [
        {
            "type": "function",
            "function": {
                "name": "get_weather",
                "description": "Returns today's weather conditions at a given city.",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "city": {"type": "string"},
                        "unit": {"type": "string", "enum": ["C", "F"]},
                    },
                    "required": ["city"],
                },
            },
        }
    ]

    # Make request with function calling
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "user", "content": "What is the weather in Paris today? Please use the tool."}
        ],
        tools=tools,
        tool_choice="auto",
        max_tokens=32,
    )

    print("Assistant response:", response.choices[0].message.content)

    # Handle function calls if present
    if response.choices[0].message.tool_calls:
        print("Function calls detected:")
        for call in response.choices[0].message.tool_calls:
            print(f"  - {call.function.name}: {call.function.arguments}")
