Together AI Tracing

Together AI provides low-cost inference for AI models in production. Because Together AI exposes an OpenAI-compatible API, Arize supports instrumenting Together AI API calls using our OpenAI instrumentation. You can create a free Together AI account and generate a Together AI API key to get started.

In this example we will instrument an LLM application built using Together AI and the OpenAI library.

Install dependencies

pip install openai openinference-instrumentation-openai arize-otel
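
The example application further down reads your Together AI API key from the TOGETHER_API_KEY environment variable; export it (or use your preferred secrets mechanism) before running the code:

export TOGETHER_API_KEY="your-together-api-key"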

Register for OpenTelemetry and activate the OpenAIInstrumentor

# Import open-telemetry dependencies
from arize.otel import register

# Setup OTel via our convenience function
tracer_provider = register(
    space_id="your-space-id",          # in app Space settings page
    api_key="your-api-key",            # in app Space settings page
    project_name="your-project-name",  # name this whatever you would like
)

# Import the automatic instrumentor from OpenInference
from openinference.instrumentation.openai import OpenAIInstrumentor

# Finish automatic instrumentation
OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)
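
Because the instrumentor patches the openai library itself, every chat completion made through the client below, including calls routed to Together AI's OpenAI-compatible endpoint, is captured as a span and exported to the project you registered above.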

Now ask your LLM app some questions and watch the traces being collected by Arize.

import os
import openai

# Point the OpenAI client at Together AI's OpenAI-compatible endpoint
client = openai.OpenAI(
    api_key=os.environ.get("TOGETHER_API_KEY"),
    base_url="https://api.together.xyz/v1",
)

response = client.chat.completions.create(
    model="meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo",
    messages=[
        {"role": "system", "content": "You are a travel agent. Be descriptive and helpful."},
        {"role": "user", "content": "Tell me the top 3 things to do in San Francisco"},
    ],
)

print(response.choices[0].message.content)
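
The instrumentor also captures streaming completions. Below is a minimal sketch that reuses the client above; the prompt and the stream-handling loop are illustrative, not part of the original example.

# Streaming request through the same instrumented client
stream = client.chat.completions.create(
    model="meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo",
    messages=[
        {"role": "user", "content": "Suggest a one-day itinerary for San Francisco"},
    ],
    stream=True,
)

# Print tokens as they arrive
for chunk in stream:
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)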
