

OpenRouter is a multi-provider LLM gateway: one API key gives you access to 100+ models from OpenAI, Anthropic, Google, Meta, Mistral, and more. Its endpoint at https://openrouter.ai/api/v1 mirrors OpenAI's schema, so any OpenAI client works once its base_url points at OpenRouter. Arize AX captures every OpenRouter call via the openinference-instrumentation-openai package, the same instrumentor that covers OpenAI's hosted API.

Prerequisites

Launch Arize

  1. Sign in to your Arize AX account.
  2. From Space Settings, copy your Space ID and API Key. You will set them as ARIZE_SPACE_ID and ARIZE_API_KEY below.

Install

pip install arize-otel openinference-instrumentation-openai openai

Configure credentials

export ARIZE_SPACE_ID="<your-space-id>"
export ARIZE_API_KEY="<your-api-key>"
export ARIZE_PROJECT_NAME="openrouter-tracing-example"
export OPENROUTER_API_KEY="<your-openrouter-api-key>"
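A missing or empty variable is the most common cause of silent failures later, so a small pre-flight check can save a debugging round trip. The helper below is illustrative only (the name missing_env is not part of the Arize or OpenRouter APIs); it just verifies the four variables exported above are present:

```python
import os

# The variable names match the exports above; the helper itself is an
# illustrative pre-flight check, not part of any library's API.
REQUIRED_VARS = (
    "ARIZE_SPACE_ID",
    "ARIZE_API_KEY",
    "ARIZE_PROJECT_NAME",
    "OPENROUTER_API_KEY",
)

def missing_env(names=REQUIRED_VARS):
    """Return the variables from `names` that are unset or empty."""
    return [name for name in names if not os.environ.get(name)]
```

Calling missing_env() at the top of your entry point and aborting if the list is non-empty gives a clearer error than a 401 deep inside a request.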

Set up tracing

# instrumentation.py
import os

from arize.otel import register
from openinference.instrumentation.openai import OpenAIInstrumentor

tracer_provider = register(
    space_id=os.environ["ARIZE_SPACE_ID"],
    api_key=os.environ["ARIZE_API_KEY"],
    project_name=os.environ["ARIZE_PROJECT_NAME"],
)

OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)
print("Arize AX tracing initialized for OpenRouter.")

Run OpenRouter

# example.py

# Importing instrumentation first ensures tracing is set up
# before `openai` is imported.
from instrumentation import tracer_provider

import os

from openai import OpenAI

# Point the OpenAI client at OpenRouter's OpenAI-compatible endpoint.
client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key=os.environ["OPENROUTER_API_KEY"],
)

response = client.chat.completions.create(
    model="openai/gpt-5-mini",
    messages=[
        {
            "role": "user",
            "content": "Why is the ocean salty? Answer in two sentences.",
        },
    ],
)

print(response.choices[0].message.content)

Expected output

Arize AX tracing initialized for OpenRouter.
The ocean is salty because rivers continuously dissolve mineral salts from rocks and soil and carry them to the sea, where they accumulate over millions of years. Water leaves the ocean through evaporation but the salts remain, steadily concentrating until reaching today's roughly 3.5% salinity.
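Optionally, OpenRouter reads two attribution headers (HTTP-Referer and X-Title) that identify your app in its dashboard, and the OpenAI client accepts arbitrary extra headers via its default_headers parameter. A sketch, where the URL and title values are placeholders you would replace:

```python
# Optional OpenRouter attribution headers. The values below are placeholders;
# substitute your own app's URL and display name.
attribution_headers = {
    "HTTP-Referer": "https://example.com/my-app",  # your app's URL
    "X-Title": "My Traced App",                    # your app's display name
}

# Pass them when constructing the client, e.g.:
# client = OpenAI(
#     base_url="https://openrouter.ai/api/v1",
#     api_key=os.environ["OPENROUTER_API_KEY"],
#     default_headers=attribution_headers,
# )
```

These headers do not affect tracing; spans are captured identically with or without them.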

Verify in Arize

  1. Open your Arize AX space and select project openrouter-tracing-example.
  2. You should see a new trace within ~30 seconds containing a ChatCompletion LLM span with the prompt, response, and token usage attached. The model name on the span will be the OpenRouter model identifier you used (e.g. openai/gpt-5-mini).
  3. If no traces appear, see Troubleshooting.

Troubleshooting

  • No traces in Arize. Confirm ARIZE_SPACE_ID and ARIZE_API_KEY are set in the same shell that runs example.py. Short-lived scripts can also exit before queued spans are exported; calling tracer_provider.force_flush() at the end of example.py rules that out. Enable OpenTelemetry debug logs with export OTEL_LOG_LEVEL=debug and re-run.
  • OpenRouter spans missing but other spans present. OpenAIInstrumentor().instrument(...) must run before any import openai. Make sure instrumentation.py is the first import in your entry point.
  • 401 from OpenRouter. Use your OpenRouter API key (from the OpenRouter dashboard), not your OpenAI key. They are different services with different credentials.
  • Model not found. OpenRouter expects a <provider>/<model> identifier (e.g. openai/gpt-5-mini, anthropic/claude-sonnet-4-5, meta-llama/llama-3.3-70b-instruct). See the OpenRouter model list for current names.
  • Free models unavailable. OpenRouter’s :free tier models have aggressive rate limits and rotate availability. If a :free variant 429s, swap for a paid alternative.
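To catch a malformed model identifier before sending a request, a simple shape check helps. The function and pattern below are an illustrative approximation of the <provider>/<model> convention described above (including the optional :free-style variant suffix), not an official OpenRouter validator:

```python
import re

# Rough shape of an OpenRouter model id: "<provider>/<model>" with an
# optional ":variant" suffix such as ":free". Illustrative only; consult
# the OpenRouter model catalog for authoritative names.
MODEL_ID = re.compile(r"^[a-z0-9-]+/[A-Za-z0-9._-]+(:[a-z-]+)?$")

def looks_like_openrouter_id(model: str) -> bool:
    """Return True if `model` matches the <provider>/<model> shape."""
    return bool(MODEL_ID.match(model))
```

A bare model name like "gpt-5-mini" fails this check, which is exactly the mistake behind most "model not found" errors.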
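One way to handle a rate-limited :free model is to fall back through a list of alternatives. A minimal sketch, assuming create_fn stands in for client.chat.completions.create and raises on a 429 (the helper name complete_with_fallback is hypothetical):

```python
def complete_with_fallback(create_fn, models, **kwargs):
    """Call `create_fn(model=..., **kwargs)` for each model in order and
    return the first successful response. Any exception (e.g. a 429 rate
    limit on a :free variant) moves on to the next model; if every model
    fails, the last error is re-raised."""
    last_err = None
    for model in models:
        try:
            return create_fn(model=model, **kwargs)
        except Exception as err:
            last_err = err
    raise last_err
```

Because each attempt still goes through the instrumented client, failed and successful calls alike show up as spans in Arize AX.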

Resources

OpenRouter Documentation

OpenInference OpenAI Instrumentor (used for OpenRouter)

OpenRouter Model Catalog