OpenAI Node.js SDK

This module provides automatic instrumentation for the OpenAI Node.js SDK using OpenInference, with traces collected via OpenTelemetry and sent to Arize for observability.

This guide shows how to set up tracing for the OpenAI Node.js SDK and send the resulting data to Arize.

View the example project on GitHub for full context.

Launch Arize

To get started, sign up for a free Arize account. You'll need your Space ID and API Key to configure tracing.
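
You may want to keep these credentials in environment variables rather than hard-coding them. A minimal sketch (the variable names ARIZE_SPACE_ID and ARIZE_API_KEY are illustrative, not required by Arize):

// Read Arize credentials from the environment at startup.
const ARIZE_SPACE_ID = process.env.ARIZE_SPACE_ID ?? "";
const ARIZE_API_KEY = process.env.ARIZE_API_KEY ?? "";

if (!ARIZE_SPACE_ID || !ARIZE_API_KEY) {
  throw new Error("Set ARIZE_SPACE_ID and ARIZE_API_KEY before starting the app.");
}

These values can then be passed as the space_id and api_key headers on the OTLP exporter configured below.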

Install

Install the necessary OpenInference, OpenTelemetry, and OpenAI packages:

npm install --save \
  @arizeai/openinference-instrumentation-openai \
  @arizeai/openinference-semantic-conventions \
  openai \
  @opentelemetry/sdk-trace-node \
  @opentelemetry/instrumentation \
  @opentelemetry/exporter-trace-otlp-proto \
  @opentelemetry/resources \
  @opentelemetry/api \
  @opentelemetry/sdk-trace-base \
  @opentelemetry/semantic-conventions

Setup Tracing

Create an instrumentation file (e.g., instrumentation.ts or instrumentation.js) to configure OpenTelemetry and the Arize exporter. This setup will instrument the OpenAI SDK and send traces to your Arize account.

import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-proto";
import { resourceFromAttributes } from "@opentelemetry/resources";
import { SimpleSpanProcessor } from "@opentelemetry/sdk-trace-base";
import { NodeTracerProvider } from "@opentelemetry/sdk-trace-node";
import { ATTR_SERVICE_NAME } from "@opentelemetry/semantic-conventions";
import { SEMRESATTRS_PROJECT_NAME } from "@arizeai/openinference-semantic-conventions";
import { registerInstrumentations } from "@opentelemetry/instrumentation";
// OpenAI instrumentation
import OpenAI from "openai";
import { OpenAIInstrumentation } from "@arizeai/openinference-instrumentation-openai";

const COLLECTOR_ENDPOINT = "https://otlp.arize.com";
const SERVICE_NAME = "openai-app";

const provider = new NodeTracerProvider({
  resource: resourceFromAttributes({
    [ATTR_SERVICE_NAME]: SERVICE_NAME,
    [SEMRESATTRS_PROJECT_NAME]: SERVICE_NAME,
  }),
  spanProcessors: [
    new SimpleSpanProcessor(
      new OTLPTraceExporter({
        url: `${COLLECTOR_ENDPOINT}/v1/traces`,
        headers: {
          "space_id": "your-space-id-here",
          "api_key": "your-api-key-here",
        },
      })
    ),
  ],
});

provider.register();
console.log("Provider registered");

const instrumentation = new OpenAIInstrumentation();
instrumentation.manuallyInstrument(OpenAI);

registerInstrumentations({
  instrumentations: [instrumentation],
});

console.log("OpenAI instrumentation registered");

Important: You must import or require this instrumentation.ts (or .js) file at the very beginning of your application's entry point.
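
For production workloads, a BatchSpanProcessor is usually preferable to the SimpleSpanProcessor shown above, since it batches spans instead of exporting them one at a time. A minimal variation using the same exporter configuration (same placeholders as above):

import { BatchSpanProcessor } from "@opentelemetry/sdk-trace-base";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-proto";

// Batches spans before export; use this in the provider's spanProcessors array
// in place of the SimpleSpanProcessor above.
const batchProcessor = new BatchSpanProcessor(
  new OTLPTraceExporter({
    url: "https://otlp.arize.com/v1/traces",
    headers: {
      "space_id": "your-space-id-here",
      "api_key": "your-api-key-here",
    },
  })
);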

Run OpenAI

import "./instrumentation.js"; 
import OpenAI from "openai";

// Set OPENAI_API_KEY in your environment, or pass the key explicitly here.
const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY, // or replace with your OpenAI API key
});

openai.chat.completions
  .create({
    model: "gpt-4o",
    messages: [{ role: "user", content: "Write a haiku." }],
  })
  .then((response) => {
    console.log(response.choices[0].message.content);
  });
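
In a short-lived script like this, the process can exit before the exporter has finished sending spans. One way to guard against that is to flush and shut down the tracer provider before exiting. The sketch below assumes you add export const provider = ... to instrumentation.ts, which the file above does not do by default:

import { provider } from "./instrumentation.js"; // assumes `provider` is exported
import OpenAI from "openai";

const openai = new OpenAI();

async function main() {
  const response = await openai.chat.completions.create({
    model: "gpt-4o",
    messages: [{ role: "user", content: "Write a haiku." }],
  });
  console.log(response.choices[0].message.content);

  // Ensure all pending spans are exported before the process exits.
  await provider.forceFlush();
  await provider.shutdown();
}

main();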

Observe

After setting up instrumentation and running your OpenAI Node.js application, traces will be sent to your Arize account. You can then visualize and analyze your LLM interactions within Arize.

Support Notes

  • Versions of @arizeai/openinference-instrumentation-openai above 1.0.0 support both attribute masking and propagation of context attributes to spans; see the sketch after this list.

  • Ensure your Node.js version is compatible with the OpenTelemetry packages (and with any gRPC libraries, if used).
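
The snippet below is a minimal sketch of attribute masking and context attribute propagation. It assumes the traceConfig constructor option and the setSession helper from @arizeai/openinference-core (a package not installed in the steps above); check the package documentation for the exact option and helper names available in your version.

import { context } from "@opentelemetry/api";
import { setSession } from "@arizeai/openinference-core"; // assumed helper
import { OpenAIInstrumentation } from "@arizeai/openinference-instrumentation-openai";

// Attribute masking: redact prompt and completion content before spans are exported.
const maskedInstrumentation = new OpenAIInstrumentation({
  traceConfig: {
    hideInputs: true,  // mask input messages
    hideOutputs: true, // mask output messages
  },
});

// Context attribute propagation: spans created while this context is active
// pick up the session id as a span attribute.
async function runWithSession() {
  return context.with(
    setSession(context.active(), { sessionId: "session-123" }),
    async () => {
      // ...calls to the instrumented OpenAI client go here
    }
  );
}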
