Phoenix’s TypeScript SDK is modular by design, allowing you to install only what you need. Each package serves a specific purpose and can be used independently or together.

Installation

Install all packages together or individually based on your needs:
# Install everything
npm install @arizeai/phoenix-client @arizeai/phoenix-otel @arizeai/phoenix-evals

# Or install individually
npm install @arizeai/phoenix-client   # REST API client
npm install @arizeai/phoenix-otel     # Tracing
npm install @arizeai/phoenix-evals    # Evaluations
npm install @arizeai/openinference-core  # Instrumentation core

Environment Variables

All packages read a shared set of Phoenix environment variables, so a single configuration works across the SDK:
Variable                      Description                   Used By
PHOENIX_COLLECTOR_ENDPOINT    Trace collector URL           OTEL
PHOENIX_HOST                  Phoenix server URL            Client
PHOENIX_API_KEY               API key for authentication    Client, OTEL, MCP
PHOENIX_CLIENT_HEADERS        Custom HTTP headers (JSON)    Client
PHOENIX_BASE_URL              Phoenix base URL              MCP
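
As a rough sketch, these are ordinary process environment variables: setting them before your application starts is usually all that is needed, and options passed explicitly in code (see the examples below) typically take precedence. The values here are placeholders for a local Phoenix instance:

// Minimal sketch: the SDK packages read these values from process.env.
// All values below are placeholders; adjust them for your deployment.
process.env.PHOENIX_HOST = "http://localhost:6006";
process.env.PHOENIX_COLLECTOR_ENDPOINT = "http://localhost:6006";
process.env.PHOENIX_API_KEY = "your-api-key";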

@arizeai/phoenix-client

TypeScript client for the Phoenix REST API. Manage prompts, datasets, and experiments, and access every Phoenix endpoint with full TypeScript auto-completion.
  • Prompts — Create, version, and retrieve prompt templates with SDK helpers for OpenAI, Anthropic, and Vercel AI
  • Datasets — Create and manage datasets for experiments and evaluation
  • Experiments — Run evaluations and track experiment results with automatic tracing
  • REST API — Full access to all Phoenix endpoints with strongly-typed requests
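
A minimal usage sketch, assuming the package's createClient entry point and a locally running Phoenix; the option shape and endpoint path below may differ across versions:

import { createClient } from "@arizeai/phoenix-client";

// Falls back to PHOENIX_HOST and PHOENIX_API_KEY when no options are passed.
const phoenix = createClient({
  options: { baseUrl: "http://localhost:6006" }, // placeholder: local Phoenix
});

// Typed access to the REST API (openapi-fetch style paths).
async function listDatasets() {
  const { data, error } = await phoenix.GET("/v1/datasets");
  if (error) throw error;
  return data;
}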

@arizeai/phoenix-otel

OpenTelemetry wrapper for sending traces to Phoenix. Simplifies setup with automatic configuration and support for instrumenting Node.js applications.
  • Simple setup — Single register() call to configure tracing
  • Phoenix-aware defaults — Reads PHOENIX_COLLECTOR_ENDPOINT, PHOENIX_API_KEY, and other environment variables
  • Production ready — Built-in batch processing and authentication support
  • Auto-instrumentation — Support for HTTP, Express, and other OpenTelemetry instrumentations
  • Manual tracing — Create custom spans using the OpenTelemetry API
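
A minimal setup sketch: register() picks up the endpoint and API key from the environment variables listed earlier; the projectName option is an assumption and may be named differently in your version:

import { register } from "@arizeai/phoenix-otel";

// Configures an OpenTelemetry tracer provider pointed at Phoenix.
// PHOENIX_COLLECTOR_ENDPOINT and PHOENIX_API_KEY are read from the environment.
const tracerProvider = register({
  projectName: "my-node-app", // assumption: check the package README for exact option names
});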

@arizeai/phoenix-evals

TypeScript evaluation library for LLM applications. Create custom evaluators or use pre-built ones for hallucination detection, relevance scoring, and other evaluation tasks.
  • Vendor agnostic — Works with any AI SDK provider (OpenAI, Anthropic, etc.)
  • Pre-built evaluators — Hallucination detection, relevance scoring, and more
  • Custom classifiers — Create your own evaluators with custom prompts
  • Experiment integration — Works seamlessly with @arizeai/phoenix-client for experiments
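
A sketch of a custom classifier built on a Vercel AI SDK model; the import path, the createClassifier name, and the option shape are assumptions rather than a confirmed API:

import { openai } from "@ai-sdk/openai";
// Assumption: the factory name and subpath may differ in your version.
import { createClassifier } from "@arizeai/phoenix-evals/llm";

const relevanceEvaluator = createClassifier({
  model: openai("gpt-4o-mini"), // any AI SDK provider works
  promptTemplate:
    "Is the answer relevant to the question?\n" +
    "Question: {{input}}\nAnswer: {{output}}\n" +
    "Respond with 'relevant' or 'irrelevant'.",
  choices: { relevant: 1.0, irrelevant: 0.0 },
});

const result = await relevanceEvaluator({
  input: "What is Phoenix?",
  output: "Phoenix is an open-source platform for LLM observability and evaluation.",
});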

@arizeai/openinference-core

OpenInference provides instrumentation utilities for tracing LLM applications. The core package (@arizeai/openinference-core) offers tracing helpers, decorators, and context attribute propagation.

Tracing Helpers

Convenient wrappers to instrument your functions with OpenInference spans:
  • withSpan — Wrap any function (sync or async) with OpenTelemetry tracing
  • traceChain — Trace workflow sequences and pipelines
  • traceAgent — Trace autonomous agents
  • traceTool — Trace external tool calls
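
A hedged sketch of withSpan; the argument order and option names below are assumptions based on the helper names above, not a confirmed signature:

import { withSpan } from "@arizeai/openinference-core";

// Assumption: withSpan wraps a sync or async function in an OpenInference span.
const answer = await withSpan(
  async () => {
    // ...call a model or pipeline step here...
    return "42";
  },
  { name: "answer-question", kind: "CHAIN" } // assumption: option names may differ
);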

Decorators

  • @observe — Decorator for automatically tracing class methods with configurable span kinds
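
A hedged sketch of the @observe decorator on a class method (decorators must be enabled in tsconfig); the import name and option shape are assumptions:

import { observe } from "@arizeai/openinference-core"; // assumption: export name

class Assistant {
  // Assumption: the span kind is configurable through the decorator options.
  @observe({ kind: "AGENT" })
  async respond(question: string): Promise<string> {
    return `echo: ${question}`;
  }
}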

Attribute Helpers

Generate properly formatted attributes for common LLM operations:
  • getLLMAttributes — Attributes for LLM inference (model, messages, tokens)
  • getEmbeddingAttributes — Attributes for embedding operations
  • getRetrieverAttributes — Attributes for document retrieval
  • getToolAttributes — Attributes for tool definitions
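
A sketch of attaching LLM attributes to a manually created span; the field names passed to getLLMAttributes are assumptions:

import { trace } from "@opentelemetry/api";
import { getLLMAttributes } from "@arizeai/openinference-core";

const tracer = trace.getTracer("my-app");

tracer.startActiveSpan("chat-completion", (span) => {
  // Assumption: the helper returns a flat attribute map for span.setAttributes().
  span.setAttributes(
    getLLMAttributes({
      provider: "openai",
      modelName: "gpt-4o-mini",
      inputMessages: [{ role: "user", content: "Hello!" }],
    })
  );
  span.end();
});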

Context Propagation

Track metadata across your traces with context setters:
  • setSession — Group multi-turn conversations with a session ID
  • setUser — Track conversations by user ID
  • setMetadata — Add custom metadata for operational needs
  • setTag — Add tags for filtering spans
  • setPromptTemplate — Track prompt templates, versions, and variables
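
A sketch using setSession with the active OpenTelemetry context; the sessionId option key follows the naming above but may differ in your version:

import { context } from "@opentelemetry/api";
import { setSession } from "@arizeai/openinference-core";

// Spans created inside the callback inherit the session id.
await context.with(
  setSession(context.active(), { sessionId: "session-123" }),
  async () => {
    // ...handle one conversation turn here...
  }
);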

Framework Instrumentors

Install additional packages to auto-instrument popular AI frameworks:
npm install @arizeai/openinference-instrumentation-openai      # OpenAI
npm install @arizeai/openinference-instrumentation-langchain   # LangChain
npm install @arizeai/openinference-instrumentation-anthropic   # Anthropic
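
For example, a minimal sketch of registering the OpenAI instrumentation (after a tracer provider has been set up, e.g. via register() from @arizeai/phoenix-otel):

import { registerInstrumentations } from "@opentelemetry/instrumentation";
import { OpenAIInstrumentation } from "@arizeai/openinference-instrumentation-openai";

// Patches the openai package so chat and completion calls emit spans.
registerInstrumentations({
  instrumentations: [new OpenAIInstrumentation()],
});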

@arizeai/phoenix-mcp

Model Context Protocol (MCP) server for Phoenix. Provides access to prompts, datasets, and experiments through the MCP standard for integration with Claude Desktop, Cursor, and other MCP-compatible tools.
  • Prompts management — Create, list, update, and iterate on prompts
  • Datasets — Explore datasets and synthesize new examples
  • Experiments — Pull experiment results and visualize them
  • MCP compatible — Works with Claude Desktop, Cursor, and other MCP clients