- Paste-and-Go — paste a single URL, agent does the rest
- Skills — teach your agent the full Arize workflow: tracing, datasets, experiments, evals, prompt optimization, and more
- MCP Servers — persistent IDE integration for instrumentation help and doc lookups
- Copy Docs — paste any docs page as markdown into your agent for context
Paste-and-Go Tracing Setup
The fastest path. Paste this into your coding agent and it handles dependency installation, instrumentation, and configuration:
Skills
Arize Skills encode the workflows we’ve refined building the Arize platform and helping teams debug LLM apps in production. They handle the ax CLI flags, data shape quirks, and multi-step recipes so your agent doesn’t have to guess. Your agent gets the full Arize workflow — add tracing, export traces, create datasets, run experiments, set up evaluators, manage annotations, optimize prompts — all through natural language.
Works with Cursor, Claude Code, Codex, Windsurf, and 40+ other agents.
Prerequisites
- AX CLI:
- Authentication — set up your API key using one of these methods:
  - `ax profile` (recommended)
  - `.env` file
  - Environment variables
- Space ID — find yours in the Arize URL (`/spaces/{SPACE_ID}/...`) or run `ax spaces list -o json`.
- Verify:
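One way to verify the prerequisites, using the `ax spaces list` command mentioned above (a sketch — this assumes the ax CLI is installed and your API key is configured):

```shell
# Lists the spaces your API key can access; your Space ID appears in the output.
ax spaces list -o json
```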
Install Skills
Option 1: npx (recommended)
- macOS / Linux
- Windows (PowerShell)
Option 2: ax CLI. Use `--global` / `-Global` to install to `~/.<agent>/skills/`. Run `--list` to see available skills, `--uninstall` to remove them.
Updating skills: `npx skills update` or `cd arize-skills && git pull`.
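The npx path runs the `skills` installer CLI mentioned above; a hedged sketch of what the invocation looks like (the `Arize-ai/arize-skills` repository path is an assumption here — use the exact command from the install snippet on this page):

```shell
# Hypothetical invocation: installs the Arize skills into the current project's
# agent skills directory. The repo path is an assumption; verify against the docs.
npx skills add Arize-ai/arize-skills
```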
Available Skills
| Skill | What it does |
|---|---|
| arize-instrumentation | Add Arize AX tracing to an app — two-phase flow: analyze codebase, then implement instrumentation |
| arize-trace | Export traces and spans by trace ID, span ID, or session ID; debug LLM application issues |
| arize-dataset | Create, manage, and download datasets and examples |
| arize-experiment | Run and analyze experiments against datasets |
| arize-evaluator | Create LLM-as-judge evaluators, run evaluation tasks, and set up continuous monitoring |
| arize-ai-provider-integration | Create and manage LLM provider credentials (OpenAI, Anthropic, Azure, Bedrock, Vertex, and more) |
| arize-annotation | Create and manage annotation configs (categorical, continuous, freeform); bulk-annotate project spans |
| arize-prompt-optimization | Optimize prompts using trace data, experiments, and meta-prompting |
| arize-link | Generate deep links to traces, spans, and sessions in the Arize UI |
MCP Servers
Two MCP servers you can plug into your IDE for persistent access to instrumentation guidance and documentation.

| Server | Best for | What it does |
|---|---|---|
| Tracing Assistant (PyPI) | Instrumentation | Framework-specific examples, code analysis, direct Arize support |
| Docs MCP Server | Reference | Search all Arize AX docs, examples, and API references from your IDE |
Install uv
The Tracing Assistant requires uv (a fast Python package manager). The Docs Server needs no installation.
- macOS
- Linux
- Windows
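For reference, uv's standalone installers (per the official uv documentation) are:

```shell
# macOS / Linux
curl -LsSf https://astral.sh/uv/install.sh | sh

# Windows (PowerShell)
powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"
```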
Add to Your IDE
Cursor
Navigate to Settings → MCP → Add new global MCP server, then add the JSON config:
- Tracing Assistant
- Docs MCP Server
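If you don't have the snippet handy, a minimal sketch of the Tracing Assistant entry — assuming the standard `mcpServers` schema and the `arize-tracing-assistant` package name used in the troubleshooting section below — looks like:

```json
{
  "mcpServers": {
    "arize-tracing-assistant": {
      "command": "uvx",
      "args": ["arize-tracing-assistant@latest"]
    }
  }
}
```

The server key (`arize-tracing-assistant`) is an arbitrary label; the command runs the published package via uv's `uvx` runner.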
Claude Code
- Tracing Assistant
- Docs MCP Server
Verify with `claude mcp list`.
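Claude Code can register MCP servers from the command line; a hedged sketch for the Tracing Assistant (the server name is arbitrary, and the `uvx` invocation is taken from the troubleshooting commands below):

```shell
# Registers the Tracing Assistant as a local MCP server in Claude Code.
claude mcp add arize-tracing-assistant -- uvx arize-tracing-assistant@latest
```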
Gemini CLI
- Tracing Assistant
- Docs MCP Server
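Recent Gemini CLI versions support registering MCP servers from the command line; a sketch under the same assumptions about the package name (if your version lacks this subcommand, add the server under `mcpServers` in `~/.gemini/settings.json` instead):

```shell
# Registers the Tracing Assistant with Gemini CLI.
gemini mcp add arize-tracing-assistant uvx arize-tracing-assistant@latest
```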
Antigravity (Google)
Open the MCP Store panel (from the ... dropdown in the editor’s side panel), search “Arize”, and click Install. Alternatively, add the Tracing Assistant JSON config manually via Manage MCP Servers → View raw config.
Claude Desktop / Manual Config
Add to your `mcpServers` JSON config (Settings → Developer → Edit Config in Claude Desktop):
- Tracing Assistant
- Docs MCP Server
Troubleshooting: Run `uv cache clean` to reset cached versions, or start the server manually with `uvx arize-tracing-assistant@latest` to see errors. Debug with `npx @modelcontextprotocol/inspector uvx arize-tracing-assistant@latest`.

Copy Docs to Your Agent
Every page in these docs has a “Copy as markdown” button — paste it directly into ChatGPT, Claude, or any coding agent for accurate, up-to-date context.