Why Trace with the Arize AX Python SDK?
Traditional logging and monitoring tools weren’t built for LLM applications. When your agent makes multiple LLM calls, invokes tools, and follows complex decision paths, you need visibility into:
- What actually happened during execution
- Where time was spent across your workflow
- Which agents and tools were invoked and in what order
- The full context of inputs, outputs, and intermediate steps
Prerequisites
To follow along, you’ll need:
- An Arize AX account with API access
- An OpenAI API key
- A Serper API key (from serper.dev) for web search
Follow along with code
This guide has a companion notebook with runnable code examples; the full code for each step below lives there.
Step 1: Set Up Your Arize AX Account
Before we can send traces anywhere, we need Arize AX configured. In this step, we’ll create an Arize account and get the credentials we need. If you already have an account, you can skip to the next step.
Get Your Credentials
- Sign up for a free Arize AX account.
- From the dashboard, create or select a Space.
- Navigate to Settings → API Keys to create and save an API key.
- Note your Space ID — you’ll find this in the URL or in your space settings.
Step 2: Install and Configure the SDK
Now that you have Arize credentials, we need to connect your application to send traces. Here, we’ll install the necessary packages and configure the SDK to send traces to Arize AX. After setup, each time you run your application, traces will automatically appear in a project within the Arize AX dashboard.
Install Your Packages
The Python SDK uses a modular approach: you install the base package plus any extras you need.
Set Your API Keys
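As a minimal sketch of this step (the extras and environment variable names here are assumptions; the notebook has the exact list), you might install the packages with `pip install arize-otel openinference-instrumentation-crewai openinference-instrumentation-openai crewai crewai-tools`, then set your keys:

```python
import os

# Assumed environment variable names; adjust to match your own setup.
os.environ["ARIZE_SPACE_ID"] = "YOUR_SPACE_ID"      # from your Arize space settings
os.environ["ARIZE_API_KEY"] = "YOUR_ARIZE_API_KEY"  # from Settings → API Keys
os.environ["OPENAI_API_KEY"] = "YOUR_OPENAI_API_KEY"
os.environ["SERPER_API_KEY"] = "YOUR_SERPER_API_KEY"  # used by SerperDevTool for web search
```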
Register Your Project in Arize
Next, we’ll register a tracer provider linked to a project in Arize. This project is where your traces will show up in the UI. The arize-otel package handles the setup automatically.
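A minimal registration sketch, assuming the register helper from arize-otel and the environment variables set above (the project name is illustrative):

```python
import os

from arize.otel import register

# Create a tracer provider that exports spans to your Arize AX project.
tracer_provider = register(
    space_id=os.environ["ARIZE_SPACE_ID"],
    api_key=os.environ["ARIZE_API_KEY"],
    project_name="financial-analysis-agent",  # hypothetical project name
)
```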
Step 3: Instrument Your Application
Next, we need to enable automatic instrumentation for our framework (CrewAI) and LLM provider (OpenAI). We’ll use OpenInference instrumentors to automatically trace LLM calls and agent execution. This means you get comprehensive tracing without manually adding instrumentation code throughout your application.
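Using the tracer provider from the previous step, the setup is a couple of lines (a sketch; the notebook has the exact cells):

```python
from openinference.instrumentation.crewai import CrewAIInstrumentor
from openinference.instrumentation.openai import OpenAIInstrumentor

# Patch CrewAI and the OpenAI client so every call emits OpenTelemetry spans.
CrewAIInstrumentor().instrument(tracer_provider=tracer_provider)
OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)
```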
Step 4: Create Your Agent
With tracing configured, let’s build the agent application that will generate traces we can analyze in Arize AX. We’ll build a Financial Analysis and Research agent with CrewAI. While this example focuses on CrewAI, you can use any of the supported frameworks that integrate automatically with Arize AX. This agent, sketched in code after the list below, is made up of:
- Two sub-agents: a Research agent and a Writer agent
- Two tasks: one for financial research and one for generating a summary report
- One tool: SerperDevTool for real-time web search
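Here’s a rough sketch of those pieces; the roles, goals, and task descriptions below are illustrative stand-ins for the full versions in the notebook:

```python
from crewai import Agent, Task
from crewai_tools import SerperDevTool

# Real-time web search tool (reads SERPER_API_KEY from the environment).
search_tool = SerperDevTool()

# Sub-agent 1: gathers financial data from the web.
researcher = Agent(
    role="Financial Researcher",
    goal="Find current financial data and news for a given company",
    backstory="An analyst who digs up timely market information.",
    tools=[search_tool],
)

# Sub-agent 2: turns the research into a readable report.
writer = Agent(
    role="Report Writer",
    goal="Summarize research findings into a concise report",
    backstory="A writer who distills analysis into plain language.",
)

# Task 1: financial research, parameterized by company name.
research_task = Task(
    description="Research the recent financial performance of {company}.",
    expected_output="A bullet-point summary of key findings.",
    agent=researcher,
)

# Task 2: generate the summary report from the research.
report_task = Task(
    description="Write a short summary report from the research findings.",
    expected_output="A one-page report in plain prose.",
    agent=writer,
)
```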
View Full Agent Implementation
For the complete agent implementation with all code examples, see the notebook.
Create and Run the Crew
We’ll wire the agents and tasks together and run them.
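Continuing the sketch, something like the following (the crew runs its tasks sequentially by default, and the input key matches the {company} placeholder used above):

```python
from crewai import Crew

# Wire the agents and tasks into a crew; tasks run sequentially by default.
crew = Crew(
    agents=[researcher, writer],
    tasks=[research_task, report_task],
)

# Kick off a run; with instrumentation enabled, every LLM call and tool
# invocation is exported to your Arize AX project.
result = crew.kickoff(inputs={"company": "NVIDIA"})  # example input
print(result)
```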
Step 5: Look at the Trace in Arize
Ready to see tracing in action? Run the agent as shown above and watch as Arize AX captures the full execution flow. Once the run completes, open the trace in the Arize AX dashboard to see:
- Which tools were invoked and in what order
- How tasks flowed from one step to the next
- Where time was spent across the workflow
- The full context of each LLM call, including prompts and responses
