Agno Tracing
Arize helps you trace and understand the behavior of your Agno agents. This guide shows you how to instrument your Agno application using OpenInference and send trace data to Arize.
Install
pip install openinference-instrumentation-agno agno arize-otel
API Key Setup
You'll need to configure your Arize Space ID and API Key, as well as the API key for the LLM provider (e.g., OpenAI). Set these as environment variables:
export ARIZE_SPACE_ID="YOUR_ARIZE_SPACE_ID"
export ARIZE_API_KEY="YOUR_ARIZE_API_KEY"
export OPENAI_API_KEY="YOUR_OPENAI_API_KEY" # Since the example uses OpenAIChat
Setup
Use the arize.otel.register function to connect your application to Arize, then apply the AgnoInstrumentor.
from arize.otel import register
from openinference.instrumentation.agno import AgnoInstrumentor
# Configure the Arize tracer and exporter
tracer_provider = register(
    space_id="YOUR_ARIZE_SPACE_ID",   # Replace with your Arize Space ID
    api_key="YOUR_ARIZE_API_KEY",     # Replace with your Arize API Key
    project_name="my-agno-app",       # Choose a project name
)
# Instrument Agno
AgnoInstrumentor().instrument(tracer_provider=tracer_provider)
print("Agno instrumented for Arize.")
Run Agno
from agno.agent import Agent
from agno.models.openai import OpenAIChat
from agno.tools.duckduckgo import DuckDuckGoTools
# Create an agent that uses OpenAI with DuckDuckGo web search
agent = Agent(
    model=OpenAIChat(id="gpt-4o-mini"),
    tools=[DuckDuckGoTools()],
    markdown=True,      # Format responses as Markdown
    debug_mode=True,    # Enable verbose logging
)

# Run the agent; the call is traced and sent to Arize
agent.run("What is currently trending on Twitter?")
Observe
Now that you have tracing set up, all invocations of your Agno agents will be streamed to Arize for observability and evaluation.