smolagents Tracing

How to use the SmolagentsInstrumentor to trace smolagents (Hugging Face's agent framework) and send the traces to Arize.

smolagents is a minimalist AI agent framework developed by Hugging Face, designed to simplify the creation and deployment of powerful agents. This guide shows how to instrument your smolagents application with OpenInference to send traces to Arize for observability.

Check out the Colab notebook for an interactive example (you may need to adapt the OTel setup for Arize as shown below).

API Key Setup

Before running your application, set the following environment variables:

export ARIZE_SPACE_ID="YOUR_ARIZE_SPACE_ID"
export ARIZE_API_KEY="YOUR_ARIZE_API_KEY"
export HF_TOKEN="YOUR_HUGGING_FACE_TOKEN" # Required by smolagents

You can find your Arize Space ID and API Key in your Arize account settings. The HF_TOKEN is your Hugging Face API token.
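
If you are working in a notebook (for example, the Colab mentioned above), you can set the same variables from Python instead of your shell. This is a minimal sketch with placeholder values; the HF_TOKEN is set the same way in the Setup Tracing section below:

import os

# Placeholder values - replace with your own credentials from your Arize account settings
os.environ["ARIZE_SPACE_ID"] = "YOUR_ARIZE_SPACE_ID"
os.environ["ARIZE_API_KEY"] = "YOUR_ARIZE_API_KEY"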

Install

Install smolagents, the OpenInference instrumentor, Arize OTel, and supporting OpenTelemetry packages:

pip install smolagents openinference-instrumentation-smolagents arize-otel opentelemetry-sdk opentelemetry-exporter-otlp

Setup Tracing

Add your HF_TOKEN as an environment variable (you can skip this if you already exported it in your shell):

import os

os.environ["HF_TOKEN"] = "<your_hf_token_value>"

Connect to Arize using register

import os
from arize.otel import register
from openinference.instrumentation.smolagents import SmolagentsInstrumentor

tracer_provider = register(
    space_id=os.getenv("ARIZE_SPACE_ID"),
    api_key=os.getenv("ARIZE_API_KEY"),
    project_name="my-smolagents-app" # Choose a project name
)

SmolagentsInstrumentor().instrument(tracer_provider=tracer_provider)
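
If you want to confirm locally that spans are being emitted before checking your Arize project, you can optionally attach a console exporter to the same tracer provider. This is a sketch that assumes the object returned by register is a standard OpenTelemetry SDK TracerProvider, which accepts additional span processors:

# Optional: also print spans to stdout for local debugging
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor

tracer_provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))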

Create & Run an Agent Example

Here's an example of creating and running a smolagent. Traces will be automatically sent to Arize.

from smolagents import (
    CodeAgent,
    ToolCallingAgent,
    ManagedAgent,
    DuckDuckGoSearchTool,
    VisitWebpageTool,
    HfApiModel,
)

# Model backed by the Hugging Face Inference API (uses HF_TOKEN)
model = HfApiModel()

# Worker agent that can search the web and read pages
agent = ToolCallingAgent(
    tools=[DuckDuckGoSearchTool(), VisitWebpageTool()],
    model=model,
)

# Wrap the worker agent so a manager agent can delegate to it
managed_agent = ManagedAgent(
    agent=agent,
    name="managed_agent",
    description="This is an agent that can do web search.",
)

# Manager agent that orchestrates the managed web-search agent
manager_agent = CodeAgent(
    tools=[],
    model=model,
    managed_agents=[managed_agent],
)

manager_agent.run(
    "If the US keeps its 2024 growth rate, how many years will it take for the GDP to double?"
)
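
By default, HfApiModel uses its built-in default model on the Hugging Face Inference API. If you want to target a specific model, you can pass a model identifier; the model below is only an illustration, and the exact parameter name may vary by smolagents version:

# Hypothetical example: pick any Inference API model your HF_TOKEN can access
model = HfApiModel(model_id="Qwen/Qwen2.5-Coder-32B-Instruct")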

Observe in Arize

Now that tracing is set up, every agent invocation and intermediate step of your smolagents application will be streamed to your Arize project for observability and evaluation. You can analyze agent behavior, tool usage, and LLM interactions within Arize.
