
Microsoft Agent Framework is Microsoft’s open-source SDK for building production AI agents — chat, tools, multi-agent orchestration. The framework emits raw OpenTelemetry spans using GenAI semantic conventions; the openinference-instrumentation-agent-framework span processor reshapes them into the OpenInference format Arize AX expects.
Microsoft Agent Framework is in active beta. The OpenInference processor is tested against agent-framework==1.0.0b260130; if you hit API drift on a newer release, pin the version with pip install "agent-framework==1.0.0b260130".
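To make the reshaping concrete, here is an illustrative sketch of the kind of attribute translation the processor performs. The real, authoritative mapping lives inside openinference-instrumentation-agent-framework; the attribute pairs below are a small hand-picked subset drawn from the two conventions, shown only for explanation.

```python
# Illustrative only: a toy rename from OpenTelemetry GenAI attribute names
# to OpenInference names. The real processor implements the full mapping.
GENAI_TO_OPENINFERENCE = {
    "gen_ai.request.model": "llm.model_name",
    "gen_ai.usage.input_tokens": "llm.token_count.prompt",
    "gen_ai.usage.output_tokens": "llm.token_count.completion",
}

def reshape(attributes: dict) -> dict:
    """Rename known GenAI keys; pass everything else through unchanged."""
    return {GENAI_TO_OPENINFERENCE.get(k, k): v for k, v in attributes.items()}

raw = {"gen_ai.request.model": "gpt-5", "gen_ai.usage.input_tokens": 12}
print(reshape(raw))
# {'llm.model_name': 'gpt-5', 'llm.token_count.prompt': 12}
```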

Prerequisites

Launch Arize AX

  1. Sign in to your Arize AX account.
  2. From Space Settings, copy your Space ID and API Key. You will set them as ARIZE_SPACE_ID and ARIZE_API_KEY below.

Install

pip install arize-otel \
  openinference-instrumentation-agent-framework \
  "agent-framework==1.0.0b260130" openai

Configure credentials

export ARIZE_SPACE_ID="<your-space-id>"
export ARIZE_API_KEY="<your-api-key>"
export ARIZE_PROJECT_NAME="microsoft-agent-framework-tracing-example"
export OPENAI_API_KEY="<your-openai-api-key>"
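An optional sanity check before moving on: confirm all four variables are visible to the shell that will run the example (a missing variable here is the most common cause of the "no traces" symptom in Troubleshooting).

```shell
for v in ARIZE_SPACE_ID ARIZE_API_KEY ARIZE_PROJECT_NAME OPENAI_API_KEY; do
  if [ -n "$(printenv "$v")" ]; then
    echo "$v is set"
  else
    echo "$v is MISSING"
  fi
done
```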

Setup tracing

# instrumentation.py
import os

from agent_framework.observability import enable_instrumentation
from arize.otel import (
    BatchSpanProcessor,
    PROJECT_NAME,
    Resource,
)
from openinference.instrumentation.agent_framework import (
    AgentFrameworkToOpenInferenceProcessor,
)
from opentelemetry import trace as otel_trace
from opentelemetry.sdk.trace import TracerProvider

# Microsoft Agent Framework emits raw OpenTelemetry spans (GenAI semantic
# conventions). The reshape processor must run before the OTLP exporter,
# so build the TracerProvider manually rather than using arize.otel.register().
resource = Resource.create({
    PROJECT_NAME: os.environ["ARIZE_PROJECT_NAME"],
})
tracer_provider = TracerProvider(resource=resource)

# Reshape raw Agent Framework spans into the OpenInference format.
tracer_provider.add_span_processor(
    AgentFrameworkToOpenInferenceProcessor()
)
# Then export the reshaped spans to Arize AX.
tracer_provider.add_span_processor(
    BatchSpanProcessor(
        space_id=os.environ["ARIZE_SPACE_ID"],
        api_key=os.environ["ARIZE_API_KEY"],
    )
)

otel_trace.set_tracer_provider(tracer_provider)

# enable_sensitive_data=True captures message content in spans. Required for
# full observability; only enable when you trust the trace destination.
enable_instrumentation(enable_sensitive_data=True)
print("Arize AX tracing initialized for Microsoft Agent Framework.")
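The registration order above matters: span processors run in the order they were added, so the reshape processor must see each span before the exporter does. A minimal stdlib-only sketch of that pipeline, using toy stand-in classes rather than the real OpenTelemetry or Arize APIs:

```python
# Toy model of span-processor ordering. Class and method names are stand-ins
# for illustration, not the real OpenTelemetry SDK API.
class ReshapeProcessor:
    def on_end(self, span: dict) -> None:
        # Rewrite attributes in place so later processors see the new names.
        if "gen_ai.request.model" in span:
            span["llm.model_name"] = span.pop("gen_ai.request.model")

class ExportProcessor:
    def __init__(self):
        self.exported = []

    def on_end(self, span: dict) -> None:
        self.exported.append(dict(span))

class ToyTracerProvider:
    def __init__(self):
        self.processors = []

    def add_span_processor(self, p) -> None:
        self.processors.append(p)

    def end_span(self, span: dict) -> None:
        # Processors run in registration order: reshape first, export second.
        for p in self.processors:
            p.on_end(span)

provider = ToyTracerProvider()
exporter = ExportProcessor()
provider.add_span_processor(ReshapeProcessor())  # must be registered first
provider.add_span_processor(exporter)
provider.end_span({"gen_ai.request.model": "gpt-5"})
print(exporter.exported)
# [{'llm.model_name': 'gpt-5'}]
```

Swapping the two add_span_processor calls in this toy would export the raw gen_ai.* attributes, which is exactly the failure mode the comment in instrumentation.py warns about.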

Run Microsoft Agent Framework

# example.py

# Importing instrumentation first runs the tracing setup (tracer provider
# registration and enable_instrumentation) before any agent code executes.
from instrumentation import tracer_provider

import asyncio

from agent_framework import ChatAgent
from agent_framework.openai import OpenAIChatClient


async def main() -> None:
    # OpenAIChatClient reads OPENAI_API_KEY from the environment.
    agent = ChatAgent(
        chat_client=OpenAIChatClient(model_id="gpt-5"),
        instructions="You are a concise factual assistant.",
    )
    response = await agent.run(
        "Why is the ocean salty? Answer in two sentences."
    )
    print(response.text)


asyncio.run(main())

Expected output

Arize AX tracing initialized for Microsoft Agent Framework.
The ocean is salty because rivers continuously dissolve mineral salts from rocks and soil and carry them to the sea, where they accumulate over millions of years. Water leaves the ocean through evaporation but the salts remain, steadily concentrating until reaching today's roughly 3.5% salinity.

Verify in Arize AX

  1. Open your Arize AX space and select project microsoft-agent-framework-tracing-example.
  2. You should see a new trace within ~30 seconds containing an invoke_agent parent span (emitted by Agent Framework, reshaped by the OpenInference processor) wrapping a chat-completion LLM child span with the prompt, response, and token usage attached.
  3. If no traces appear, see Troubleshooting.

Troubleshooting

  • No traces in Arize AX. Confirm ARIZE_SPACE_ID and ARIZE_API_KEY are set in the same shell that runs example.py. Enable OpenTelemetry debug logs with export OTEL_LOG_LEVEL=debug and re-run.
  • Code ran but no spans appear. enable_instrumentation() from agent_framework.observability must run after the global tracer provider is set. Confirm otel_trace.set_tracer_provider(tracer_provider) runs first, then enable_instrumentation(enable_sensitive_data=True).
  • Spans missing message content. Pass enable_sensitive_data=True to enable_instrumentation(). Without it, prompts and responses are stripped from the spans.
  • 401 from OpenAI. Verify OPENAI_API_KEY is set and has access to gpt-5. Swap for a model your key can call.
  • API breaking changes. Microsoft Agent Framework is in beta; pin agent-framework==1.0.0b260130 if a newer release breaks the example.
  • Other LLM providers. Use agent_framework.azure.AzureOpenAIChatClient for Azure OpenAI (set AZURE_OPENAI_API_KEY + AZURE_OPENAI_ENDPOINT + AZURE_OPENAI_API_VERSION) or other connectors per the framework docs.

Resources

Microsoft Agent Framework Documentation

OpenInference Agent Framework Span Processor

Microsoft Agent Framework GitHub