Available in Phoenix 8.26+
We’re excited to announce a powerful capability in the OpenInference OSS library openinference-instrumentation-mcp: seamless OTEL context propagation for MCP clients and servers.
This release introduces automatic distributed tracing for Anthropic’s Model Context Protocol (MCP). Using OpenTelemetry, you can now:
Propagate context across MCP client-server boundaries
Generate end-to-end traces of your AI system across services and languages
Gain full visibility into how models access and use external context
The openinference-instrumentation-mcp package handles this for you by:
Creating spans for MCP client operations
Injecting trace context into MCP requests
Extracting and continuing the trace context on the server
Associating the context with OTEL spans on the server side
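Under the hood this follows the standard OpenTelemetry propagation pattern: the client injects the active trace context into the outgoing request, and the server extracts it before starting its own spans. The sketch below illustrates that general mechanism with a plain carrier dict; the span names and carrier shown here are illustrative, not the package’s actual wire format.

```python
from opentelemetry import propagate, trace

tracer = trace.get_tracer("example")

# Client side: open a span and inject its context into a carrier.
# (The MCP instrumentation does the equivalent automatically on each request.)
carrier: dict[str, str] = {}
with tracer.start_as_current_span("mcp.client.call_tool"):
    propagate.inject(carrier)  # adds the W3C traceparent/tracestate entries

# Server side: extract the context from the carrier and continue the trace.
ctx = propagate.extract(carrier)
with tracer.start_as_current_span("mcp.server.handle_tool", context=ctx):
    pass  # server-side work appears as a child of the client span
```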
Instrument both the MCP client and server with OpenTelemetry.
Add the openinference-instrumentation-mcp package to each process.
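As a rough sketch, setup on each side might look like the following, assuming Phoenix’s phoenix.otel.register helper and the package’s MCPInstrumentor entry point (the project name and endpoint below are placeholders):

```python
# pip install openinference-instrumentation-mcp arize-phoenix-otel mcp
from phoenix.otel import register
from openinference.instrumentation.mcp import MCPInstrumentor

# Send spans to Phoenix (project name and collector endpoint are placeholders).
tracer_provider = register(
    project_name="mcp-demo",
    endpoint="http://localhost:6006/v1/traces",
)

# Run this in BOTH the client and the server processes so each side can
# inject and extract the propagated trace context.
MCPInstrumentor().instrument(tracer_provider=tracer_provider)
```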
Spans will propagate across services, appearing as a single connected trace in Phoenix.
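For illustration, a minimal client/server pair might look like the sketch below, assuming the official mcp Python SDK with a FastMCP server over stdio; the tool, span, and file names are illustrative. With both processes instrumented as above, the client’s parent span and the server’s tool spans appear as one trace in Phoenix.

```python
# server.py: a minimal MCP server exposing one tool (names are illustrative).
# Run the register()/MCPInstrumentor() setup from the snippet above in this
# process too, so server-side spans join the propagated trace.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("weather")

@mcp.tool()
def get_forecast(city: str) -> str:
    """Return a canned forecast for the given city."""
    return f"Sunny in {city}"

if __name__ == "__main__":
    mcp.run()  # stdio transport by default
```

```python
# client.py: launches the server over stdio and calls the tool inside a parent span.
import asyncio

from opentelemetry import trace
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    tracer = trace.get_tracer("mcp-demo")
    params = StdioServerParameters(command="python", args=["server.py"])
    # Client-side MCP spans created inside this block carry the trace context
    # with them; the server continues the same trace.
    with tracer.start_as_current_span("ask-for-forecast"):
        async with stdio_client(params) as (read, write):
            async with ClientSession(read, write) as session:
                await session.initialize()
                result = await session.call_tool("get_forecast", {"city": "Paris"})
                print(result.content)

if __name__ == "__main__":
    asyncio.run(main())
```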
Full example usage is available in the OpenInference repository.
Big thanks to Adrian Cole and Anuraag Agrawal for their contributions to this feature.