LangGraph Tracing

How to trace LangGraph applications with OpenInference and send data to Arize AX.

LangGraph is a library for building stateful, multi-actor applications with LLMs, built on top of LangChain. Arize allows you to observe your LangGraph applications by leveraging the OpenInference LangChainInstrumentor.

As LangGraph is an extension of LangChain, the same LangChainInstrumentor from OpenInference is used to capture traces. If you've already set up instrumentation for LangChain as described in the LangChain Tracing with Arize AX guide, your LangGraph applications will also be traced.

API Key Setup

Before running your application, ensure you have the following environment variables set:
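A minimal sketch of the required variables, assuming the standard Arize credential names (`ARIZE_SPACE_ID`, `ARIZE_API_KEY`) plus an OpenAI key if your graph calls OpenAI models; substitute your own values:

```shell
# Arize AX credentials used by the tracer
export ARIZE_SPACE_ID="your-space-id"
export ARIZE_API_KEY="your-api-key"

# Only needed if your LangGraph nodes call OpenAI
export OPENAI_API_KEY="your-openai-api-key"
```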

You can find your Arize Space ID and API Key in your Arize account settings.

Install Packages

Install the necessary packages for LangGraph, LangChain, OpenInference instrumentors, Arize OTel, and OpenTelemetry:
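A typical install might look like the following; the exact package set depends on which instrumentors and model providers you use (the OpenAI-related packages here are optional assumptions):

```shell
pip install langgraph langchain langchain-openai \
  openinference-instrumentation-langchain \
  openinference-instrumentation-openai \
  arize-otel opentelemetry-sdk opentelemetry-exporter-otlp
```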

Setup Tracing

Use arize.otel.register to configure the OpenTelemetry tracer. Then, apply the LangChainInstrumentor (which covers LangGraph) and, optionally, the OpenAIInstrumentor if you are using OpenAI directly or want deeper OpenAI traces.
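A sketch of that setup, assuming the `arize-otel` and OpenInference packages from the install step and placeholder credentials (`project_name` is an arbitrary label you choose):

```python
from arize.otel import register
from openinference.instrumentation.langchain import LangChainInstrumentor
from openinference.instrumentation.openai import OpenAIInstrumentor

# Configure an OpenTelemetry tracer provider that exports to Arize AX
tracer_provider = register(
    space_id="your-space-id",    # or read from ARIZE_SPACE_ID
    api_key="your-api-key",      # or read from ARIZE_API_KEY
    project_name="langgraph-demo",
)

# LangChainInstrumentor also captures LangGraph spans
LangChainInstrumentor().instrument(tracer_provider=tracer_provider)

# Optional: deeper spans for direct OpenAI calls
OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)
```

Once instrumented, any compiled graph you invoke in the same process will emit spans automatically; no per-call changes are needed.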

Run LangGraph Example

By instrumenting LangChain, spans from your LangGraph applications will be created and sent to Arize AX automatically; no LangGraph-specific code changes are required.

Observe in Arize AX

Now that you have tracing set up, all invocations and steps within your LangGraph agents will be streamed to your Arize project for observability and evaluation.
