

LangChain is a Python framework for composing LLM applications — chains, agents, RAG, tool use. Arize AX captures every LangChain run — prompt templates, chat-model calls, retrievals, tools, and the runnable hierarchy that wraps them — via the openinference-instrumentation-langchain package.

LangChain Tracing Tutorial (Google Colab)

Prerequisites

Launch Arize AX

  1. Sign in to your Arize AX account.
  2. From Space Settings, copy your Space ID and API Key. You will set them as ARIZE_SPACE_ID and ARIZE_API_KEY below.

Install

pip install arize-otel \
  openinference-instrumentation-langchain \
  langchain langchain-openai

Configure credentials

export ARIZE_SPACE_ID="<your-space-id>"
export ARIZE_API_KEY="<your-api-key>"
export ARIZE_PROJECT_NAME="langchain-tracing-example"
export OPENAI_API_KEY="<your-openai-api-key>"
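Before running the scripts below, it can help to confirm the four variables are actually visible to Python. This small helper is not part of the Arize SDK; it is just a plain-stdlib sanity check:

```python
import os

# The four variables this tutorial expects to find in the environment.
REQUIRED_VARS = [
    "ARIZE_SPACE_ID",
    "ARIZE_API_KEY",
    "ARIZE_PROJECT_NAME",
    "OPENAI_API_KEY",
]


def missing_vars(required=REQUIRED_VARS):
    """Return the names of required environment variables that are unset or empty."""
    return [name for name in required if not os.environ.get(name)]


if __name__ == "__main__":
    missing = missing_vars()
    if missing:
        print("Missing environment variables:", ", ".join(missing))
    else:
        print("All credentials are set.")
```

Run it in the same shell you will use for the tutorial; an empty result means the exports above took effect.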

Set up tracing

# instrumentation.py
import os

from arize.otel import register
from openinference.instrumentation.langchain import LangChainInstrumentor

tracer_provider = register(
    space_id=os.environ["ARIZE_SPACE_ID"],
    api_key=os.environ["ARIZE_API_KEY"],
    project_name=os.environ["ARIZE_PROJECT_NAME"],
)

LangChainInstrumentor().instrument(tracer_provider=tracer_provider)
print("Arize AX tracing initialized for LangChain.")

Run LangChain

# example.py

# Importing instrumentation first ensures tracing is set up
# before `langchain_openai` is imported.
from instrumentation import tracer_provider

from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI

# ChatOpenAI reads OPENAI_API_KEY from the environment.
prompt = ChatPromptTemplate.from_template(
    "Why is the {celestial_object} {color}? Answer in two sentences."
)
chain = prompt | ChatOpenAI(model="gpt-5")

response = chain.invoke({"celestial_object": "sky", "color": "blue"})

print(response.content)

Expected output

Arize AX tracing initialized for LangChain.
The sky appears blue because of Rayleigh scattering — molecules in the atmosphere scatter shorter (blue) wavelengths of sunlight more than longer ones, so the diffuse light reaching your eyes is dominated by blue. At sunrise and sunset, light travels through more atmosphere, so the blue is scattered out and reds dominate.

Verify in Arize AX

  1. Open your Arize AX space and select project langchain-tracing-example.
  2. You should see a new trace within ~30 seconds containing a RunnableSequence parent span (the LCEL chain) wrapping ChatPromptTemplate and ChatOpenAI child spans, with the prompt, response, and token usage attached.
  3. If no traces appear, see Troubleshooting.

Troubleshooting

  • No traces in Arize AX. Confirm ARIZE_SPACE_ID and ARIZE_API_KEY are set in the same shell that runs example.py. Enable OpenTelemetry debug logs with export OTEL_LOG_LEVEL=debug and re-run.
  • LangChain spans missing but other spans present. LangChainInstrumentor().instrument(...) must run before any langchain import. Make sure instrumentation.py is the first import in your entry point.
  • 401 from OpenAI. Verify OPENAI_API_KEY is set and has access to gpt-5. Swap for a model your key can call.
  • Other LLM providers. Install the matching langchain-<provider> package (e.g. langchain-anthropic, langchain-google-genai) and replace ChatOpenAI with the equivalent chat model. The same LangChainInstrumentor covers every provider, because it hooks LangChain's callback system rather than any individual provider SDK.

Resources

LangChain Python Documentation

OpenInference LangChain Instrumentor

Arize AX LangChain Tutorials

LangChain.js Tracing