LangChain is an open-source framework for building language model applications with prompt chaining, memory, and external integrations
How to use the Python LangChainInstrumentor to trace LangChain
Phoenix has first-class support for LangChain applications.
pip install openinference-instrumentation-langchain langchain_openai
Use the register function to connect your application to Phoenix:
from phoenix.otel import register
# configure the Phoenix tracer
tracer_provider = register(
    project_name="my-llm-app",  # Default is 'default'
    auto_instrument=True,  # Auto-instrument your app based on installed OI dependencies
)
Once LangChain is instrumented, spans are created whenever a chain runs and are sent to the Phoenix server for collection.
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI
prompt = ChatPromptTemplate.from_template("{x} {y} {z}?").partial(x="why is", z="blue")
chain = prompt | ChatOpenAI(model_name="gpt-3.5-turbo")
chain.invoke(dict(y="sky"))
Now that tracing is set up, all chain invocations will be streamed to your running Phoenix instance for observability and evaluation.
This module provides automatic instrumentation for LangChain.js (more specifically, the @langchain/core module) and may be used in conjunction with @opentelemetry/sdk-trace-node.
npm install --save @arizeai/openinference-instrumentation-langchain
To load the LangChain instrumentation, manually instrument the @langchain/core/callbacks/manager module. The callbacks manager must be manually instrumented because of the non-traditional module structure in @langchain/core. Additional instrumentations can be registered as usual in the registerInstrumentations function.
import { NodeTracerProvider } from "@opentelemetry/sdk-trace-node";
import {
LangChainInstrumentation
} from "@arizeai/openinference-instrumentation-langchain";
import * as CallbackManagerModule from "@langchain/core/callbacks/manager";
const provider = new NodeTracerProvider();
provider.register();
const lcInstrumentation = new LangChainInstrumentation();
// LangChain must be manually instrumented as it doesn't have
// a traditional module structure
lcInstrumentation.manuallyInstrument(CallbackManagerModule);
Instrumentation version >1.0.0 supports both attribute masking and context attribute propagation to spans.

| Instrumentation Version | Attribute Masking | Context Attribute Propagation | Tracing |
| --- | --- | --- | --- |
| >1.0.0 | ✅ | ✅ | ✅ |
| >0.2.0 | ❌ | ✅ | ✅ |
| >0.1.0 | ❌ | ❌ | ✅ |
You can specify a custom tracer provider for the LangChain instrumentation in several ways:

// Option 1: pass the tracer provider to the constructor
const lcInstrumentation = new LangChainInstrumentation({
  tracerProvider: customTracerProvider,
});
lcInstrumentation.manuallyInstrument(CallbackManagerModule);

// Option 2: set the tracer provider after instantiation
const lcInstrumentation = new LangChainInstrumentation();
lcInstrumentation.setTracerProvider(customTracerProvider);
lcInstrumentation.manuallyInstrument(CallbackManagerModule);

// Option 3: pass the tracer provider to registerInstrumentations
const lcInstrumentation = new LangChainInstrumentation();
lcInstrumentation.manuallyInstrument(CallbackManagerModule);
registerInstrumentations({
  instrumentations: [lcInstrumentation],
  tracerProvider: customTracerProvider,
});
Sign up for Phoenix:
Sign up for an Arize Phoenix account at https://app.phoenix.arize.com/login
Click Create Space, then follow the prompts to create and launch your space.
Install packages:
pip install arize-phoenix-otel
Set your Phoenix endpoint and API Key:
From your new Phoenix Space:
Create your API key from the Settings page.
Copy your Hostname from the Settings page.
In your code, set your endpoint and API key:
import os
os.environ["PHOENIX_API_KEY"] = "ADD YOUR PHOENIX API KEY"
os.environ["PHOENIX_COLLECTOR_ENDPOINT"] = "ADD YOUR PHOENIX HOSTNAME"
# If you created your Phoenix Cloud instance before June 24th, 2025,
# you also need to set the API key as a header:
# os.environ["PHOENIX_CLIENT_HEADERS"] = f"api_key={os.getenv('PHOENIX_API_KEY')}"
Launch your local Phoenix instance:
pip install arize-phoenix
phoenix serve
For details on customizing a local terminal deployment, see Terminal Setup.
Install packages:
pip install arize-phoenix-otel
Set your Phoenix endpoint:
import os
os.environ["PHOENIX_COLLECTOR_ENDPOINT"] = "http://localhost:6006"
See Terminal for more details.
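Before wiring up the tracer, you can sanity-check the endpoint value with a small stdlib-only sketch (illustrative; this check is not part of the Phoenix API):

```python
import os
from urllib.parse import urlparse

os.environ.setdefault("PHOENIX_COLLECTOR_ENDPOINT", "http://localhost:6006")

# Parse the endpoint and make sure it looks like a usable URL before
# handing it to register().
endpoint = urlparse(os.environ["PHOENIX_COLLECTOR_ENDPOINT"])
assert endpoint.scheme in ("http", "https") and endpoint.hostname, "malformed endpoint"
print(f"Phoenix collector: {endpoint.geturl()}")
```

A typo in the endpoint (a missing scheme, for example) otherwise only surfaces later as spans silently failing to export.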
Pull latest Phoenix image from Docker Hub:
docker pull arizephoenix/phoenix:latest
Run your containerized instance:
docker run -p 6006:6006 arizephoenix/phoenix:latest
This will expose Phoenix on localhost:6006.
Install packages:
pip install arize-phoenix-otel
Set your Phoenix endpoint:
import os
os.environ["PHOENIX_COLLECTOR_ENDPOINT"] = "http://localhost:6006"
For more info on using Phoenix with Docker, see Docker.
Install packages:
pip install arize-phoenix
Launch Phoenix:
import phoenix as px
px.launch_app()