LangFlow Tracing
LangFlow is an open-source visual framework for designing, prototyping, and deploying LLM applications, built on LangChain. Integrating LangFlow with Arize lets you observe your LLM workflows: trace performance, identify bottlenecks, and monitor reliability.
This guide explains how to configure LangFlow to send trace data to Arize.
Pull LangFlow Repo
If you haven't already, navigate to the LangFlow GitHub repo and pull the project down:
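A minimal sketch, assuming you are cloning the main langflow-ai/langflow repository over HTTPS:

```bash
# Clone the LangFlow source and move into the project directory
git clone https://github.com/langflow-ai/langflow.git
cd langflow
```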
Create .env file for Arize Integration
Inside your cloned LangFlow repository, create or update the .env file. LangFlow uses this file to configure its OpenTelemetry exports. You can use the .env.example file in the LangFlow repository as a template.
Add the following environment variables to your .env file to send traces to Arize:
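A sketch of what those entries might look like; the exact variable names are an assumption here, so confirm them against .env.example and the LangFlow docs:

```
# Arize credentials (variable names assumed; check .env.example for the exact keys)
ARIZE_SPACE_ID=YOUR_ARIZE_SPACE_ID
ARIZE_API_KEY=YOUR_ARIZE_API_KEY
```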
Replace YOUR_ARIZE_SPACE_ID and YOUR_ARIZE_API_KEY with your actual Arize Space ID and API Key, found in your Arize account settings. LangFlow will also need API keys for any LLMs or tools you use within your flows (e.g., OPENAI_API_KEY). Ensure these are also present in the .env file or otherwise configured in your LangFlow environment.
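For example, an OpenAI key entry would sit alongside the Arize variables in the same .env file:

```
# LLM provider key used by components inside your flows
OPENAI_API_KEY=YOUR_OPENAI_API_KEY
```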
LangFlow should pick up these environment variables to configure its OpenTelemetry exporter to send data to Arize. Refer to the LangFlow documentation for specifics on how it handles OTel configuration via environment variables, especially the OTLP endpoint if it needs to be explicitly set (for Arize, it is typically https://otlp.arize.com/v1).
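If the endpoint does need to be set explicitly, the standard OpenTelemetry environment variable for it is shown below; whether LangFlow reads this particular variable is an assumption to verify against its docs:

```
# Standard OTel exporter endpoint variable; confirm LangFlow honors it
OTEL_EXPORTER_OTLP_ENDPOINT=https://otlp.arize.com/v1
```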
Start Docker Desktop & LangFlow
Start Docker Desktop.
Navigate to your LangFlow directory in the terminal.
Build the images and run the container. This might take around 10 minutes the first time.
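Assuming the repository ships a Docker Compose file (its location has moved between LangFlow versions, so check the repo layout), the build-and-run step is typically:

```bash
# Build the images and start the LangFlow container in the foreground
docker compose up --build
```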
Go to Hosted LangFlow UI
Once the Docker container is running, access your local LangFlow UI, usually at:
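LangFlow serves its UI on port 7860 by default, so this is typically:

```
http://localhost:7860
```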
Create and Run a Flow
Design or open a flow in LangFlow. For example, you can use a "Simple Agent" or any other flow that involves LLM calls.
Ensure any components requiring API keys (like an OpenAI node) are configured correctly, either via the UI or by ensuring LangFlow can access the keys from the .env file.
Go into the Playground section for your flow and run it by sending a message or triggering its execution.
View Traces in Arize
After your LangFlow application runs, navigate to your Arize account:
Find the project associated with your LangFlow traces. The project name in Arize might correspond to how LangFlow names its services or might be configurable via environment variables (check LangFlow's OTel documentation).
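If you want a predictable project name, one option worth trying is the standard OpenTelemetry service-name variable; whether LangFlow (and the Arize project mapping) honors it is an assumption to verify:

```
# Standard OTel variable; confirm LangFlow and Arize respect it before relying on it
OTEL_SERVICE_NAME=langflow
```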
Inspecting Traces
In Arize, you should see traces from your LangFlow application. LangFlow, being built on LangChain, may produce traces that look similar to LangChain traces (e.g., "AgentExecutor" traces if you are using LangChain agents within LangFlow). LangFlow itself might also produce its own native trace spans.
AgentExecutor Trace (or similar): These represent the LangChain operations occurring within your LangFlow components, captured via the underlying LangChain instrumentation that LangFlow utilizes.
Native LangFlow Tracing: LangFlow might also generate its own spans representing the execution of its components or the overall flow.
By examining these traces in Arize, you can understand the behavior of your LangFlow application, debug issues, and monitor performance.
Resources
LangFlow Documentation (Refer to their documentation for the most up-to-date details on OpenTelemetry configuration)