LangFlow Tracing

LangFlow is an open-source visual framework for designing, prototyping, and deploying LLM applications, built on LangChain. Integrating LangFlow with Arize lets you observe your LLM workflows, gain visibility into performance, identify bottlenecks, and ensure reliability.

This guide explains how to configure LangFlow to send trace data to Arize.

Pull LangFlow Repo

If you haven't already, navigate to the LangFlow GitHub repo and clone the project:
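
For example, using the main langflow-ai/langflow repository:

git clone https://github.com/langflow-ai/langflow.git
cd langflow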

Create .env file for Arize Integration

Inside your cloned LangFlow repository, create or update the .env file. LangFlow uses this file to configure its OpenTelemetry exports.

You can use the .env.example file in the LangFlow repository as a template.

Add the following environment variables to your .env file to send traces to Arize:

# Arize Environment Variables for LangFlow
ARIZE_SPACE_ID="YOUR_ARIZE_SPACE_ID"
ARIZE_API_KEY="YOUR_ARIZE_API_KEY"
# OPENAI_API_KEY="YOUR_OPENAI_API_KEY" # Add if your LangFlow flows use OpenAI
# Add other necessary API keys for services used in your flows
  • Replace YOUR_ARIZE_SPACE_ID and YOUR_ARIZE_API_KEY with your actual Arize Space ID and API Key, found in your Arize account settings.

  • LangFlow will also need API keys for any LLMs or tools you use within your flows (e.g., OPENAI_API_KEY). Ensure these are also present in the .env file or otherwise configured in your LangFlow environment.

LangFlow should pick up these environment variables to configure its OpenTelemetry exporter to send data to Arize. Refer to the LangFlow documentation for specifics on how it handles OTel configuration via environment variables, especially the OTLP endpoint if it needs to be explicitly set (for Arize, it's typically https://otlp.arize.com/v1).
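
If your LangFlow build honors the standard OpenTelemetry SDK environment variables, an explicit exporter configuration might look like the sketch below. The endpoint is Arize's OTLP address mentioned above; the header names used for authentication are an assumption to verify against Arize's OTLP docs:

# Hypothetical explicit OTel exporter settings -- verify against LangFlow/Arize docs
OTEL_EXPORTER_OTLP_ENDPOINT="https://otlp.arize.com/v1"
OTEL_EXPORTER_OTLP_HEADERS="space_id=YOUR_ARIZE_SPACE_ID,api_key=YOUR_ARIZE_API_KEY"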

Start Docker Desktop & LangFlow

  1. Start Docker Desktop.

  2. Navigate to your LangFlow directory in the terminal.

  3. Build the images and run the containers. This might take around 10 minutes the first time.

docker compose -f docker/dev.docker-compose.yml down || true
docker compose -f docker/dev.docker-compose.yml up --remove-orphans
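
To verify the containers started cleanly, you can list the running services or tail the logs:

docker compose -f docker/dev.docker-compose.yml ps
docker compose -f docker/dev.docker-compose.yml logs -f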

Go to the LangFlow UI

Once the Docker containers are running, access your local LangFlow UI in a browser, typically at http://localhost:7860 (LangFlow's default port).

Create and Run a Flow

  1. Design or open a flow in LangFlow. For example, you can use a "Simple Agent" or any other flow that involves LLM calls.

  2. Ensure any components requiring API keys (like an OpenAI node) are configured correctly, either via the UI or by making sure LangFlow can read the keys from the .env file.

  3. Go into the Playground section for your flow and run it by sending a message or triggering its execution (you can also trigger the flow over LangFlow's REST API, as sketched below).
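
As an alternative to the Playground, LangFlow exposes a REST endpoint for running flows. A minimal sketch, assuming the default local port and substituting your own flow ID (the exact route and payload shape may vary across LangFlow versions):

curl -X POST "http://localhost:7860/api/v1/run/YOUR_FLOW_ID" \
  -H "Content-Type: application/json" \
  -d '{"input_value": "Hello!", "input_type": "chat", "output_type": "chat"}'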

View Traces in Arize

After your LangFlow application runs, navigate to your Arize account.

Find the project associated with your LangFlow traces. The project name in Arize may correspond to the service name LangFlow reports, or it may be configurable via environment variables (check LangFlow's OTel documentation).
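
If LangFlow uses the standard OpenTelemetry SDK, the reported service name (and therefore how traces are grouped in Arize) can often be overridden with the standard OTEL_SERVICE_NAME variable; whether LangFlow honors it is an assumption to verify:

# Hypothetical: sets the service name reported with your traces
OTEL_SERVICE_NAME="my-langflow-app"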

Inspecting Traces

In Arize, you should see traces from your LangFlow application. Because LangFlow is built on LangChain, its traces may look similar to LangChain traces (e.g., "AgentExecutor" spans if you are using LangChain agents within LangFlow). LangFlow itself may also produce its own native trace spans.

  • AgentExecutor Trace (or similar): These represent the LangChain operations occurring within your LangFlow components, captured via the underlying LangChain instrumentation that LangFlow utilizes.

  • Native LangFlow Tracing: LangFlow might also generate its own spans representing the execution of its components or the overall flow.

By examining these traces in Arize, you can understand the behavior of your LangFlow application, debug issues, and monitor performance.
