Langflow
Langflow is an open-source visual framework that enables developers to rapidly design, prototype, and deploy custom applications powered by large language models (LLMs). It is built on top of LangChain.
Langflow users can now observe their LLM workflows through Arize Phoenix. This integration gives developers granular visibility into the performance and behavior of their Langflow applications. By capturing detailed telemetry data from Langflow pipelines, Phoenix lets users identify bottlenecks, trace the flow of requests, and verify the reliability and efficiency of their LLM-powered systems, helping teams debug issues faster, optimize performance, and maintain high-quality user experiences across their LLM applications.
Pull Langflow Repo
Navigate to the Langflow GitHub repo and pull the project down
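For example, from a terminal (the repository lives at github.com/langflow-ai/langflow):

```
# Clone the Langflow repository and enter the project directory
git clone https://github.com/langflow-ai/langflow.git
cd langflow
```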
Create .env file
Navigate to the repo and create a .env file with all the Arize Phoenix variables. You can use the .env.example file as a template to create the .env file.
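For example, from the repo root:

```
# Copy the example file as a starting point for your .env
cp .env.example .env
```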

Add the following environment variable to the .env file:
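A minimal example, assuming Phoenix Cloud's standard PHOENIX_API_KEY variable (replace the placeholder with your own key from your Phoenix Cloud settings):

```
# Placeholder value; substitute your actual Phoenix Cloud API key
PHOENIX_API_KEY="YOUR_PHOENIX_API_KEY"
```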
Note: This Langflow integration is for Phoenix Cloud
Start Docker Desktop
Start Docker Desktop, build the images, and run the container (this will take around 10 minutes the first time). In your terminal, change into the Langflow directory and run the following commands:
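A typical sequence, assuming the repo's Docker Compose setup (the compose file name and location may vary between Langflow versions):

```
# Build the images and start the container in the background
docker compose up --build -d

# Follow the logs to confirm Langflow started cleanly
docker compose logs -f
```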
Go to Hosted Langflow UI
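Once the container is running, the Langflow UI is typically served at http://localhost:7860 (Langflow's default port).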

Create a Flow
We'll use the Simple Agent template for this tutorial.

Add your OpenAI API key to the Agent component in Langflow

Go into the Playground and run the Agent

Go to Arize Phoenix
Navigate to your project (the project name should match the name of your Langflow agent):
https://app.phoenix.arize.com/


Inspect Traces

The AgentExecutor trace comes from Arize Phoenix's LangChain instrumentation and captures what the underlying LangChain code is doing as your Langflow components run.

The other trace, named with a UUID, comes from Langflow's native tracing.