Quickstart: Tracing (Python)
Overview
Phoenix supports three main options to collect traces:
Use Phoenix's decorators to mark functions and code blocks.
Use automatic instrumentation to capture all calls made to supported frameworks.
Use base OpenTelemetry instrumentation. Supported in Python and TS / JS, among many other languages.
This example uses options 1 and 2.
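Although this quickstart covers options 1 and 2, option 3 amounts to using the standard OpenTelemetry API directly. A minimal sketch, assuming register from phoenix.otel has already been called (see Connect to Phoenix below); the function, span name, and OpenInference-style attribute keys are illustrative:
from opentelemetry import trace

tracer = trace.get_tracer(__name__)

def summarize(text: str) -> str:
    # Wrap any code block in a span and attach attributes manually
    with tracer.start_as_current_span("summarize") as span:
        span.set_attribute("input.value", text)
        result = text[:50]  # placeholder for real work
        span.set_attribute("output.value", result)
        return result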
Launch Phoenix
Sign up for Phoenix:
Sign up for an Arize Phoenix account at https://app.phoenix.arize.com/login
Click Create Space, then follow the prompts to create and launch your space.
Install packages:
pip install arize-phoenix-otel
Set your Phoenix endpoint and API Key:
From your new Phoenix Space:
Create your API key from the Settings page
Copy your Hostname from the Settings page
In your code, set your endpoint and API key:
import os
os.environ["PHOENIX_API_KEY"] = "ADD YOUR PHOENIX API KEY"
os.environ["PHOENIX_COLLECTOR_ENDPOINT"] = "ADD YOUR PHOENIX HOSTNAME"
# If you created your Phoenix Cloud instance before June 24th, 2025,
# you also need to set the API key as a header
#os.environ["PHOENIX_CLIENT_HEADERS"] = f"api_key={os.getenv('PHOENIX_API_KEY')}"
Connect to Phoenix
To collect traces from your application, you must configure an OpenTelemetry TracerProvider to send traces to Phoenix.
pip install arize-phoenix-otel
from phoenix.otel import register
# configure the Phoenix tracer
tracer_provider = register(
    project_name="my-llm-app",  # Default is 'default'
    auto_instrument=True,  # See 'Trace all calls made to a library' below
)
tracer = tracer_provider.get_tracer(__name__)
Trace your own functions
Functions can be traced using decorators:
@tracer.chain
def my_func(input: str) -> str:
    return "output"
Input and output attributes are set automatically based on my_func's parameters and return value.
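The same tracer also exposes the standard OpenTelemetry API, so code blocks can be traced without a decorator. A short sketch; the span name and the OpenInference-style input.value/output.value attribute keys are illustrative:
result = my_func("hello")  # emits a span via the decorator

# Trace an arbitrary code block with the same tracer
with tracer.start_as_current_span("my-code-block") as span:
    span.set_attribute("input.value", "hello")
    span.set_attribute("output.value", result)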
Trace all calls made to a library
Phoenix can also capture all calls made to supported libraries automatically. Just install the respective OpenInference library:
pip install openinference-instrumentation-openai
OpenInference libraries must be installed before calling the register function.
# Add OpenAI API Key
import os
import openai
os.environ["OPENAI_API_KEY"] = "ADD YOUR OPENAI API KEY"
client = openai.OpenAI()
response = client.chat.completions.create(
model="gpt-4o",
messages=[{"role": "user", "content": "Write a haiku."}],
)
print(response.choices[0].message.content)
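The two approaches compose. As a sketch (the function name and prompt are illustrative), a decorated function that calls the instrumented OpenAI client produces an LLM span nested under the chain span:
@tracer.chain
def write_haiku(topic: str) -> str:
    # The auto-instrumented OpenAI call below appears as a child span
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": f"Write a haiku about {topic}."}],
    )
    return response.choices[0].message.content

print(write_haiku("tracing"))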
View your Traces in Phoenix
You should now see traces in Phoenix!

Next Steps
Explore tracing integrations
View use cases to see end-to-end examples