Google Gen AI Tracing
Instrument LLM calls made using the Google Gen AI Python SDK
Launch Phoenix
Sign up for Phoenix:
Sign up for an Arize Phoenix account at https://app.phoenix.arize.com/login
Click Create Space, then follow the prompts to create and launch your space.
Install packages:
pip install arize-phoenix-otel
Set your Phoenix endpoint and API Key:
From your new Phoenix Space:
Create your API key from the Settings page.
Copy your Hostname from the Settings page.
In your code, set your endpoint and API key:
import os
os.environ["PHOENIX_API_KEY"] = "ADD YOUR PHOENIX API KEY"
os.environ["PHOENIX_COLLECTOR_ENDPOINT"] = "ADD YOUR PHOENIX HOSTNAME"
# If you created your Phoenix Cloud instance before June 24th, 2025,
# you also need to set the API key as a header:
# os.environ["PHOENIX_CLIENT_HEADERS"] = f"api_key={os.getenv('PHOENIX_API_KEY')}"
Install
pip install openinference-instrumentation-google-genai google-genai
Setup
Set the GEMINI_API_KEY environment variable. To use the Gen AI SDK with Vertex AI instead of the Developer API, refer to Google's guide on setting the required environment variables (a sketch of those variables follows the snippet below).
export GEMINI_API_KEY=[your_key_here]
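For reference, a minimal sketch of the Vertex AI configuration, assuming the standard Gen AI SDK environment variables (GOOGLE_GENAI_USE_VERTEXAI, GOOGLE_CLOUD_PROJECT, GOOGLE_CLOUD_LOCATION); consult Google's guide for the authoritative list:
import os

# Sketch: route the Gen AI SDK to Vertex AI instead of the Developer API.
# The project ID and location values below are placeholders, not defaults.
os.environ["GOOGLE_GENAI_USE_VERTEXAI"] = "true"
os.environ["GOOGLE_CLOUD_PROJECT"] = "your-project-id"
os.environ["GOOGLE_CLOUD_LOCATION"] = "us-central1"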
Use the register function to connect your application to Phoenix.
from phoenix.otel import register

# Configure the Phoenix tracer
tracer_provider = register(
    project_name="my-llm-app",  # Default is 'default'
    auto_instrument=True,  # Auto-instrument your app based on installed OI dependencies
)
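If you prefer explicit wiring over auto_instrument, a minimal sketch that attaches the OpenInference instrumentor directly, assuming the GoogleGenAIInstrumentor class shipped by openinference-instrumentation-google-genai:
from openinference.instrumentation.google_genai import GoogleGenAIInstrumentor
from phoenix.otel import register

# Register a tracer provider without auto-instrumentation,
# then instrument the Google Gen AI SDK against it explicitly.
tracer_provider = register(project_name="my-llm-app")
GoogleGenAIInstrumentor().instrument(tracer_provider=tracer_provider)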
Observe
Now that you have tracing set up, all Gen AI SDK requests will be streamed to Phoenix for observability and evaluation.
import os

from google import genai


def send_message_multi_turn() -> tuple[str, str]:
    # Each send_message call below is captured as a span in Phoenix.
    client = genai.Client(api_key=os.environ["GEMINI_API_KEY"])
    chat = client.chats.create(model="gemini-2.0-flash-001")
    response1 = chat.send_message("What is the capital of France?")
    response2 = chat.send_message("Why is the sky blue?")
    return response1.text or "", response2.text or ""
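A short driver for the example above; the print statements are illustrative:
# Run the two-turn chat; both requests stream to your Phoenix project.
answer1, answer2 = send_message_multi_turn()
print("Turn 1:", answer1)
print("Turn 2:", answer2)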