Set the GOOGLE_API_KEY environment variable. Refer to Google’s ADK documentation for more details on authentication and environment variables.
export GOOGLE_API_KEY=[your_key_here]
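If the key is missing, failures can surface late inside the SDK. A minimal sketch of a fail-fast check, using a hypothetical helper name `require_env` (not part of the ADK or Phoenix APIs):

```python
import os

def require_env(name: str) -> str:
    """Return an environment variable's value, raising a clear error if unset."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(f"{name} is not set; export it before running the agent")
    return value

# Example: require_env("GOOGLE_API_KEY")
```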
Use the register function to connect your application to Phoenix.
from phoenix.otel import register

# Configure the Phoenix tracer
tracer_provider = register(
    project_name="my-llm-app",  # Default is 'default'
    auto_instrument=True,       # Auto-instrument your app based on installed OI dependencies
)
Now that tracing is set up, all Google ADK requests will be streamed to Phoenix for observability and evaluation.
import asyncio

from google.adk.agents import Agent
from google.adk.runners import InMemoryRunner
from google.genai import types


def get_weather(city: str) -> dict:
    """Retrieves the current weather report for a specified city.

    Args:
        city (str): The name of the city for which to retrieve the weather report.

    Returns:
        dict: status and result or error msg.
    """
    if city.lower() == "new york":
        return {
            "status": "success",
            "report": (
                "The weather in New York is sunny with a temperature of 25 degrees"
                " Celsius (77 degrees Fahrenheit)."
            ),
        }
    else:
        return {
            "status": "error",
            "error_message": f"Weather information for '{city}' is not available.",
        }


agent = Agent(
    name="test_agent",
    model="gemini-2.0-flash-exp",
    description="Agent to answer questions using tools.",
    instruction="You must use the available tools to find an answer.",
    tools=[get_weather],
)


async def main():
    app_name = "test_instrumentation"
    user_id = "test_user"
    session_id = "test_session"

    runner = InMemoryRunner(agent=agent, app_name=app_name)
    session_service = runner.session_service
    await session_service.create_session(
        app_name=app_name, user_id=user_id, session_id=session_id
    )

    async for event in runner.run_async(
        user_id=user_id,
        session_id=session_id,
        new_message=types.Content(
            role="user",
            parts=[types.Part(text="What is the weather in New York?")],
        ),
    ):
        if event.is_final_response():
            print(event.content.parts[0].text.strip())


if __name__ == "__main__":
    asyncio.run(main())
When using Vertex AI Agent Engine for remote deployment, instrumentation must be configured within the remote agent module, not in the main application code.
When deployed to Agent Engine, the Vertex AI framework aggressively manages the OpenTelemetry global state. If Phoenix uses the global TracerProvider, Vertex AI will shut down the Phoenix export pipeline during container initialization, resulting in dropped traces and warnings. To avoid this, Phoenix must use an isolated (non-global) tracer provider.

Main Application:
from vertexai import agent_engines

remote_agent = agent_engines.create(
    agent_engine=ModuleAgent(module_name="adk_agent", agent_name="app"),
    requirements=[
        "google-cloud-aiplatform[agent_engines,adk]",
        "arize-phoenix-otel",
        "openinference-instrumentation-google-adk",
    ],
    extra_packages=["adk_agent.py"],
    env_vars={
        "PHOENIX_COLLECTOR_ENDPOINT": "https://app.phoenix.arize.com/s/<handle>/v1/traces",  # Or your self-hosted Phoenix URL
        "PHOENIX_API_KEY": "<your-phoenix-api-key>",
    },
)
Agent Module (adk_agent.py):
from phoenix.otel import register
from openinference.instrumentation.google_adk import GoogleADKInstrumentor

tracer_provider = register(
    project_name="adk-agent",
    batch=False,  # Use sync export because Agent Engine pauses CPU after requests
    set_global_tracer_provider=False,  # Required: avoids conflict with Agent Engine's global provider
)
GoogleADKInstrumentor().instrument(tracer_provider=tracer_provider)

# Your agent code here...