VertexAI Tracing
Instrument LLM calls made with the Vertex AI SDK (e.g., Gemini models) using OpenInference and view traces in Arize.
The Vertex AI SDK, particularly for models like Gemini, can be instrumented using the openinference-instrumentation-vertexai package, and the resulting traces can be sent to Arize for observability.
Launch Arize
To get started, sign up for a free Arize account and get your Space ID and API Key.
Install
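A minimal install sketch. The instrumentation package name comes from this guide; the arize-otel helper package is an assumption based on the register call used below:

```shell
# google-cloud-aiplatform bundles the vertexai SDK.
pip install google-cloud-aiplatform

# OpenInference instrumentor for Vertex AI.
pip install openinference-instrumentation-vertexai

# Helper for registering an OTLP exporter pointed at Arize (assumed package).
pip install arize-otel
```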
Note: the google-cloud-aiplatform package includes the vertexai SDK.
Environment and Authentication Setup
Before running your code, ensure your Google Cloud environment is authenticated and configured for Vertex AI. This typically involves:
Authenticating via the Google Cloud CLI:
gcloud auth application-default login
Setting your Google Cloud Project ID. You can do this via an environment variable:
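For example, using the GOOGLE_CLOUD_PROJECT variable conventionally read by Google Cloud client libraries (the project ID and region values are placeholders):

```shell
# Replace with your actual project ID and preferred region.
export GOOGLE_CLOUD_PROJECT="your-gcp-project-id"
export GOOGLE_CLOUD_LOCATION="us-central1"
```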
The vertexai.init(project='your-gcp-project-id', location='your-region') call in your Python code will also use this project ID if it is not set by other means.
Refer to Google's official Vertex AI documentation for the most current and detailed setup instructions.
Setup Tracing
Use the register function to connect your application to Arize and instrument the Vertex AI client.
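A sketch of the setup step, assuming the register helper lives in the arize-otel package and reading the Space ID and API Key from environment variables (the project name is a hypothetical example):

```python
import os

# register configures an OpenTelemetry tracer provider that exports
# spans to Arize (assumed to come from the arize-otel package).
from arize.otel import register

# OpenInference instrumentor for the Vertex AI SDK.
from openinference.instrumentation.vertexai import VertexAIInstrumentor

# Connect to your Arize space; credentials are placeholders read from the env.
tracer_provider = register(
    space_id=os.environ["ARIZE_SPACE_ID"],
    api_key=os.environ["ARIZE_API_KEY"],
    project_name="vertexai-tracing-demo",  # hypothetical project name
)

# Instrument Vertex AI SDK calls so each one emits OpenInference spans.
VertexAIInstrumentor().instrument(tracer_provider=tracer_provider)
```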
Run VertexAI
Initialize Vertex AI and use a generative model. Ensure your project and location are correctly configured.
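A minimal usage sketch; the model name and prompt are illustrative, and this assumes the authentication and instrumentation steps above have already run:

```python
import vertexai
from vertexai.generative_models import GenerativeModel

# Initialize the SDK; project and location are placeholders.
vertexai.init(project="your-gcp-project-id", location="us-central1")

# Any Gemini model available in your region works here (example name).
model = GenerativeModel("gemini-1.5-flash")

# With the instrumentor active, this call is captured as a trace.
response = model.generate_content("Write a haiku about observability.")
print(response.text)
```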
Observe
Now that tracing is set up, all instrumented calls to Vertex AI models (such as Gemini generate_content calls) will be streamed to your Arize account for observability and evaluation.
Resources