Using Google Gen AI with Phoenix Evals
This integration requires the extra dependency `google-genai>=1.0.0`.
Use the LLM wrapper with provider="google" to access Google’s Gemini models through the Google GenAI SDK.
Authentication Options
Option 1: Using API Key (Developer API)
Set the GOOGLE_API_KEY or GEMINI_API_KEY environment variable:

```shell
export GOOGLE_API_KEY=your_api_key_here
```

```python
from phoenix.evals import LLM

llm = LLM(provider="google", model="gemini-2.5-flash")
```
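If neither environment variable is set, it can be useful to fail fast with a clear message before constructing the wrapper. A minimal sketch, assuming only the lookup order described above (the `resolve_google_api_key` helper is hypothetical; Phoenix reads these variables itself):

```python
import os

def resolve_google_api_key(env=os.environ):
    """Return the first API key found, checking GOOGLE_API_KEY then GEMINI_API_KEY.

    Hypothetical helper for illustration only; Phoenix performs an equivalent
    lookup internally when the LLM wrapper is created.
    """
    for name in ("GOOGLE_API_KEY", "GEMINI_API_KEY"):
        value = env.get(name)
        if value:
            return value
    raise RuntimeError("Set GOOGLE_API_KEY or GEMINI_API_KEY before creating the LLM.")
```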
Option 2: Using VertexAI
```python
from phoenix.evals import LLM

llm = LLM(
    provider="google",
    model="gemini-2.5-flash",
    vertexai=True,
    project="your-project-id",
    location="us-central1",
)
```
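When an application supports both authentication modes, the choice between them often comes down to assembling the right keyword arguments. A small sketch, assuming only the parameters shown in the two options above (the `build_llm_kwargs` helper is hypothetical, not part of Phoenix):

```python
def build_llm_kwargs(use_vertexai, project=None,
                     location="us-central1", model="gemini-2.5-flash"):
    """Assemble keyword arguments for LLM(provider="google", ...).

    Hypothetical helper: mirrors the vertexai/project/location parameters
    from the Vertex AI example above. The Developer API path needs no
    extra arguments beyond provider and model.
    """
    kwargs = {"provider": "google", "model": model}
    if use_vertexai:
        if not project:
            raise ValueError("Vertex AI mode requires a project id.")
        kwargs.update(vertexai=True, project=project, location=location)
    return kwargs
```

The resulting dict can be splatted into the constructor, e.g. `LLM(**build_llm_kwargs(True, project="your-project-id"))`.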
Basic Usage
```python
from phoenix.evals import LLM
from phoenix.evals.metrics import FaithfulnessEvaluator

llm = LLM(provider="google", model="gemini-2.5-flash")
evaluator = FaithfulnessEvaluator(llm=llm)

result = evaluator.evaluate(eval_input={
    "input": "What is the capital of France?",
    "output": "Paris is the capital of France.",
    "context": "Paris is the capital and largest city of France.",
})
result[0].pretty_print()
```
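When evaluating many question/answer pairs, the records first have to be shaped into the `eval_input` schema shown above. A minimal sketch of that mapping step (the `to_eval_inputs` helper is hypothetical; only the `input`/`output`/`context` keys come from the example above):

```python
def to_eval_inputs(records):
    """Map (question, answer, context) triples to the eval_input dict schema
    used by FaithfulnessEvaluator.evaluate in the example above.

    Hypothetical convenience helper, not part of the Phoenix API.
    """
    return [
        {"input": question, "output": answer, "context": context}
        for question, answer, context in records
    ]
```

Each resulting dict can then be passed as `evaluator.evaluate(eval_input=item)` in a loop.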
Supported Models
The Google provider supports all Gemini models available through the Google GenAI SDK, including:
- gemini-2.5-flash (default)
- gemini-2.5-flash-001
- gemini-2.0-flash-001
- gemini-1.5-pro
- gemini-1.5-flash
We thank Siddharth Sahu for this valuable contribution and support.