Manage connections to external LLM providers (OpenAI, Azure OpenAI, AWS Bedrock, Vertex AI, Anthropic, and others) for use within the Arize platform.
The `ai_integrations` client methods are currently in **ALPHA**. The API may change without notice. A one-time warning is emitted on first use.
Key Capabilities
- List AI integrations with optional filtering by space
- Retrieve integration details by ID
- Create integrations for any supported LLM provider
- Update integration settings without replacing the entire resource
- Delete integrations
Supported Providers
| Provider value | Description |
|---|---|
| `openAI` | OpenAI |
| `azureOpenAI` | Azure OpenAI |
| `awsBedrock` | AWS Bedrock |
| `vertexAI` | Google Vertex AI |
| `anthropic` | Anthropic |
| `gemini` | Google Gemini |
| `nvidiaNim` | NVIDIA NIM |
| `custom` | Custom provider |
List AI Integrations
List all AI integrations you have access to, with optional filtering by space or name.
```python
resp = client.ai_integrations.list(
    space="your-space-name-or-id",  # optional
    name="openai",                  # optional substring filter
    limit=50,
)
for integration in resp.ai_integrations:
    print(integration.id, integration.name, integration.provider)
```
For details on pagination, field introspection, and data conversion (to dict/JSON/DataFrame), see Response Objects.
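If you just need a quick tabular or JSON view, the listed integrations can also be converted by hand. The sketch below builds plain dicts from the `id`, `name`, and `provider` attributes shown above; the stand-in `Integration` dataclass and the sample records are assumptions used in place of a live `client.ai_integrations.list()` response:

```python
import json
from dataclasses import dataclass


# Stand-in for the integration objects returned by the list() call;
# the real response objects expose at least id, name, and provider.
@dataclass
class Integration:
    id: str
    name: str
    provider: str


# Hypothetical sample records standing in for resp.ai_integrations
items = [
    Integration("abc123", "my-openai", "openAI"),
    Integration("def456", "my-bedrock", "awsBedrock"),
]

# Plain dicts are ready for json.dumps(...) or pandas.DataFrame(rows)
rows = [{"id": i.id, "name": i.name, "provider": i.provider} for i in items]
print(json.dumps(rows, indent=2))
```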
Create an AI Integration
Integration names must be unique within the account.
OpenAI
```python
from arize.ai_integrations.types import AiIntegrationProvider

integration = client.ai_integrations.create(
    name="my-openai",
    provider=AiIntegrationProvider.OPENAI,
    api_key="sk-...",
    model_names=["gpt-4o", "gpt-4o-mini"],
)
print(integration.id)
```
Azure OpenAI
```python
from arize.ai_integrations.types import AiIntegrationProvider

integration = client.ai_integrations.create(
    name="my-azure-openai",
    provider=AiIntegrationProvider.AZUREOPENAI,
    api_key="your-azure-key",
    base_url="https://your-resource.openai.azure.com/",
    model_names=["gpt-4o"],
)
```
AWS Bedrock
For AWS Bedrock, `provider_metadata` must include the `"aws"` kind discriminator and a `role_arn`.

```python
from arize.ai_integrations.types import AiIntegrationProvider

integration = client.ai_integrations.create(
    name="my-bedrock",
    provider=AiIntegrationProvider.AWSBEDROCK,
    provider_metadata={
        "kind": "aws",
        "role_arn": "arn:aws:iam::123456789012:role/my-role",
    },
    model_names=["anthropic.claude-3-5-sonnet-20241022-v2:0"],
)
```
Vertex AI
For Vertex AI, `provider_metadata` must include the `"gcp"` kind discriminator plus `project_id`, `location`, and `project_access_label`.

```python
from arize.ai_integrations.types import AiIntegrationProvider

integration = client.ai_integrations.create(
    name="my-vertex-ai",
    provider=AiIntegrationProvider.VERTEXAI,
    provider_metadata={
        "kind": "gcp",
        "project_id": "my-gcp-project",
        "location": "us-central1",
        "project_access_label": "my-label",
    },
    model_names=["gemini-2.0-flash"],
)
```
Get an AI Integration
Retrieve a specific integration by name or ID.
```python
integration = client.ai_integrations.get(integration="your-integration-name-or-id")
print(integration.id, integration.name)
```
Update an AI Integration
Only the fields you pass are updated; omitted fields are left unchanged. To explicitly clear a nullable field (e.g. `api_key`), pass `None`.
```python
integration = client.ai_integrations.update(
    integration="your-integration-name-or-id",
    name="updated-name",
    model_names=["gpt-4o", "gpt-4o-mini", "o3-mini"],
)
print(integration.name)
```
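The partial-update rule above (omitted fields untouched, explicit `None` clears) can be illustrated with a toy local helper; this is a sketch of the semantics, not the client's implementation, and `apply_update` and its field names are hypothetical:

```python
# Sentinel distinguishing "argument not passed" from "passed None"
_UNSET = object()


def apply_update(current: dict, *, name=_UNSET, api_key=_UNSET, model_names=_UNSET) -> dict:
    """Toy model of partial-update semantics on a dict of fields."""
    updated = dict(current)
    for field, value in (("name", name), ("api_key", api_key), ("model_names", model_names)):
        if value is not _UNSET:
            updated[field] = value  # an explicit None clears the field
    return updated


record = {"name": "my-openai", "api_key": "sk-...", "model_names": ["gpt-4o"]}
print(apply_update(record, name="updated-name"))  # api_key and model_names untouched
print(apply_update(record, api_key=None))         # api_key explicitly cleared
```

The sentinel is what lets `None` mean "clear this field" rather than "leave it alone".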
Delete an AI Integration
Delete an integration by name or ID. This operation is irreversible.
```python
client.ai_integrations.delete(integration="your-integration-name-or-id")
print("Integration deleted")
```