The ax ai-integrations commands let you create and manage LLM provider integrations on the Arize platform. AI integrations store provider credentials and configuration used by evaluators and other Arize features.

ax ai-integrations list

List AI integrations, optionally filtered by space.
ax ai-integrations list [--space-id <id>] [--limit <n>] [--cursor <cursor>]
| Option | Description |
|---|---|
| `--space-id` | Filter integrations by space ID |
| `--limit` | Maximum number of results to return (default: 15) |
| `--cursor` | Pagination cursor for the next page |
Examples:
ax ai-integrations list
ax ai-integrations list --space-id sp_abc123 --output integrations.json
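To page through a large number of integrations, pass the cursor returned with the previous page back via `--cursor`. A sketch (the space ID is from the examples above; `<cursor>` is a placeholder for a real cursor value):

```shell
# First page of up to 50 integrations in a given space
ax ai-integrations list --space-id sp_abc123 --limit 50

# Next page, using the cursor returned with the previous page
ax ai-integrations list --space-id sp_abc123 --limit 50 --cursor <cursor>
```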

ax ai-integrations get

Get an AI integration by ID.
ax ai-integrations get <integration-id>
Example:
ax ai-integrations get ai_abc123

ax ai-integrations create

Create a new AI integration.
ax ai-integrations create --name <name> --provider <provider> [options]
| Option | Description |
|---|---|
| `--name` | Integration name (must be unique within the account) |
| `--provider` | LLM provider: `openAI`, `azureOpenAI`, `awsBedrock`, `vertexAI`, `anthropic`, `custom`, `nvidiaNim`, `gemini` |
| `--api-key` | API key for the provider (write-only, never returned) |
| `--base-url` | Custom base URL for the provider |
| `--model-name` | Supported model name (repeat for multiple, e.g. `--model-name gpt-4o --model-name gpt-4o-mini`) |
| `--enable-default-models` | Enable the provider’s default model list (flag) |
| `--function-calling-enabled` | Enable function/tool calling (flag) |
| `--auth-type` | Authentication type: `default`, `proxy_with_headers`, `bearer_token` |
| `--headers` | Custom headers as a JSON object or path to a JSON file |
| `--provider-metadata` | Provider-specific metadata as a JSON object or path to a JSON file |
Provider-specific requirements:
| Provider | Required `--provider-metadata` fields |
|---|---|
| `awsBedrock` | `{"role_arn": "..."}` |
| `vertexAI` | `{"project_id": "...", "location": "...", "project_access_label": "..."}` |
Examples:
# OpenAI integration
ax ai-integrations create \
  --name "OpenAI Production" \
  --provider openAI \
  --api-key sk-... \
  --model-name gpt-4o \
  --model-name gpt-4o-mini \
  --enable-default-models

# AWS Bedrock integration
ax ai-integrations create \
  --name "Bedrock Production" \
  --provider awsBedrock \
  --provider-metadata '{"role_arn": "arn:aws:iam::123456789:role/BedrockRole"}'
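Two further sketches using only flags documented above: a Vertex AI integration whose metadata fields follow the provider requirements table, and a custom provider behind an authenticating proxy. All names, URLs, and header values are placeholders, not prescribed by the CLI.

```shell
# Vertex AI integration -- metadata fields per the requirements table;
# the project, location, and label values are placeholders
ax ai-integrations create \
  --name "Vertex Production" \
  --provider vertexAI \
  --provider-metadata '{"project_id": "my-gcp-project", "location": "us-central1", "project_access_label": "prod"}'

# Custom provider behind an authenticating proxy -- the header name and
# base URL are illustrative assumptions
cat > headers.json <<'EOF'
{"X-Proxy-Token": "replace-me"}
EOF
ax ai-integrations create \
  --name "Internal Proxy" \
  --provider custom \
  --base-url https://llm-proxy.internal.example.com/v1 \
  --auth-type proxy_with_headers \
  --headers headers.json
```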

ax ai-integrations update

Update an AI integration by ID. Only the fields you provide are updated. At least one option must be specified.
ax ai-integrations update <integration-id> [options]
| Option | Description |
|---|---|
| `--name` | Updated integration name |
| `--provider` | Updated LLM provider |
| `--api-key` | New API key (write-only) |
| `--base-url` | Updated custom base URL |
| `--model-name` | Supported model names (repeat for multiple; replaces existing list) |
| `--enable-default-models` / `--no-enable-default-models` | Enable or disable the provider’s default model list |
| `--function-calling-enabled` / `--no-function-calling-enabled` | Enable or disable function calling |
| `--auth-type` | Updated authentication type |
| `--headers` | Custom headers as JSON or file path. Pass `null` to clear existing headers |
| `--provider-metadata` | Provider-specific metadata as JSON or file path. Pass `null` to clear |
| `--scopings` | Visibility scoping rules as a JSON array or file path (replaces all existing scopings) |
Examples:
ax ai-integrations update ai_abc123 --name "OpenAI Staging"
ax ai-integrations update ai_abc123 --api-key sk-new-key
ax ai-integrations update ai_abc123 --model-name gpt-4o --model-name gpt-4.1
ax ai-integrations update ai_abc123 --scopings '[{"space_id": "sp_abc"}]'
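The `null` sentinel and the `--no-*` boolean variants can be combined in one call, and JSON-valued options also accept a file path. A sketch (the integration ID and file name are placeholders):

```shell
# Clear custom headers and turn off the default model list in one update
ax ai-integrations update ai_abc123 --headers null --no-enable-default-models

# Replace all scoping rules from a file instead of inline JSON
ax ai-integrations update ai_abc123 --scopings scopings.json
```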

ax ai-integrations delete

Delete an AI integration. This operation is irreversible.
ax ai-integrations delete <integration-id> [--force]
| Option | Description |
|---|---|
| `--force` | Skip the confirmation prompt |
Examples:
ax ai-integrations delete ai_abc123
ax ai-integrations delete ai_abc123 --force
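Because `--force` skips the confirmation prompt, it is the variant to use in scripts. A minimal cleanup loop (the IDs are placeholders; deletion is irreversible):

```shell
# Delete several integrations non-interactively
for id in ai_abc123 ai_def456; do
  ax ai-integrations delete "$id" --force
done
```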