Custom LLM Models
Arize supports any model that exposes an OpenAI-compatible API.
Add your custom model endpoints to begin accessing your models in Arize's prompt playground. Arize uses the OpenAI client library to make calls to these endpoints.
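To qualify as OpenAI-compatible, your endpoint needs to accept the same request shape the OpenAI client sends. The sketch below shows a minimal chat-completions request body; the model name `my-model` is a placeholder, not a value Arize requires.

```python
import json

# Minimal body an OpenAI-compatible /chat/completions route must accept.
# "my-model" is a hypothetical model name for illustration.
chat_request = {
    "model": "my-model",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
}

body = json.dumps(chat_request)
print(body)
```

If your endpoint can parse and answer a request of this shape, it should work with the integration described below.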
Guide
Adding a Custom Model Provider
Navigate to Settings -> AI Providers
Click Add New Integration
Select the provider which your Custom Model endpoint is compatible with
If your endpoint uses the OpenAI SDK, select OpenAI; if it uses Azure OpenAI, select Azure
Fill in the details of your endpoint
Name: the name you wish to give your endpoint - this is the name it will appear under in other areas of the UI, such as the prompt playground
Base URL: the URL of your custom endpoint - Arize leverages the OpenAI client to make calls to your endpoint, so the relevant paths for chat (/chat/completions) and completions (/completions) are appended automatically
Model Name: Select Add Custom Model Name, then add the exact name of the provider's model
API Key: the key used to access your OpenAI-compatible endpoint
Headers: optional headers to add in your request
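Putting the settings above together, the sketch below shows roughly how a client such as the OpenAI SDK derives the final request URL and headers from your Base URL, API Key, and optional custom headers. The base URL, key, and `X-Team` header are placeholders, not values Arize prescribes.

```python
# Hypothetical values standing in for the integration settings above.
base_url = "https://my-endpoint.example.com/v1"
api_key = "sk-PLACEHOLDER"
extra_headers = {"X-Team": "ml-platform"}  # optional custom headers

# The chat path is appended to the base URL automatically.
url = base_url.rstrip("/") + "/chat/completions"

# The API key is sent as a bearer token; custom headers are merged in.
headers = {
    "Authorization": f"Bearer {api_key}",
    "Content-Type": "application/json",
    **extra_headers,
}

print(url)
```

Because the path is appended for you, the Base URL should stop at the API root (e.g. `.../v1`) rather than including `/chat/completions` itself.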
You can now access your endpoint in Arize's prompt playground
You can navigate to the prompt playground by choosing a generative LLM model that has prompts and responses and selecting a table row on the Performance Tracing tab, or by selecting Prompt Playground on an LLM span
Choose the model provider you've specified, then toggle the Use Chat switch on to use the chat endpoint or off to use the legacy completions endpoint
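The Use Chat toggle changes both the path and the request shape: the chat endpoint takes a list of messages, while the legacy completions endpoint takes a single prompt string. A minimal sketch of the difference, with `my-model` again as a placeholder:

```python
use_chat = True  # mirrors the "Use Chat" toggle in the playground

if use_chat:
    # Chat endpoint: messages list with roles.
    path = "/chat/completions"
    payload = {
        "model": "my-model",
        "messages": [{"role": "user", "content": "Hello!"}],
    }
else:
    # Legacy completions endpoint: a single prompt string.
    path = "/completions"
    payload = {"model": "my-model", "prompt": "Hello!"}

print(path)
```

If your custom endpoint implements only one of the two routes, set the toggle accordingly.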