LiteLLM
LiteLLM allows developers to call 100+ LLM APIs using the OpenAI format. LiteLLM Proxy is a proxy server that exposes those same models behind an OpenAI-compatible endpoint. Both are supported by this auto-instrumentation.
Any calls made to the following functions will be automatically captured by this integration:
completion()
acompletion()
completion_with_retries()
embedding()
aembedding()
image_generation()
aimage_generation()
Launch Phoenix
Sign up for Phoenix:
Sign up for an Arize Phoenix account at https://app.phoenix.arize.com/login
Install packages:
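A typical install pulls in LiteLLM itself, the Phoenix OTel helper, and the OpenInference instrumentor; the package names below are the current PyPI names, but check the Phoenix docs for your version.

```shell
pip install litellm arize-phoenix-otel openinference-instrumentation-litellm
```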
Set your Phoenix endpoint and API Key:
Your Phoenix API key can be found on the Keys section of your dashboard.
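One way to set both values is via environment variables before your app starts tracing. The variable names below follow Phoenix's OTel conventions; substitute the key from your dashboard (the placeholder values here are not real credentials).

```python
import os

# Assumed environment variable names per Phoenix OTel conventions;
# replace the placeholders with the values from your Keys page.
os.environ["PHOENIX_API_KEY"] = "your-phoenix-api-key"
os.environ["PHOENIX_COLLECTOR_ENDPOINT"] = "https://app.phoenix.arize.com"
```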
Install
Setup
Use the register function to connect your application to Phoenix:
Add any API keys needed by the models you are using with LiteLLM.
Run LiteLLM
You can now use LiteLLM as normal, and calls will be traced in Phoenix.
Observe
Traces should now be visible in Phoenix!
