LiteLLM Tracing
Instrument LiteLLM calls using OpenInference. Traces are fully OpenTelemetry compatible and can be sent to Arize for viewing.
LiteLLM lets developers call all LLM APIs using the OpenAI format, and LiteLLM Proxy is a proxy server for calling 100+ LLMs in the OpenAI format. Both are supported by this OpenInference auto-instrumentation.
Any calls made to the following functions will be automatically captured by this integration:
completion()
acompletion()
completion_with_retries()
embedding()
aembedding()
image_generation()
aimage_generation()
Launch Arize
To get started, sign up for a free Arize account and get your Space ID and API Key.
Install
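Install LiteLLM, the OpenInference instrumentation for LiteLLM, and the Arize OTel helper. The package names below follow the OpenInference and Arize naming conventions; check PyPI for the exact names if your environment differs:

```shell
pip install litellm openinference-instrumentation-litellm arize-otel
```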
Setup
Use the register function to connect your application to Arize and instrument LiteLLM.
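A minimal setup sketch, assuming the `arize-otel` package's `register` helper and the OpenInference `LiteLLMInstrumentor`; the project name and the environment variable names holding your credentials are illustrative:

```python
import os

# Assumed imports: arize-otel's register helper and the OpenInference
# LiteLLM instrumentor.
from arize.otel import register
from openinference.instrumentation.litellm import LiteLLMInstrumentor

# Connect to Arize with the Space ID and API Key from your account.
tracer_provider = register(
    space_id=os.environ["ARIZE_SPACE_ID"],  # hypothetical env var name
    api_key=os.environ["ARIZE_API_KEY"],    # hypothetical env var name
    project_name="litellm-tracing",         # illustrative project name
)

# Instrument LiteLLM so completion/embedding calls emit OpenTelemetry spans.
LiteLLMInstrumentor().instrument(tracer_provider=tracer_provider)
```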
Add any API keys needed by the models you are using with LiteLLM. For example, if using an OpenAI model:
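For example, the OpenAI key can be set as an environment variable before making any calls (the value below is a placeholder, not a real key):

```python
import os

# LiteLLM reads the provider key from the environment; replace the
# placeholder with your real OpenAI API key.
os.environ["OPENAI_API_KEY"] = "sk-your-openai-key"
```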
Run LiteLLM
You can now use LiteLLM as normal and calls will be traced in Arize.
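A sketch of a normal LiteLLM call that the instrumentation above would capture automatically; the model name and prompt are illustrative, and running it requires a valid provider API key:

```python
import litellm

# An ordinary completion call -- once instrumented, this emits a trace
# to Arize with no extra code.
response = litellm.completion(
    model="gpt-4o-mini",  # illustrative model name
    messages=[{"role": "user", "content": "Say hello."}],
)
print(response.choices[0].message.content)
```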
Observe
Traces should now be visible in your Arize account!