Instrument LLM calls to AWS Bedrock via the boto3 client using the BedrockInstrumentor
Amazon Bedrock Agents allow you to easily define, deploy, and manage agents on your AWS infrastructure. Traces on invocations of these agents can be captured using OpenInference and viewed in Phoenix.
This instrumentation will capture data on LLM calls, action group invocations (as tools), knowledge base lookups, and more.
import os

# Add Phoenix API Key for tracing
PHOENIX_API_KEY = "ADD YOUR API KEY"
os.environ["PHOENIX_CLIENT_HEADERS"] = f"api_key={PHOENIX_API_KEY}"
os.environ["PHOENIX_COLLECTOR_ENDPOINT"] = "https://app.phoenix.arize.com"
Your Phoenix API key can be found in the Keys section of your dashboard.
Launch your local Phoenix instance:
pip install arize-phoenix
phoenix serve
For details on customizing a local terminal deployment, see Terminal Setup.
docker run -p 6006:6006 arizephoenix/phoenix:latest
This will expose Phoenix on localhost:6006.
Install packages:
pip install arize-phoenix-otel
Set your Phoenix endpoint:
import os
os.environ["PHOENIX_COLLECTOR_ENDPOINT"] = "http://localhost:6006"
For more info on using Phoenix with Docker, see Docker.
Install packages:
pip install arize-phoenix
Launch Phoenix:
import phoenix as px
px.launch_app()
By default, notebook instances do not have persistent storage, so your traces will disappear after the notebook is closed. See self-hosting or use one of the other deployment options to retain traces.
Install
pip install openinference-instrumentation-bedrock
Setup
Connect to your Phoenix instance using the register function.
from phoenix.otel import register

# configure the Phoenix tracer
tracer_provider = register(
    project_name="my-llm-app",  # Default is 'default'
    auto_instrument=True,  # Auto-instrument your app based on installed OI dependencies
)
After connecting to your Phoenix server, instrument boto3 before initializing a bedrock-runtime client. All clients created after instrumentation will send traces for every call to invoke_model, invoke_agent, and their streaming variants.
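A minimal end-to-end sketch, assuming AWS credentials are configured and your account has access to the model referenced below (the model ID, region, and prompt are illustrative placeholders):

import json

import boto3
from openinference.instrumentation.bedrock import BedrockInstrumentor

# Explicit instrumentation; redundant if register(..., auto_instrument=True) already ran
BedrockInstrumentor().instrument(tracer_provider=tracer_provider)

# Clients created after instrumentation emit traces automatically
client = boto3.client("bedrock-runtime", region_name="us-east-1")

response = client.invoke_model(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",  # illustrative model ID
    body=json.dumps(
        {
            "anthropic_version": "bedrock-2023-05-31",
            "max_tokens": 256,
            "messages": [{"role": "user", "content": "Hello, Bedrock!"}],
        }
    ),
)
print(json.loads(response["body"].read()))

The resulting span should appear in Phoenix under the project you configured with register.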
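For Bedrock Agents, a similar sketch applies, assuming you have already created an agent and alias (the agent ID, alias ID, session ID, and prompt below are placeholders). Note that invoke_agent is exposed on the bedrock-agent-runtime client:

agent_client = boto3.client("bedrock-agent-runtime", region_name="us-east-1")

response = agent_client.invoke_agent(
    agentId="YOUR_AGENT_ID",  # placeholder
    agentAliasId="YOUR_AGENT_ALIAS_ID",  # placeholder
    sessionId="session-1",
    inputText="What can you help me with?",
    enableTrace=True,  # include the agent's step-by-step trace events in the response stream
)

# The completion arrives as an event stream; text chunks carry the agent's reply
for event in response["completion"]:
    if "chunk" in event:
        print(event["chunk"]["bytes"].decode("utf-8"))

Action group invocations and knowledge base lookups made during the agent run show up as child spans of the invoke_agent span in Phoenix.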