Amazon Bedrock Tracing
Instrument LLM calls to AWS Bedrock via the boto3 client using OpenInference and view traces in Arize.
boto3 provides Python bindings to AWS services, including Bedrock, which provides access to a number of foundation models. Calls to these models can be instrumented using OpenInference, enabling OpenTelemetry-compliant observability of applications built using these models. Traces collected using OpenInference can be viewed in Arize.
OpenInference traces collect telemetry data about the execution of your LLM application. Consider using this instrumentation to understand how Bedrock-managed models are being called inside a complex system and to troubleshoot issues such as extraction and response synthesis.
Launch Arize
To get started, sign up for a free Arize account and get your Space ID and API Key.
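Rather than hardcoding credentials, you may prefer to read them from the environment. A minimal sketch, assuming the variable names ARIZE_SPACE_ID and ARIZE_API_KEY (these names are illustrative, not required by the SDK):

import os

# Illustrative variable names -- any configuration mechanism that keeps
# credentials out of source control will work
SPACE_ID = os.environ["ARIZE_SPACE_ID"]
API_KEY = os.environ["ARIZE_API_KEY"]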
Install
pip install openinference-instrumentation-bedrock arize-otel

Setup
Connect to Arize using the register function.
# Import open-telemetry dependencies
from arize.otel import register

# Setup OTel via our convenience function
tracer_provider = register(
    space_id="your-space-id",         # in app space settings page
    api_key="your-api-key",           # in app space settings page
    project_name="your-project-name", # name this to whatever you would like
)

# Import the automatic instrumentor from OpenInference
from openinference.instrumentation.bedrock import BedrockInstrumentor

# Start the instrumentor for Bedrock
BedrockInstrumentor().instrument(tracer_provider=tracer_provider)

After instrumentation, initialize your boto3 client. All clients created after instrumentation will send traces on all calls to invoke_model.
import boto3
import os    # For environment variables
import json  # For handling the response body

# Ensure AWS credentials and region are set, e.g., via environment variables
# or other configuration methods compatible with boto3.
# Example assuming environment variables:
session = boto3.session.Session(
    aws_access_key_id=os.environ.get("AWS_ACCESS_KEY_ID"),
    aws_secret_access_key=os.environ.get("AWS_SECRET_ACCESS_KEY"),
    aws_session_token=os.environ.get("AWS_SESSION_TOKEN"),  # Optional, depending on your auth
    region_name=os.environ.get("AWS_REGION_NAME"),
)
client = session.client("bedrock-runtime")

Run Bedrock
From here you can run Bedrock as normal. The Arize example covers both the Converse API and the Invoke Model API; we'll show the Invoke Model API here.
prompt = b'{"prompt": "Human: Hello there, how are you? Assistant:", "max_tokens_to_sample": 1024}'

response = client.invoke_model(modelId="anthropic.claude-v2", body=prompt)
response_body = json.loads(response.get("body").read())
print(response_body["completion"])

⚠️ Warning: Use converse instead of invoke_model for Meta models on Amazon Bedrock.
Outputs from Meta models (such as Llama 3) are not currently traced when using the invoke_model API.
This is a known issue, and a fix is in progress.
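For reference, a minimal sketch of the equivalent call through the Converse API; the prompt text and maxTokens value are illustrative:

# Converse API equivalent of the invoke_model call above
response = client.converse(
    modelId="anthropic.claude-v2",
    messages=[{"role": "user", "content": [{"text": "Hello there, how are you?"}]}],
    inferenceConfig={"maxTokens": 1024},
)
print(response["output"]["message"]["content"][0]["text"])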
Observe
Now that you have tracing set up, all calls to invoke_model (or converse, if using that API) will be streamed to your Arize account for observability and evaluation.
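If you need to turn tracing off later (for example, in tests), OpenInference instrumentors follow the standard OpenTelemetry instrumentor interface, which includes an uninstrument method:

# Stop tracing Bedrock calls
BedrockInstrumentor().uninstrument()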