Anthropic Tracing
Instrument LLM calls made using Anthropic's SDK and view traces in Arize.
Anthropic is a leading provider of state-of-the-art LLMs. The Anthropic SDK can be instrumented using the openinference-instrumentation-anthropic package, and the resulting traces can be viewed in Arize.
Launch Arize
To get started, sign up for a free Arize account and get your Space ID and API Key.
Install
pip install openinference-instrumentation-anthropic anthropic arize-otel opentelemetry-sdk
API Key Setup
Ensure your Anthropic API key is configured. The SDK typically uses the ANTHROPIC_API_KEY environment variable:
export ANTHROPIC_API_KEY='your-anthropic-api-key'
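If you want to fail fast when the key is missing, you can check for it in Python before making any calls. A minimal sketch using only the standard library (the variable name matches the SDK default shown above):

import os

# Fail fast if the Anthropic API key is not configured
if not os.environ.get("ANTHROPIC_API_KEY"):
    raise RuntimeError("ANTHROPIC_API_KEY is not set")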
Setup
Use the register function to connect your application to Arize and instrument the Anthropic client:
# Import open-telemetry dependencies
from arize.otel import register

# Setup OTel via our convenience function
tracer_provider = register(
    space_id="your-space-id",         # in app space settings page
    api_key="your-api-key",           # in app space settings page
    project_name="your-project-name", # name this to whatever you would like
)

# Import the OpenInference instrumentor to map Anthropic traces to a standard format
from openinference.instrumentation.anthropic import AnthropicInstrumentor

# Turn on the instrumentor
AnthropicInstrumentor().instrument(tracer_provider=tracer_provider)
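If you prefer not to hard-code credentials, you can read them from environment variables before calling register. A minimal sketch (the ARIZE_SPACE_ID and ARIZE_API_KEY variable names here are illustrative, not required by the SDK):

import os
from arize.otel import register

# Read Arize credentials from the environment (variable names are illustrative)
tracer_provider = register(
    space_id=os.environ["ARIZE_SPACE_ID"],
    api_key=os.environ["ARIZE_API_KEY"],
    project_name="your-project-name",
)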
Run Anthropic
A simple Anthropic application that is now instrumented:
import anthropic

client = anthropic.Anthropic()  # The client will use the ANTHROPIC_API_KEY environment variable

message = client.messages.create(
    model="claude-3-5-sonnet-20240620",
    max_tokens=1000,
    temperature=0,
    messages=[
        {
            "role": "user",
            "content": [
                {
                    "type": "text",
                    "text": "Why is the ocean salty?"
                }
            ]
        }
    ]
)
print(message.content)
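Streaming calls go through the same instrumented client. A minimal sketch using the SDK's streaming helper (assuming your installed instrumentor version also captures streaming requests):

# Stream a response with the same instrumented client
with client.messages.stream(
    model="claude-3-5-sonnet-20240620",
    max_tokens=1000,
    messages=[{"role": "user", "content": "Why is the ocean salty?"}],
) as stream:
    for text in stream.text_stream:
        print(text, end="", flush=True)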
Observe
Now that you have tracing set up, all invocations will be streamed to your Arize account for observability and evaluation.
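In short-lived scripts the process can exit before buffered spans are exported. If traces appear to be missing, flushing the tracer provider before exit usually resolves it; a minimal sketch, assuming register returns an OpenTelemetry SDK TracerProvider (which exposes force_flush):

# Export any buffered spans before the process exits
tracer_provider.force_flush()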