Anthropic is a leading provider of state-of-the-art LLMs. The Anthropic SDK can be instrumented using the openinference-instrumentation-anthropic package.
Install the instrumentation package alongside the Anthropic SDK:

```bash
pip install openinference-instrumentation-anthropic anthropic
```

Use the register function to connect your application to Phoenix:
```python
from phoenix.otel import register

# configure the Phoenix tracer
tracer_provider = register(
    project_name="my-llm-app",  # Default is 'default'
    auto_instrument=True,  # Auto-instrument your app based on installed OI dependencies
)
```
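If you prefer explicit instrumentation over auto_instrument=True, the OpenInference package also exposes an instrumentor you can call directly. A minimal sketch, assuming the AnthropicInstrumentor class shipped by openinference-instrumentation-anthropic (verify the import path against your installed version):

```python
# Explicit alternative to auto_instrument=True: instrument the Anthropic SDK directly.
# Assumes openinference-instrumentation-anthropic is installed.
from openinference.instrumentation.anthropic import AnthropicInstrumentor

AnthropicInstrumentor().instrument(tracer_provider=tracer_provider)
```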
A simple Anthropic application that is now instrumented:

```python
import anthropic

client = anthropic.Anthropic()
message = client.messages.create(
    model="claude-3-5-sonnet-20240620",
    max_tokens=1000,
    temperature=0,
    messages=[
        {
            "role": "user",
            "content": [
                {
                    "type": "text",
                    "text": "Why is the ocean salty?"
                }
            ]
        }
    ]
)
print(message.content)
```

Now that you have tracing set up, all invocations of your pipeline will be streamed to your running Phoenix instance for observability and evaluation.
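If you don't already have a Phoenix instance receiving traces, one option is to launch the open-source Phoenix app locally before registering the tracer. A minimal sketch, assuming the arize-phoenix package is installed and using its default local collector address (set PHOENIX_COLLECTOR_ENDPOINT instead if you are sending traces to a remote or hosted instance):

```python
# Optional: launch a local Phoenix instance to receive the traces.
# Assumes `pip install arize-phoenix`; the default UI/collector runs on port 6006.
import phoenix as px

session = px.launch_app()
print(session.url)  # open this URL in a browser to inspect incoming traces
```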