MistralAI Tracing
Instrument LLM calls made using MistralAI's SDK with OpenInference and view traces in Arize.
MistralAI is a leading provider of state-of-the-art LLMs. The MistralAI SDK can be instrumented using the openinference-instrumentation-mistralai package, and the resulting traces can be viewed in Arize.
Launch Arize
To get started, sign up for a free Arize account and get your Space ID and API Key.
Install
pip install openinference-instrumentation-mistralai mistralai arize-otel
API Key Setup
Set the MISTRAL_API_KEY environment variable to authenticate calls made using the SDK.
export MISTRAL_API_KEY='your_mistral_api_key'
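If you want your script to fail fast when the key is missing rather than erroring mid-request, a small guard helps. The sketch below is illustrative only; the `require_env` helper is our own name and is not part of the Mistral or Arize SDKs:

```python
import os

def require_env(name: str) -> str:
    """Return an environment variable's value, or raise a clear error if unset."""
    value = os.environ.get(name)
    if not value:
        raise EnvironmentError(
            f"{name} is not set. Export it before running, "
            f"e.g. export {name}='your_key'"
        )
    return value

# Fail fast before constructing the Mistral client:
# api_key = require_env("MISTRAL_API_KEY")
```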
Setup
Connect to Arize using the register function and instrument the MistralAI client.
# Import open-telemetry dependencies
from arize.otel import register
# Setup OTel via our convenience function
tracer_provider = register(
    space_id="your-space-id",            # in the app space settings page
    api_key="your-api-key",              # in the app space settings page
    project_name="your-project-name",    # name this whatever you would like
)
# Import openinference instrumentor to map Mistral traces to a standard format
from openinference.instrumentation.mistralai import MistralAIInstrumentor
# Turn on the instrumentor
MistralAIInstrumentor().instrument(tracer_provider=tracer_provider)
Run Mistral
Now you can use the MistralAI client as usual. It will automatically pick up the MISTRAL_API_KEY from your environment.
import os
from mistralai import Mistral
from mistralai.models import UserMessage
api_key = os.environ["MISTRAL_API_KEY"]
model = "mistral-tiny"
client = Mistral(api_key=api_key)
chat_response = client.chat.complete(
    model=model,
    messages=[UserMessage(content="What is the best French cheese?")],
)
print(chat_response.choices[0].message.content)
Observe
Now that you have tracing set up, all invocations of Mistral (completions, chat completions, embeddings) will be streamed to your Arize account for observability and evaluation.