Spring AI Tracing
How to use OpenInference instrumentation with Spring AI and export traces to Arize AX.
Prerequisites
Java 17 or higher (required by Spring AI and Spring Boot 3)
Arize AX account
Add Dependencies
1. Gradle
Add the dependencies to your `build.gradle`:

```groovy
dependencies {
    implementation 'org.springframework.ai:spring-ai-starter-model-openai'
    implementation 'io.micrometer:micrometer-tracing-bridge-brave:1.5.1'
    implementation project(path: ':instrumentation:openinference-instrumentation-springAI')

    // OpenTelemetry
    implementation "io.opentelemetry:opentelemetry-sdk"
    implementation "io.opentelemetry:opentelemetry-exporter-otlp"
    implementation "io.opentelemetry:opentelemetry-exporter-logging"

    testImplementation 'org.springframework.boot:spring-boot-starter-test'
    testRuntimeOnly 'org.junit.platform:junit-platform-launcher'
}
```
Set Up Arize Credentials
```bash
export ARIZE_API_KEY="your-arize-api-key"
export ARIZE_SPACE_ID="your-arize-space-id"
```
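Because `Map.of` rejects null values, a missing variable would only surface later, when the exporter first builds its headers. A fail-fast check at startup gives a clearer error. This is a minimal sketch; the `CredentialsCheck` class and `requireEnv` helper are illustrative, not part of the instrumentation library:

```java
public class CredentialsCheck {

    /** Returns the value of a required environment variable, or throws with a clear message. */
    public static String requireEnv(String name) {
        String value = System.getenv(name);
        if (value == null || value.isBlank()) {
            throw new IllegalStateException("Missing required environment variable: " + name);
        }
        return value;
    }

    public static void main(String[] args) {
        // Call before initializeOpenTelemetry() so a missing key fails at startup
        // rather than when the exporter builds its headers.
        String apiKey = requireEnv("ARIZE_API_KEY");
        String spaceId = requireEnv("ARIZE_SPACE_ID");
        System.out.println("Arize credentials present");
    }
}
```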
Configuration for Arize Tracing
```java
private static void initializeOpenTelemetry() {
    // Create a resource that identifies this service and project
    Resource resource = Resource.getDefault()
        .merge(Resource.create(Attributes.of(
            AttributeKey.stringKey("service.name"), "spring-ai",
            AttributeKey.stringKey(SEMRESATTRS_PROJECT_NAME), "spring-ai-project",
            AttributeKey.stringKey("service.version"), "0.1.0")));

    String apiKey = System.getenv("ARIZE_API_KEY");
    String spaceId = System.getenv("ARIZE_SPACE_ID");

    // Set up the OTLP exporter with the Arize endpoint and auth headers
    OtlpGrpcSpanExporter otlpExporter = OtlpGrpcSpanExporter.builder()
        .setEndpoint("https://otlp.arize.com/v1")
        .setHeaders(() -> Map.of(
            "api_key", apiKey,
            "arize-space-id", spaceId))
        .build();

    tracerProvider = SdkTracerProvider.builder()
        .addSpanProcessor(BatchSpanProcessor.builder(otlpExporter)
            .setScheduleDelay(Duration.ofSeconds(1))
            .build())
        .addSpanProcessor(SimpleSpanProcessor.create(LoggingSpanExporter.create()))
        .setResource(resource)
        .build();

    // Build and register the OpenTelemetry SDK globally
    OpenTelemetrySdk.builder()
        .setTracerProvider(tracerProvider)
        .setPropagators(ContextPropagators.create(W3CTraceContextPropagator.getInstance()))
        .buildAndRegisterGlobal();

    System.out.println("OpenTelemetry initialized. Traces will be sent to Arize");
}
```
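The `BatchSpanProcessor` above exports on a schedule (one second here), so a short-lived application can exit before queued spans are flushed. Calling `tracerProvider.shutdown()` before exit, or from a JVM shutdown hook, flushes pending spans. Here is a stdlib-only sketch of the hook wiring; the flush is modeled as a `Runnable` so the example stays self-contained, and in a real application it would invoke `tracerProvider.shutdown()`:

```java
public class TracingShutdown {

    /**
     * Registers a JVM shutdown hook that runs the given flush action on exit.
     * In the real application, pass () -> tracerProvider.shutdown() here.
     */
    public static Thread flushOnExit(Runnable flush) {
        Thread hook = new Thread(flush, "otel-flush");
        Runtime.getRuntime().addShutdownHook(hook);
        return hook;
    }

    public static void main(String[] args) {
        flushOnExit(() -> System.out.println("flushing spans before exit"));
        // ... application work; the hook runs when the JVM terminates normally
    }
}
```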
Run Spring AI
Once your application is instrumented, spans are created on every run and sent to Arize for collection.
```java
import com.arize.instrumentation.springAI.SpringAIInstrumentor;
import io.micrometer.observation.ObservationRegistry;
import org.springframework.ai.chat.model.ChatResponse;
import org.springframework.ai.chat.prompt.Prompt;
import org.springframework.ai.openai.OpenAiChatModel;
import org.springframework.ai.openai.OpenAiChatOptions;
import org.springframework.ai.openai.api.OpenAiApi;

// 1. Initialize OpenTelemetry (see configuration above)
initializeOpenTelemetry();

// 2. Create OITracer + instrumentor
OITracer tracer = new OITracer(tracerProvider.get("com.example.springai"), TraceConfig.getDefault());
ObservationRegistry registry = ObservationRegistry.create();
registry.observationConfig().observationHandler(new SpringAIInstrumentor(tracer));

// 3. Build Spring AI model
String apiKey = System.getenv("OPENAI_API_KEY");
OpenAiApi openAiApi = OpenAiApi.builder().apiKey(apiKey).build();
OpenAiChatOptions options = OpenAiChatOptions.builder().model("gpt-4").build();
OpenAiChatModel model = OpenAiChatModel.builder()
    .openAiApi(openAiApi)
    .defaultOptions(options)
    .observationRegistry(registry)
    .build();

// 4. Use it — traces are automatically created
ChatResponse response = model.call(new Prompt("What is the capital of France?"));
System.out.println("Response: " + response.getResult().getOutput().getContent());
```
Observe
Once configured, your OpenInference traces will be automatically sent to Arize where you can:
Monitor Performance: Track latency and errors
Analyze Usage: View token usage, model performance, and cost metrics
Debug Issues: Trace request flows and identify bottlenecks
Evaluate Quality: Run evaluations on your LLM outputs