The openinference-instrumentation-annotation package adds annotation-driven tracing to any Java application. Annotate methods with @Chain, @LLM, @Tool, @Agent, or @Span, and a ByteBuddy agent intercepts them at class-load time to produce OpenInference spans backed by OpenTelemetry. This is useful when you've built an agent or pipeline by hand and want OpenInference semantics without a framework-specific instrumentor.
Prerequisites
- Java 17 or higher
- OpenTelemetry Java SDK 1.49.0 or higher
- Arize AX account
Add Dependencies
Gradle
dependencies {
    implementation 'com.arize:openinference-instrumentation-annotation:0.1.0'
}
Maven
<dependency>
    <groupId>com.arize</groupId>
    <artifactId>openinference-instrumentation-annotation</artifactId>
    <version>0.1.0</version>
</dependency>
Setup Arize Credentials
export ARIZE_API_KEY="your-arize-api-key"
export ARIZE_SPACE_ID="your-arize-space-id"
Configuration for Arize Tracing
Install the ByteBuddy agent before any annotated classes are loaded, configure OpenTelemetry to export to Arize, then register the OITracer.
import com.arize.instrumentation.OITracer;
import com.arize.instrumentation.OpenInferenceAgent;
import com.arize.instrumentation.annotation.OpenInferenceAgentInstaller;
import io.opentelemetry.api.common.AttributeKey;
import io.opentelemetry.api.common.Attributes;
import io.opentelemetry.exporter.otlp.trace.OtlpGrpcSpanExporter;
import io.opentelemetry.sdk.resources.Resource;
import io.opentelemetry.sdk.trace.SdkTracerProvider;
import io.opentelemetry.sdk.trace.export.BatchSpanProcessor;
import java.util.Map;
// Install the ByteBuddy agent BEFORE loading any annotated classes
OpenInferenceAgentInstaller.install();
String apiKey = System.getenv("ARIZE_API_KEY");
String spaceId = System.getenv("ARIZE_SPACE_ID");
Resource resource = Resource.getDefault()
        .merge(Resource.create(Attributes.of(
                AttributeKey.stringKey("service.name"), "annotation-app",
                AttributeKey.stringKey("openinference.project.name"), "my-annotation-app")));

OtlpGrpcSpanExporter exporter = OtlpGrpcSpanExporter.builder()
        .setEndpoint("https://otlp.arize.com/v1")
        .setHeaders(() -> Map.of(
                "api_key", apiKey,
                "arize-space-id", spaceId))
        .build();

SdkTracerProvider tracerProvider = SdkTracerProvider.builder()
        .addSpanProcessor(BatchSpanProcessor.builder(exporter).build())
        .setResource(resource)
        .build();

OITracer tracer = new OITracer(tracerProvider.get("annotation-app"));
OpenInferenceAgent.register(tracer);
To attach the agent at JVM startup instead (no OpenInferenceAgentInstaller.install() call required), build the jar from the openinference repo and pass it via -javaagent:
java -javaagent:./openinference-instrumentation-annotation.jar \
-cp your-app.jar com.example.Main
You still need to construct and register an OITracer during startup so the agent has a tracer to route spans through.
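For example, a minimal entry point when attaching via -javaagent. This is a sketch: buildTracerProvider() is a hypothetical helper standing in for the exporter and provider wiring shown above.
public class Main {
    public static void main(String[] args) {
        // The -javaagent flag already installed the ByteBuddy agent,
        // so no OpenInferenceAgentInstaller.install() call is needed.
        SdkTracerProvider tracerProvider = buildTracerProvider(); // hypothetical helper; see setup above
        OpenInferenceAgent.register(new OITracer(tracerProvider.get("annotation-app")));
        // ... start the application
    }
}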
Run an Annotated Application
Annotate your methods with the span kind that fits each step. Parameters are automatically captured as input.value and the return value as output.value.
import com.arize.instrumentation.annotation.*;
import java.util.Map;

public class QAService {

    @Agent(name = "qa-agent")
    public String answer(String question) {
        String context = retrieve(question);
        Map<String, Object> weather = getWeather("San Francisco");
        return generate(question, context, weather);
    }

    @Chain(name = "retriever")
    public String retrieve(String query) {
        return "OpenInference is an open standard for AI tracing.";
    }

    @Tool(name = "weather", description = "Gets current weather for a location")
    public Map<String, Object> getWeather(String location) {
        return Map.of("temp", 68, "condition", "foggy", "location", location);
    }

    @LLM(name = "generator")
    public String generate(String question, String context, @ExcludeFromSpan Map<String, Object> weather) {
        // @ExcludeFromSpan keeps the weather parameter out of the captured input
        return callLLM(question, context);
    }

    // Stub standing in for a real model call so the example compiles
    private String callLLM(String question, String context) {
        return "Answer to \"" + question + "\" based on: " + context;
    }
}
Calling service.answer("What is OpenInference?") produces a nested trace:
qa-agent (AGENT)
├── retriever (CHAIN)
├── weather (TOOL)
└── generator (LLM)
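For reference, a minimal driver that produces this trace, assuming the agent installation and tracer registration from the Configuration section have already run:
public class QADemo {
    public static void main(String[] args) {
        // The agent is installed and the OITracer registered before
        // QAService is loaded, so its annotations are intercepted.
        QAService service = new QAService();
        System.out.println(service.answer("What is OpenInference?"));
    }
}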
Available annotations
| Annotation | Span Kind | Notes |
|---|---|---|
| @Chain | CHAIN | — |
| @LLM | LLM | — |
| @Tool | TOOL | Accepts description |
| @Agent | AGENT | — |
| @Span | Any kind | Requires kind; use for RETRIEVER, EMBEDDING, etc. |
All annotations accept name, mapping, and outputMapping. Use @ExcludeFromSpan to drop a parameter from auto-captured input, and @SpanMapping to map parameters or return-value fields to specific OpenInference semantic convention attributes.
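The sketch below illustrates both escape hatches. The element shapes assumed here (kind as a string on @Span, a single attribute-name value on @SpanMapping) are illustrative assumptions; check the annotation definitions in the package for the exact signatures.
import com.arize.instrumentation.annotation.Span;
import com.arize.instrumentation.annotation.SpanMapping;
import java.util.List;

public class DocRetriever {

    // Hypothetical usage: there is no dedicated @Retriever annotation,
    // so @Span supplies the kind explicitly; the assumed @SpanMapping
    // target binds the query parameter to the input.value attribute.
    @Span(name = "doc-retriever", kind = "RETRIEVER")
    public List<String> search(@SpanMapping("input.value") String query) {
        return List.of("doc-1", "doc-2");
    }
}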
Parameter mappings rely on Java’s -parameters compiler flag. If your application isn’t compiled with that flag, reference the generated arg0, arg1, … names in your @SpanMapping annotations instead.
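With Gradle, the flag can be enabled for every compile task (standard javac/Gradle configuration, independent of this package):
// build.gradle: retain real parameter names in class files so mappings
// can reference them instead of the synthetic arg0, arg1, ... names
tasks.withType(JavaCompile).configureEach {
    options.compilerArgs += '-parameters'
}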
Shutdown
OpenInferenceAgent.unregister();
tracerProvider.forceFlush();
tracerProvider.shutdown();
Async and reactive frameworks
The ByteBuddy agent wraps each annotated method in an OpenTelemetry span by intercepting the call on the calling thread. OpenTelemetry context does not automatically follow execution across thread boundaries, so the annotation library can lose parent/child relationships when used with:
- Reactive frameworks (Project Reactor, RxJava, Mutiny)
- CompletableFuture / ExecutorService-based async chains
- Kotlin coroutines
- Virtual threads where work is offloaded to a separate scheduler
If you need annotated tracing in these environments, propagate context explicitly with io.opentelemetry.context.Context.current().wrap(...) (or the equivalent reactor / coroutine helpers) when handing work to another thread, or fall back to the programmatic span API where you control span lifetimes directly. We recommend validating trace shape in a test environment before relying on annotations alone for async code paths.
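For example, with an ExecutorService (this uses the standard OpenTelemetry context API; service is assumed to be the QAService instance from the example above):
import io.opentelemetry.context.Context;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

ExecutorService executor = Executors.newFixedThreadPool(4);

// Capture the current context on the calling thread and bind it to the
// task, so the span created inside retrieve() keeps its parent.
Runnable task = Context.current().wrap(() -> service.retrieve("some query"));
executor.submit(task);
Alternatively, Context.taskWrapping(executor) returns an executor that wraps every submitted task automatically.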
Configuration
Suppress sensitive fields with TraceConfig:
import com.arize.instrumentation.TraceConfig;
TraceConfig config = TraceConfig.builder()
        .hideInputs(true)
        .hideOutputs(true)
        .hideInputMessages(true)
        .hideOutputMessages(true)
        .hideToolParameters(true)
        .hideOutputEmbeddings(true)
        .build();

OITracer tracer = new OITracer(tracerProvider.get("annotation-app"), config);
Observe
Once configured, your annotated methods will emit OpenInference spans to Arize where you can:
- Monitor Performance: Track latency and errors per chain, tool, or LLM call
- Analyze Usage: View token usage and model metadata
- Debug Issues: Walk the agent → chain → tool → LLM hierarchy for each request
- Evaluate Quality: Run evaluations on captured inputs and outputs
Full example and source: openinference-instrumentation-annotation on GitHub.