Annotations API

Overview

Some users have internal workflows for annotating LLM data and need to streamline the process of exporting those annotated examples to the Arize platform. An example scenario involves non-technical subject matter experts using tools like Google Sheets to label LLM output with evaluation metrics. These labels can then be reviewed and analyzed by the technical team within the Arize platform. The GraphQL API enables users to programmatically export these annotation labels to Arize, facilitating further analysis and deeper insights.

For additional details on how to use annotations in the Arize platform, refer to the Annotations section of the docs.

Example of Annotations Data

Testing the GraphQL API Call

Users can test the annotations API call by navigating to https://app.arize.com/graphql.

UI to test GraphQL Calls

Mutation API to Add Annotations

Users can use a GraphQL mutation to add an annotation programmatically.

mutation addAnnotations(
  $name: String!,
  $updatedBy: String,
  $label: String,
  $score: Float,
  $annotationType: AnnotationType!,
  $modelId: ID!,
  $note: String!,
  $modelEnvironment: ModelEnvironmentName!,
  $recordId: String!,
  $startTime: DateTime!
) {
  updateAnnotations(
    input: {
      modelId: $modelId,
      note: {
        text: $note
      },
      annotations: {
        name: $name,
        updatedBy: $updatedBy,
        label: $label,
        score: $score,
        annotationType: $annotationType
      },
      modelEnvironment: $modelEnvironment,
      recordId: $recordId,
      startTime: $startTime
    } 
  ) {
    clientMutationId
  }
}

The Variables section contains the metadata that needs to be passed in:

{
  "name": "LLM Response Adequacy",
  "updatedBy": "Jane Doe",
  "label":"Needs Improvement",
  "annotationType": "label",
  "modelId": "TW9kZWw6MzExMjk3NjEwaweDpTZXdq",
  "note": "Although technically correct, this response lacks detail and does not sufficiently address the prompt.",
  "modelEnvironment": "tracing",
  "recordId": "e351b661b957e727",
  "startTime": "2024-11-15T10:15:30Z"
}

Each variable passed in above is described below:

name

Required. A string defining the name of the metric, e.g. "LLM Response Adequacy".

updatedBy

Optional. A string identifying the user who is adding the annotation, such as a name or email address.

label

Optional. The binary annotation label associated with the span, e.g. "Satisfactory" vs. "Needs Improvement". Note that users may populate "label" or "score", but not both.

annotationType

Required. An enum specifying the type of the annotation, either "score" or "label". If "label", then the "label" field must also be populated. If "score", then the "score" field must be populated.

modelId

Required. The ID of the model associated with this annotation (can be retrieved from the URL).

note

Optional. A string containing a note from the annotator. Note that the example mutation above declares $note as non-null, so remove the note block from the mutation if you do not want to attach a note.

modelEnvironment

Required. Must be one of: production, validation, training, or tracing.

recordId

Required. The span ID of the span/trace to update annotations on.

startTime

Required. Should be at most 24 hours before the record start time. This is a filter applied to reduce the search space when looking for the record.
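For a score-type annotation, the variables would instead populate "score" and omit "label" (the score value below is illustrative; the other fields are reused from the example above):

```json
{
  "name": "LLM Response Adequacy",
  "updatedBy": "Jane Doe",
  "score": 0.75,
  "annotationType": "score",
  "modelId": "TW9kZWw6MzExMjk3NjEwaweDpTZXdq",
  "note": "Partially addresses the prompt.",
  "modelEnvironment": "tracing",
  "recordId": "e351b661b957e727",
  "startTime": "2024-11-15T10:15:30Z"
}
```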
