Use the classification helpers when you want an LLM to choose from a fixed set of labels and return a structured explanation.

Create a Classifier Function

import { openai } from "@ai-sdk/openai";
import { createClassifierFn } from "@arizeai/phoenix-evals";

const classify = createClassifierFn({
  model: openai("gpt-4o-mini"),
  // Map each allowed label to the numeric score it should yield.
  choices: { relevant: 1, irrelevant: 0 },
  // {{input}}, {{context}}, and {{output}} are filled from the call arguments.
  promptTemplate:
    "Question: {{input}}\nContext: {{context}}\nAnswer: {{output}}\nLabel as relevant or irrelevant.",
});

const result = await classify({
  input: "What is Phoenix?",
  context: "Phoenix is an AI observability platform.",
  output: "Phoenix helps teams inspect traces and experiments.",
});

Lower-Level API

Use generateClassification directly when you already have a rendered prompt and only need structured label generation.

Source Map

  • src/llm/createClassifierFn.ts
  • src/llm/createClassificationEvaluator.ts
  • src/llm/generateClassification.ts
  • src/types/evals.ts