Tracing LLM Function Calls in Arize

A quick demo of tracing LLM function calls in Arize. Learn how to trace OpenAI function calls for easier debugging and structured outputs. Function calling lets LLMs interact with external tools and return structured data for tasks like summarization, classification, and code transformation. Arize simplifies debugging by logging chat history and function calls with a single line of code, presenting them in a clear, readable format. The demo walks through an example customer support chatbot workflow and shows how to refine and optimize parameters in the Prompt Playground.
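To make the workflow concrete, here is a minimal sketch of function calling for a customer support chatbot. The tool name `classify_ticket`, its schema, and the dispatch helper are illustrative assumptions, not Arize APIs; a real request would pass the `tools` schema to the OpenAI chat completions API, and with Arize instrumentation enabled, the call and chat history would be logged automatically.

```python
import json

# Hypothetical tool schema for a customer-support chatbot (illustrative names).
# Passing this to the model lets it return structured arguments instead of free text.
tools = [
    {
        "type": "function",
        "function": {
            "name": "classify_ticket",
            "description": "Classify a support ticket by topic and urgency.",
            "parameters": {
                "type": "object",
                "properties": {
                    "topic": {"type": "string", "enum": ["billing", "bug", "how-to"]},
                    "urgency": {"type": "string", "enum": ["low", "medium", "high"]},
                },
                "required": ["topic", "urgency"],
            },
        },
    }
]

def handle_tool_call(name: str, arguments: str) -> dict:
    """Dispatch a model-emitted function call; arguments arrive as a JSON string."""
    args = json.loads(arguments)
    if name == "classify_ticket":
        return {"routed_to": f"{args['topic']}-queue", "priority": args["urgency"]}
    raise ValueError(f"unknown tool: {name}")

# Simulated model output. A real call would look like
# client.chat.completions.create(..., tools=tools), and the resulting
# tool_calls would be dispatched the same way.
result = handle_tool_call("classify_ticket", '{"topic": "billing", "urgency": "high"}')
print(result)  # {'routed_to': 'billing-queue', 'priority': 'high'}
```

Because the arguments come back as a machine-parseable JSON payload rather than prose, each step of the exchange can be captured as a structured trace, which is what makes the logged function calls easy to inspect and debug.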

Sign up for a free Arize account
Or check out our open-source tool, Phoenix
