How To Trace a Multimodal Query Application

The integration of Arize Phoenix with LlamaIndex's newly released instrumentation module can help you fine-tune performance, diagnose issues, and enhance the overall functionality of LLM applications.

In this tutorial, we use the integration to set up a multimodal query application with LlamaIndex and trace it with Phoenix. The application uses both textual and image data to answer queries that combine the two modalities.
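
The notebook below is the authoritative walkthrough, but as a rough sketch, the setup generally follows the pattern shown here. The model name, data path, and the specific packages assumed (arize-phoenix, openinference-instrumentation-llama-index, llama-index-multi-modal-llms-openai) are illustrative choices rather than details taken from the notebook:

```python
import phoenix as px
from phoenix.otel import register
from openinference.instrumentation.llama_index import LlamaIndexInstrumentor

from llama_index.core import SimpleDirectoryReader
from llama_index.multi_modal_llms.openai import OpenAIMultiModal

# Launch a local Phoenix instance to collect and visualize traces
px.launch_app()

# Register Phoenix as the OpenTelemetry trace collector and
# auto-instrument LlamaIndex via OpenInference
tracer_provider = register()
LlamaIndexInstrumentor().instrument(tracer_provider=tracer_provider)

# Load image documents from a local folder (illustrative path)
image_documents = SimpleDirectoryReader("./data/images").load_data()

# Ask a multimodal model a question that combines text and images;
# the instrumented call shows up as a trace in the Phoenix UI
# (model choice here is an assumption, not from the notebook)
mm_llm = OpenAIMultiModal(model="gpt-4o", max_new_tokens=300)
response = mm_llm.complete(
    prompt="Describe what these images have in common.",
    image_documents=image_documents,
)
print(response)
```

Once LlamaIndex is instrumented, every subsequent call is captured automatically; with a locally launched Phoenix app, the traces typically appear in the UI at http://localhost:6006.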

🍳Notebook:
https://colab.research.google.com/gist/PubliusAu/11d0bbf67e8d35b9d8b08245ac941ed1/arizephoenixllamaindex_exanple.ipynb

Phoenix: https://phoenix.arize.com/

🦙More on LlamaIndex instrumentation:
https://docs.llamaindex.ai/en/stable/module_guides/observability/instrumentation/
https://docs.arize.com/phoenix/tracing/how-to-tracing/instrumentation/auto-instrument-python/llamaindex

Join the community to ask questions: https://join.slack.com/t/arize-ai/shared_invite/zt-26zg4u3lw-OjUNoLvKQ2Yv53EfvxW6Kg

Connect with Evan: https://www.linkedin.com/in/evanjolley/

Subscribe to our resources and blogs