Strategic alliance and joint product promises to broaden the adoption of generative AI across industries
San Francisco, CA, July 11, 2024 – Arize AI, a pioneer and leader in AI observability and LLM evaluation, and LlamaIndex, a leading data framework for LLM applications, today debuted a new joint offering called LlamaTrace, a hosted version of the open-source Arize Phoenix.
According to a soon-to-release survey, 47.7% of AI engineers and developers building generative AI applications are using retrieval in their LLM applications today. By connecting data to generative AI, orchestration frameworks like LlamaIndex can be game-changers in accelerating generative AI development. However, for many teams and enterprises, technical challenges remain in getting modern LLM systems – with their layers of abstraction – ready for the real world.
To help, Arize and LlamaIndex are debuting an LLM tracing and observability platform that works natively with the LlamaIndex and Arize ecosystems. Built on the open-source Arize Phoenix, the hosted version of Phoenix can persist application telemetry data generated during AI development, making it easier to experiment, iterate, and collaborate in development or production.
The solution is open source at its foundation and features a fully hosted, persistent online deployment option for teams that do not want to self-host. AI engineers can instantly log traces, persist datasets, run experiments and evaluations, and share those insights with colleagues.
The new offering is available today and can be accessed through either a LlamaIndex or Arize account.
“We share a vision with LlamaIndex in enabling builders to reduce the time it takes to deploy generative AI into production but in a way that is super battle hardened for business-critical use cases,” said Jason Lopatecki, CEO and Co-Founder of Arize. “As leaders in our respective spaces with a common philosophy in empowering AI engineers and developers, we’re uniquely positioned here to do something that can move modern LLMOps forward and broaden adoption.”
“Prototyping a RAG pipeline or agent is easy, but every AI engineer needs the right data processing layer, orchestration framework, and experimentation/monitoring tool in order to take these applications to production. LlamaTrace by Arize offers the richest toolkit we’ve seen in enabling developers to observe, debug, and evaluate every granular step of a very complex LLM workflow, and it nicely complements the production-ready data platform and orchestration framework that LlamaCloud and LlamaIndex offer,” said Jerry Liu, CEO of LlamaIndex.
About Arize AI
Arize AI is an AI observability and LLM evaluation platform that helps teams deliver and maintain more successful AI in production. Arize’s automated monitoring and observability platform allows teams to quickly detect issues when they emerge, troubleshoot why they happened, and improve overall performance across both traditional ML and generative use cases. Arize is headquartered in Berkeley, CA.
About LlamaIndex
LlamaIndex is a data framework and platform that lets developers easily build LLM applications over their data. LlamaIndex provides an enterprise offering, LlamaCloud, which lets developers efficiently parse, index, and retrieve over a wide range of data sources. Developers can then use the core open-source framework to orchestrate workflows with LLMs and build production-grade applications, ranging from question-answering chatbots to document extraction and summarization to autonomous agents.