Detecting, diagnosing, and resolving ML model performance issues can be difficult for even the most sophisticated ML engineers. Join Arize co-founders Aparna Dhinakaran and Jason Lopatecki as they reflect on the evolution of ML observability since pioneering the space over one year ago and demo the Arize AI platform publicly for the first time.
In this session, we will explore:
The challenges of productionizing ML
Why an evaluation store is becoming a critical piece of the ML infrastructure stack
The four pillars of ML observability and how to tackle each: drift, performance analysis, data quality and explainability
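As a taste of the drift pillar, here is a minimal sketch of one common drift metric, the Population Stability Index (PSI), which compares a production feature distribution against a training-time baseline. This is an illustrative example only, not Arize's implementation; the function name `psi` and the bin count are assumptions.

```python
import numpy as np

def psi(expected, actual, bins=10):
    """Population Stability Index between two samples of a numeric feature.

    Bin edges are taken from the reference (expected) sample; a small
    epsilon keeps empty bins from producing log(0) or division by zero.
    Illustrative sketch only -- not Arize's implementation.
    """
    eps = 1e-4
    edges = np.histogram_bin_edges(expected, bins=bins)
    e_pct = np.histogram(expected, bins=edges)[0] / len(expected) + eps
    a_pct = np.histogram(actual, bins=edges)[0] / len(actual) + eps
    return float(np.sum((a_pct - e_pct) * np.log(a_pct / e_pct)))

rng = np.random.default_rng(0)
baseline = rng.normal(0, 1, 5000)   # stand-in for the training distribution
drifted = rng.normal(0.5, 1, 5000)  # stand-in for a shifted production feed

print(psi(baseline, baseline))  # ~0: identical distributions
print(psi(baseline, drifted))   # noticeably larger: drift flagged
```

A common rule of thumb treats PSI below 0.1 as stable and above 0.25 as significant drift, though thresholds should be tuned per feature.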