Webinar

Best Practices Workshop Series: AI Model Monitoring & Optimization

Join us every Thursday for a 30-minute Arize overview session. We’ll answer your questions live and guide you through a specific use case every week.

This hands-on workshop will introduce Phoenix, Arize AI’s open-source library for ML observability in a notebook. We’ll first explain the concept of ML observability from first principles: at a high level, a machine learning system is observable if you can not only detect data quality, drift, and performance issues in production (monitoring) but also quickly identify the root cause of those issues (root-cause analysis). You’ll see these concepts in action in the interactive portion of the workshop, where you’ll use Phoenix in an active learning workflow, sketched in code below, to:

  • Monitor an image classification model in production
  • Detect a production drift issue
  • Automatically identify and export problematic production data for labeling and fine-tuning of your image classification model
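Below is a minimal sketch of what that notebook workflow can look like. Column names (prediction, actual, image_vector, image_url) and file paths are illustrative placeholders, and exact API names may vary across Phoenix versions (for example, Dataset was later renamed Inferences):

    import pandas as pd
    import phoenix as px

    # Placeholder inputs: your own production and training inference
    # dataframes, each with an embedding vector per image.
    prod_df = pd.read_parquet("production_inferences.parquet")  # hypothetical path
    train_df = pd.read_parquet("training_inferences.parquet")   # hypothetical path

    # Tell Phoenix which columns hold predictions, labels, and embeddings.
    schema = px.Schema(
        prediction_label_column_name="prediction",
        actual_label_column_name="actual",
        embedding_feature_column_names={
            "image_embedding": px.EmbeddingColumnNames(
                vector_column_name="image_vector",
                link_to_data_column_name="image_url",
            ),
        },
    )

    # Compare production (primary) against training (reference) to surface drift.
    prod_ds = px.Dataset(dataframe=prod_df, schema=schema, name="production")
    train_ds = px.Dataset(dataframe=train_df, schema=schema, name="training")

    # Launch the Phoenix app in the notebook; drifted embedding clusters can be
    # selected in the UI and exported back into the notebook, e.g.:
    session = px.launch_app(primary=prod_ds, reference=train_ds)
    # problem_df = session.exports[-1]  # most recent export from the UI

The export step is where the active learning loop closes: the exported production rows go out for labeling and come back as fine-tuning data for the image classification model.
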
Amber Roberts
Machine Learning Engineer

Amber Roberts is an astrophysicist and machine learning engineer who was previously the Head of AI at Insight Data Science. She then worked at Splunk as an ML Product Manager in their ML Product Org, building out ML feature solutions. She now joins us at Arize as an ML Sales Engineer, looking to help teams across industries build ML observability into their productionized AI environments.

Jack Zhou
Product Manager

Sally-Ann DeLucia
ML Solutions Engineer