Modelbit + Arize: Enabling Rapid ML Model Deployment and Monitoring

Michael Butler

Contributor

This is a guest post authored by Michael Butler from Modelbit

Seemingly every day, a new open source model is announced with the potential to outperform the models that ML teams have already spent months setting up in production.

Websites like Hugging Face have made it easy to pull a new model down from the hub and fine-tune it in a Jupyter notebook with your training data. The challenge isn't learning about these new models or forming hypotheses about the impact they could have on your product.

The real challenge is the amount of work that goes into building the infrastructure to deploy these models and monitor their performance in production.

Many machine learning teams have spent an immense amount of time building custom pipelines. These home-grown pipelines work well enough for a few models, but any attempt to deploy and monitor newer models would require another herculean effort to rebuild a custom pipeline.

Modelbit and Arize’s new integration enables teams to deploy ML models into production with a single line of code and begin monitoring and fine-tuning them instantly. Below, we’ll walk through how to rapidly deploy models into production with Modelbit and immediately monitor their performance with Arize.

Using Modelbit and Arize Together

If you aren’t already a user, you’ll need to create a free account with Arize and Modelbit. Once you’ve got your accounts set up, the integration can be created in a few short steps.

Here are the steps we’ll cover:

  • Configuring your notebook environment
  • Deploying an ML model to a REST endpoint with Modelbit
  • Sending your model’s inferences from Modelbit to Arize

Step 1: Adding Arize keys to Modelbit

To add your Arize Space and API keys to Modelbit:

In your Arize account, locate your Space Key and API Key on the Space Settings page.

Next, in Modelbit, click the Arize integration in Settings and add your keys.


Step 2: Setting up your notebook environment

Now it’s time to set up the notebook environment. To make development easier, set the environment variables ARIZE_SPACE_KEY and ARIZE_API_KEY to your Arize Space and API keys.
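For example, you could export the keys in your shell before launching your notebook (the values below are placeholders — substitute your own keys from the Arize Space Settings page):

```shell
# Placeholder values; replace with your actual Arize keys
export ARIZE_SPACE_KEY="your-space-key"
export ARIZE_API_KEY="your-api-key"
```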

Alternatively, you can store the keys as secrets in Modelbit and load them with mb.get_secret:

import os
import modelbit

mb = modelbit.login()

os.environ["ARIZE_SPACE_KEY"] = mb.get_secret("ARIZE_SPACE_KEY")
os.environ["ARIZE_API_KEY"] = mb.get_secret("ARIZE_API_KEY")

Step 3: Logging inferences to Arize

To start logging inferences, define a function that sends your inference results to Arize:

from arize.api import Client
from arize.utils.types import ModelTypes, Environments

def log_to_arize(features, prediction):
    # Client() picks up ARIZE_SPACE_KEY and ARIZE_API_KEY from the environment
    arize_resp = Client().log(
        model_id='sample-model-1',
        model_type=ModelTypes.SCORE_CATEGORICAL,
        environment=Environments.PRODUCTION,
        features=features,
        prediction_label=prediction,
    ).result()  # log() returns a future; result() waits for the response
    if arize_resp.status_code != 200:
        print(f'Arize logging failed: {arize_resp.text}')
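For reference, here’s a minimal sketch of the inputs this function expects. The feature names are hypothetical, but Arize takes features as a flat dict of name-to-value pairs, and for a SCORE_CATEGORICAL model the prediction is a (label, score) pair:

```python
# Hypothetical feature values for a fraud-detection model
features = {
    "transaction_amount": 120.50,
    "merchant_category": "electronics",
    "card_country": "US",
}

# (label, score) pair for a SCORE_CATEGORICAL model
prediction = ("Fraud", 0.4)
```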

Next, define your inference function and have it log its results by calling log_to_arize:

def example_arize(features):
    # first, calculate your inference
    prediction = ('Fraud', 0.4)  # This might be "model.predict(features...)" in your code

    # then log the inference to Arize
    log_to_arize(features, prediction)

    # after logging is complete, return the inference
    return prediction

Finally, deploy your inference function to Modelbit. The call to mb.deploy will automatically include your log_to_arize function:

mb.deploy(example_arize)
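Once deployed, the model sits behind a REST endpoint. As a rough sketch of what a call might look like — the workspace URL below is a placeholder, and the batch request format shown is an assumption to verify against your deployment’s API page in Modelbit — you could invoke it with Python’s requests library:

```python
import json

# Placeholder URL; find the real one on your deployment's API page in Modelbit
url = "https://your-workspace.app.modelbit.com/v1/example_arize/latest"

# Assumed batch format: each row is [row_id, <argument passed to example_arize>]
payload = {
    "data": [
        [1, {"transaction_amount": 120.50, "card_country": "US"}],
    ]
}
body = json.dumps(payload)

# response = requests.post(url, data=body)
# print(response.json())
```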

That’s all there is to it. Now, whenever your Modelbit deployment produces an inference, that inference is logged to Arize! With inferences flowing from Modelbit to Arize, you can easily monitor, troubleshoot, and fine-tune your models running in production. When you set up monitoring in Arize, you can define custom thresholds and receive alerts over email, Slack, and other channels when those thresholds are crossed. Arize even offers features such as automated model retraining and the ability to export data back to Jupyter notebooks.

What makes the integration between Modelbit and Arize even more powerful is the ability to detect issues with your models in Arize, diagnose and fix these issues, and then easily deploy to production again using Modelbit.

Try it today

Arize and Modelbit are both on a mission to help machine learning teams move faster and increase their impact. Both offer free accounts, so give the integration a try and let us know what you think.