Train Explain Model Locally And Deploy
Copyright (c) Microsoft Corporation. All rights reserved.
Licensed under the MIT License.
Train and explain models locally and deploy model and scoring explainer
This notebook illustrates how to use the Azure Machine Learning Interpretability SDK to deploy a locally-trained model and its corresponding scoring explainer to Azure Container Instances (ACI) as a web service.
Problem: IBM employee attrition classification with scikit-learn. Train and explain a model locally, then deploy the model and its corresponding scoring explainer as a web service on Azure Container Instances (ACI).
Table of Contents
- Introduction
- Setup
- Run model explainer locally at training time
- Apply feature transformations
- Train a binary classification model
- Explain the model on raw features
- Generate global explanations
- Generate local explanations
- Visualize explanations
- Deploy model and scoring explainer
- Next steps
Introduction
This notebook showcases how to train and explain a classification model locally, and deploy the trained model and its corresponding explainer to Azure Container Instances (ACI). It demonstrates the API calls you need to make to train and explain a model, and to visualize the global and local explanations via a dashboard that provides an interactive way of discovering patterns in model predictions and explanations. It also demonstrates how to use Azure Machine Learning MLOps capabilities to deploy your model and its corresponding explainer.
We will showcase one of the tabular data explainers, TabularExplainer (SHAP), and follow these steps:
- Develop a machine learning script in Python which involves the training script and the explanation script.
- Run the script locally.
- Use the interpretability toolkit’s visualization dashboard to visualize predictions and their explanation. If the metrics and explanations don't indicate a desired outcome, loop back to step 1 and iterate on your scripts.
- After a satisfactory run is found, create a scoring explainer and register the persisted model and its corresponding explainer in the model registry.
- Develop a scoring script.
- Create an image and register it in the image registry.
- Deploy the image as a web service in Azure.
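The training-time portion of these steps (apply feature transformations, then train a binary classifier) can be sketched with scikit-learn. The column names and tiny inline dataset below are illustrative stand-ins for the IBM attrition data, not the notebook's actual inputs; in the real notebook a TabularExplainer from azureml-interpret would then be fitted against the trained pipeline:

```python
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

# Illustrative stand-in for the IBM attrition data (column names are assumptions).
data = pd.DataFrame({
    "Age": [34, 41, 29, 50, 38, 45],
    "MonthlyIncome": [4500, 8200, 3100, 9900, 5600, 7200],
    "OverTime": ["Yes", "No", "Yes", "No", "No", "Yes"],
    "Attrition": [1, 0, 1, 0, 0, 1],
})
X, y = data.drop(columns=["Attrition"]), data["Attrition"]

# Step: apply feature transformations (scale numeric, one-hot encode categorical).
preprocess = ColumnTransformer([
    ("num", StandardScaler(), ["Age", "MonthlyIncome"]),
    ("cat", OneHotEncoder(handle_unknown="ignore"), ["OverTime"]),
])

# Step: train a binary classification model on the transformed features.
clf = Pipeline([("preprocess", preprocess), ("model", LogisticRegression())])
clf.fit(X, y)

# In the notebook, explanations on raw features would be generated here, e.g.:
# TabularExplainer(clf, X, features=list(X.columns))  # requires azureml-interpret
```

Because the transformations live inside the pipeline, the explainer can attribute importances back to the raw input columns rather than the engineered ones.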
Setup
Make sure you go through the configuration notebook first if you haven't.
Initialize a Workspace
Initialize a workspace object from persisted configuration
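A minimal sketch of loading the workspace from the persisted configuration written by the configuration notebook (this requires a `config.json` and an Azure subscription, so it only runs in your own environment):

```python
from azureml.core import Workspace

# Reads config.json written earlier (e.g. by Workspace.write_config()).
ws = Workspace.from_config()
print(ws.name, ws.resource_group, ws.location, sep="\n")
```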
Explain
Create An Experiment: Experiment is a logical container in an Azure ML Workspace. It hosts run records which can include run metrics and output artifacts from your experiments.
Visualize
Visualize the explanations
Deploy
Deploy the model and ScoringExplainer.
Please note that you must include azureml-defaults with version >= 1.0.45 as a pip dependency, because it contains the functionality needed to host the model as a web service.
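A minimal sketch of an inference environment that satisfies this requirement (the environment name and the extra packages are illustrative; the exact list depends on your scoring script):

```python
from azureml.core import Environment
from azureml.core.conda_dependencies import CondaDependencies

# azureml-defaults >= 1.0.45 is required to host the model as a web service.
conda = CondaDependencies.create(
    pip_packages=["azureml-defaults>=1.0.45", "azureml-interpret", "scikit-learn"]
)
env = Environment(name="attrition-explain-env")
env.python.conda_dependencies = conda
```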
Next
Learn about other use cases of the explain package:
- Training time: regression problem
- Training time: binary classification problem
- Training time: multiclass classification problem
- Explain models with engineered features
- Save model explanations via Azure Machine Learning Run History
- Run explainers remotely on Azure Machine Learning Compute (AMLCompute)
- Inferencing time: deploy a remotely-trained model and explainer
- Inferencing time: deploy a locally-trained Keras model and explainer