Model Deployment and Consumption
Once a predictive model has been developed and trained,
the following steps involve evaluating its performance,
deploying it for application use, and setting up ongoing
monitoring and management mechanisms.
Model Evaluation
• Understanding Model Performance: Evaluating a model's performance
involves using various metrics to understand how well it predicts
outcomes. Key metrics include the ROC curve, which helps assess true
positive rate against false positive rate; the precision-recall curve, which
is vital for understanding the trade-off between precision and recall; and
the confusion matrix, which provides insight into the errors the model may
make (see the sketch after this list).
• Tools for Evaluation: Azure Machine Learning Studio offers built-in
modules for generating these evaluation metrics, allowing for visual
assessment of model performance. These visualizations help refine the model
further, if necessary, or validate its readiness for deployment; a second
sketch after this list shows how such metrics can be logged for a run.
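As an illustration of the metrics described above, the following sketch computes a ROC curve, a precision-recall curve, and a confusion matrix with scikit-learn. The use of scikit-learn, the synthetic dataset, and names such as X, y, and clf are assumptions for illustration only, not part of any Azure Machine Learning workflow described here.

```python
# Minimal sketch: computing the evaluation metrics discussed above with scikit-learn.
# The dataset, model, and train/test split are hypothetical stand-ins.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import (
    roc_curve, roc_auc_score, precision_recall_curve, confusion_matrix
)

# Synthetic binary-classification data and a simple model.
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42
)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Predicted probabilities for the positive class drive both curves.
y_scores = clf.predict_proba(X_test)[:, 1]

# ROC curve: true positive rate vs. false positive rate across thresholds.
fpr, tpr, _ = roc_curve(y_test, y_scores)
print("AUC:", roc_auc_score(y_test, y_scores))

# Precision-recall curve: the trade-off between precision and recall.
precision, recall, _ = precision_recall_curve(y_test, y_scores)

# Confusion matrix: where the model's errors actually fall.
print(confusion_matrix(y_test, clf.predict(X_test)))
```

Plotting the ROC and precision-recall arrays gives the same kind of visual assessment that Azure Machine Learning Studio's evaluation modules provide out of the box.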
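Azure Machine Learning can also surface metrics logged through MLflow tracking. The sketch below assumes an environment where the MLflow tracking URI already points at an Azure Machine Learning workspace, and it reuses the variables from the previous sketch; the metric names are hypothetical.

```python
# Sketch: logging evaluation metrics so they appear with a run in Azure
# Machine Learning Studio. Assumes MLflow tracking is already configured
# against an Azure ML workspace; reuses clf, X_test, y_test, y_scores from
# the previous sketch.
import mlflow
from sklearn.metrics import roc_auc_score, confusion_matrix

with mlflow.start_run():
    mlflow.log_metric("test_auc", roc_auc_score(y_test, y_scores))
    # Unpack the binary confusion matrix into its four cells.
    tn, fp, fn, tp = confusion_matrix(y_test, clf.predict(X_test)).ravel()
    mlflow.log_metric("true_positives", tp)
    mlflow.log_metric("false_positives", fp)
```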