
Understanding MLOps

Updated on January 19, 2022

The Machine Learning Operations (MLOps) workflow is a standardized process that data scientists can use to replace the active model in a prediction with another model, a scorecard, or a field that represents a score, and to deploy the updated prediction to the production environment.

Model update workflow

A model update process consists of the following stages:

  1. The process starts when you update a prediction in your non-production environment by replacing an active model with a predictive model, a scorecard, or a field in the data model that contains a score. You can select a model from Prediction Studio, upload a PMML or H2O model to Prediction Studio, or connect to an external model that you developed on Amazon SageMaker or Google AI Platform.

    When you create a machine learning model in a third-party environment, you can train it on historical data from Pega Platform and other sources. For more information, see Replacing models in predictions with MLOps or Updating active models in predictions through API with MLOps.

  2. You can validate the candidate model against a data set and compare the new model with the current model. This analysis provides relevant metrics to help you decide which model performs better on a static data set.
  3. After you evaluate the models, you can approve or reject the candidate model for deployment to production. You can place an approved model in shadow mode (supported for predictive models) or replace the current model with the new model. For more information, see Evaluating candidate models with MLOps or Updating active models in predictions through API with MLOps.
  4. If you approve a model, the system updates the decision strategy that uses the prediction and adds both the strategy and the candidate model to a new branch.
  5. Depending on how your environment is configured, deployment involves different stages and components:
    • In a Pega Customer Decision Hub environment, the system creates and resolves a change request in Pega 1:1 Operations Manager. A team lead can verify the changes in the rules and the relevant documentation. The change request is packaged into a revision, and a deployment manager can promote the prediction with the candidate model to production. For more information, see Understanding the change request flow.
    • In other environments, where predictions serve other purposes, such as customer service and intelligent automation, components such as Pega 1:1 Operations Manager and Revision Manager are not present. Instead, a system architect merges the branch with the model update into the application ruleset, and a deployment manager can then deploy the application ruleset to production.
  6. If you deploy the candidate model to production in shadow mode, it runs alongside the original model, receives production data, and generates outcomes, but the outcomes do not impact business decisions.
Note: The system sends emails and notifications to the data scientist work group at key stages of the model update process.
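The shadow-mode behavior described in step 6 can be sketched as follows. This is a minimal illustration only: the stand-in models, the feature names, and the `make_decision` helper are all hypothetical and do not come from the Pega API. The point is that the candidate model receives the same production data as the active model and its outcomes are recorded, but only the active model's outcome drives the business decision.

```python
# Illustrative sketch of shadow-mode scoring; all names are hypothetical
# stand-ins, not Pega Platform APIs.

shadow_log = []

def active_model(features):
    # Hypothetical stand-in for the current production model.
    return 1 if features["balance"] > 1000 else 0

def candidate_model(features):
    # Hypothetical stand-in for the candidate model running in shadow mode.
    return 1 if features["balance"] > 800 else 0

def make_decision(features):
    # The shadow model scores the same production data...
    candidate_outcome = candidate_model(features)
    shadow_log.append(candidate_outcome)   # ...and its outcome is only logged,
    return active_model(features)          # while the active model's outcome is acted upon.

decision = make_decision({"balance": 900})
print(decision, shadow_log)  # prints: 0 [1]
```

Comparing the logged shadow outcomes with the active model's outcomes on the same production traffic is what later supports the promote-or-withdraw decision.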

The following diagram shows an example of the model update process. In this example, a data scientist approves a predictive model for deployment to production as a shadow of the original model.

A model update process in an environment with Pega Customer Decision Hub and Pega 1:1 Operations Manager, showing the workflow from the third-party environment through the BOE to production.
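The stages that a candidate model passes through in the numbered list above can be sketched as a simple status lifecycle. The status names below are simplified stand-ins chosen for illustration, not the exact statuses that Prediction Studio uses.

```python
# Illustrative sketch of the candidate-model lifecycle; the status names
# are simplified stand-ins, not the exact Prediction Studio statuses.

# Allowed transitions between stages of the model update process.
TRANSITIONS = {
    "Candidate":  {"Validation"},
    "Validation": {"Review"},
    "Review":     {"Shadow", "Active", "Rejected"},  # approve (shadow or replace) or reject
    "Shadow":     {"Active", "Rejected"},            # promote or withdraw the shadow model
}

def advance(status, new_status):
    """Move the candidate model to a new status, rejecting invalid transitions."""
    if new_status not in TRANSITIONS.get(status, set()):
        raise ValueError(f"cannot move from {status} to {new_status}")
    # In the real workflow, a status change also triggers notifications
    # to the data scientist work group.
    return new_status

status = "Candidate"
for step in ("Validation", "Review", "Shadow", "Active"):
    status = advance(status, step)
print(status)  # prints: Active
```

Modeling the lifecycle as an explicit transition table makes it clear which paths exist: a reviewed model can go straight to production, run as a shadow first, or be rejected.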
  • Active and candidate model type combinations

    Use the Machine Learning Operations (MLOps) feature to add a candidate model to a prediction, either as a shadow of an active model or as the new active model that replaces the previous active model. When planning a model update, review the supported combinations of active and candidate model types to see what changes you can make.

  • Model update statuses and notifications

    As a candidate model passes through the various stages of the model update process, such as validation, review, or shadow mode, its status changes in Prediction Studio. Data scientists and other stakeholders then receive email notifications about the key events in the life cycle of a model update.

  • Active and candidate model comparison charts

    Before you deploy a candidate model in shadow mode or as a replacement for the active model, evaluate the performance of the candidate model by adding a validation data set and generating charts that compare the two models in terms of score distribution, ROC curve, gains, and lift.
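As a rough illustration of the kind of comparison these charts support, the sketch below computes a rank-based AUC (the area under the ROC curve) and top-fraction lift for two score sets on a static validation set. The labels and model scores are invented for illustration; they are not produced by Prediction Studio.

```python
# Hedged sketch of comparing an active and a candidate model on a static
# validation set; the data below is made up for illustration.

def auc(labels, scores):
    """Rank-based AUC: probability that a random positive outranks a random negative."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def lift_at(labels, scores, fraction=0.2):
    """Lift: positive rate in the top-scoring fraction divided by the overall rate."""
    ranked = sorted(zip(scores, labels), reverse=True)
    k = max(1, int(len(ranked) * fraction))
    top_rate = sum(y for _, y in ranked[:k]) / k
    base_rate = sum(labels) / len(labels)
    return top_rate / base_rate

labels           = [1, 0, 1, 0, 1, 0, 0, 1, 0, 0]
active_scores    = [0.9, 0.8, 0.7, 0.6, 0.5, 0.4, 0.3, 0.2, 0.1, 0.05]
candidate_scores = [0.9, 0.2, 0.8, 0.3, 0.7, 0.1, 0.4, 0.6, 0.2, 0.05]

print("active AUC    :", auc(labels, active_scores))
print("candidate AUC :", auc(labels, candidate_scores))
print("candidate lift@20%:", lift_at(labels, candidate_scores))
```

On this toy data the candidate model ranks every positive above every negative (AUC of 1.0), whereas the active model does not, which is the kind of signal the comparison charts surface before a deployment decision.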
