Screenshot generation runs on local client
Valid from Pega Version 7.1.8
If you generate screenshots to include in your application document, this process still runs on your local client, and Internet Explorer is still required to use this feature.
Warning message when opening server-generated documents
Valid from Pega Version 7.1.8
Depending on your system configuration, the following warning message might be displayed when you open a document that was generated on the server.
If you click No, certain content (for example, the Table of Contents) is not populated in the document. Instead, click Yes and save the document.
The warning message that displays
Decision Data Store data sets can be used only on DNodes
Valid from Pega Version 7.1.8
Data flows that contain Decision Data Store data sets as the primary or secondary source must be created and executed only on DNodes.
Data flows are not restarted automatically after application server restart
Valid from Pega Version 7.1.8
When you restart the application server or the Pega 7 server, you stop the execution of your data flows. The interrupted batch and real-time runs are marked as Failed.
Recommendation:
- Go to the Designer Studio > Decisioning > Decisions > Data Flows > Batch processing tab and start the failed batch runs.
- Go to the Designer Studio > Decisioning > Decisions > Data Flows > Real-time processing tab and activate the failed real-time runs.
Value list and value group properties are not supported inside data flows
Valid from Pega Version 7.1.8
Value list and value group properties are not supported inside data flows; use other property types instead.
See the Pega 7 developer help for the available property types.
Data flow preview size is fixed to 10
Valid from Pega Version 7.1.8
The Preview option for each shape in data flows returns the first 10 records. This value is fixed and currently cannot be changed.
Data Flow transformation shapes cannot be used in combination with the Compose or Merge shapes
Valid from Pega Version 7.1.8
When you reference another data flow from a data flow that contains the Compose or Merge shape, the referenced data flow cannot contain transformation shapes.
Data flow validation does not currently prevent you from designing a data flow that violates this design pattern, so check both the referencing and referenced data flows to make sure they follow it.
Data flow destination is resolved at assembly time
Valid from Pega Version 7.1.8
Data flow execution always uses the design-time destination data sets or destination data flows, regardless of the input records at run time.
Example:
You create a data flow and set the destination to a specific data set. In the data flow, you use a strategy that produces some results. Regardless of these results, the execution of your data flow always uses the destination data set that you specified when you designed the data flow.
The same applies when you set the destination to a data flow.
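As a rough analogy in ordinary code (illustrative only, not how Pega implements data flows), the destination behaves like a value bound when the flow is defined rather than one computed from each record:

```python
# Illustrative analogy only: the "destination" is bound when the flow is
# defined (design time), so run-time results cannot redirect the output.
def make_data_flow(destination):
    def run(records):
        for record in records:
            result = strategy(record)   # strategy results vary per record...
            destination.append(result)  # ...but the sink is always the same
    return run

def strategy(record):
    # Placeholder strategy: produces some per-record result.
    return {"input": record, "decision": "example"}

sink = []
flow = make_data_flow(sink)  # destination fixed at definition time
flow(["customer-1", "customer-2"])  # run-time input cannot change the sink
print(sink)
```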
Extension attributes are not supported in PMML models
Valid from Pega Version 7.3.1
Models in the Predictive Model Markup Language (PMML) format version 4.3 that contain extension attributes with the x- prefix are not valid. These extension attributes are deprecated; you must use extension elements instead. In addition, if the output type of any output field in the model is set to FLOAT, change it to DOUBLE.
For more information, see PMML 4.3 - General Structure in the Data Mining Group documentation.
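For a quick check outside of Pega, a minimal Python sketch using only the standard library can scan a PMML file for both problems. The file name model.pmml is a placeholder, and the check matches data types case-insensitively because PMML conventionally uses lowercase values:

```python
import xml.etree.ElementTree as ET

# Namespace of PMML 4.3 documents.
PMML_NS = "http://www.dmg.org/PMML-4_3"

def check_pmml(path):
    """Report deprecated x- extension attributes and FLOAT output fields."""
    tree = ET.parse(path)
    issues = []
    for elem in tree.iter():
        # Extension attributes with the x- prefix are deprecated in PMML 4.3;
        # use <Extension> child elements instead.
        for attr in elem.attrib:
            if attr.startswith("x-"):
                issues.append(f"deprecated attribute {attr!r} on <{elem.tag}>")
        # Output fields typed FLOAT should be changed to DOUBLE.
        if (elem.tag == f"{{{PMML_NS}}}OutputField"
                and elem.get("dataType", "").lower() == "float"):
            issues.append(
                f"OutputField {elem.get('name')!r} uses FLOAT; change to DOUBLE")
    return issues

# Example usage with a placeholder file name:
for issue in check_pmml("model.pmml"):
    print(issue)
```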
The Upload responses action is not supported for adaptive models with customized context
Valid from Pega Version 7.3.1
A default instance of the Adaptive Model rule contains five model identifiers (.pyIssue, .pyGroup, .pyName, .pyDirection, .pyChannel) that are used to partition adaptive models. If you add other identifiers to your Adaptive Model rule instance, you cannot upload responses to that instance with the Upload Responses wizard, and the following error is displayed: The Flow Action post-processing activity pzUploadCSVFile failed: Cannot parse csv file.
You can still train such adaptive models with data flows.
For more information, see Training adaptive models in bulk with data flows, Model context, and Uploading customer responses.
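As a purely hypothetical illustration of why the wizard fails (the sketch below is not Pega's actual implementation, and the CSV column naming is an assumption), a loader built around exactly the five default identifier columns rejects a file that carries extra context columns:

```python
import csv

# The five default model identifiers of a standard Adaptive Model rule.
DEFAULT_IDENTIFIERS = [".pyIssue", ".pyGroup", ".pyName",
                       ".pyDirection", ".pyChannel"]

def parse_responses(path):
    """Parse a response CSV whose context columns must match the defaults.

    Hypothetical sketch: assumes context columns are named after the
    identifiers (e.g. a ".pyIssue" header column).
    """
    with open(path, newline="") as f:
        reader = csv.DictReader(f)
        context_columns = [c for c in reader.fieldnames if c.startswith(".py")]
        # A customized context adds identifier columns beyond the defaults,
        # so a parser built around the default schema rejects the file.
        if sorted(context_columns) != sorted(DEFAULT_IDENTIFIERS):
            raise ValueError("Cannot parse csv file: unexpected context columns")
        return list(reader)
```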