Control how many times batch and real-time data flow runs retry SAVE
operations on records. With automatic retries, when a SAVE operation fails, the run can
still complete successfully if the resources that were initially unavailable become
operational again. The run fails only when all the retries are unsuccessful.
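The retry behavior described above can be sketched in Python. This is an illustrative model only, not Pega's implementation: `save_with_retries`, `save`, and `record` are hypothetical stand-ins, and Pega performs the retries internally based on the configured retry count.

```python
def save_with_retries(save, record, max_retries=5):
    """Model of the run-level retry behavior: a SAVE succeeds if any
    attempt succeeds (for example, because an unavailable resource came
    back online), and the run fails only after all retries are exhausted.

    `save` and `record` are hypothetical stand-ins for illustration.
    """
    last_error = None
    for attempt in range(1 + max_retries):  # initial attempt + retries
        try:
            return save(record)
        except IOError as error:
            last_error = error
    raise last_error

# Example: a destination that is unavailable for the first two attempts.
attempts = {"count": 0}

def flaky_save(record):
    attempts["count"] += 1
    if attempts["count"] <= 2:
        raise IOError("destination unavailable")
    return "saved"

print(save_with_retries(flaky_save, {"id": 1}))  # succeeds on the third attempt
```

With the default of 5 retries, a record is attempted up to 6 times in total before the run is marked as failed.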
You can control the global number of retries for SAVE operations through a dedicated dynamic
system setting. If you want to change that setting for an individual batch or real-time data
flow run, update a property in the integrated API.

Note: If a single record fails for Merge and Compose shapes, the entire batch run fails.
Retries trigger lifecycle events.
For more information, see Event details in data flow runs.
What to do next: If you want to change that setting for a single batch data
flow run, update the pyResilience.pyShapeMaxRetries property in the
RunOptions page for the run through the integrated API. For more information, see Pega APIs and services.
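As a sketch of the per-run override, the fragment below builds a RunOptions payload that sets the retry count. Only the property name `pyResilience.pyShapeMaxRetries` comes from this page; the payload shape and the idea of sending it as JSON are assumptions, so consult Pega APIs and services for the actual request contract.

```python
import json

def build_run_options_patch(max_retries):
    """Build a hypothetical RunOptions fragment that overrides the
    SAVE-retry count for a single data flow run.

    The nesting (pyResilience containing pyShapeMaxRetries) is an
    assumption based on the dotted property name; verify it against
    the integrated API documentation before use.
    """
    return {
        "pyResilience": {
            "pyShapeMaxRetries": max_retries,
        }
    }

# Serialize the fragment for an API request body.
payload = build_run_options_patch(10)
print(json.dumps(payload))
```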
- In the navigation pane of Dev Studio, click .
- In the list of instances, search for and open the
dataflow/shape/maxRetries dynamic system setting.
- In the dynamic system setting editing tab, in the Value
field, enter the number of retries that you want to run when a SAVE operation on
a record fails during a data flow run.
The default value is 5.