Naveen Kumar Gatupally (Naveen Naanu)
MTS -III Sys Engr
Member since 2019 · 3 posts
Posted: 21 Apr 2019 21:30 EDT
Last activity: 22 Apr 2019 8:50 EDT

Data flow pause/resume feature

We have a requirement to process 700,000+ (7 lakh) records from an external DB. For performance, we chose a Data Flow with a Dataset source to read the external DB records, with a thread count of 20 (roughly 20 records per second) and multi-node batch processing. This feature is working fine.

Since a dataset has no filter options the way a report definition (RD) does, there is a problem on restart: if the data flow agent has read 300,000 (3 lakh) records and the system goes down for a period of time (for example, due to a natural calamity), then when it returns to a stable state the dataset in the data flow starts reading again from record 1. It has already read and processed 300,000 records, so our expectation is that it would resume from record 300,001.

Is there any way to resolve this issue? We are using Pega version 7.3.
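(For readers hitting the same problem: one general workaround, outside Pega's data flow machinery, is to maintain a resume checkpoint yourself. The sketch below is a minimal, hypothetical illustration of that pattern in Python, not Pega API code. It assumes the records have a monotonically increasing key column, here called `record_id`, and that a custom reader can filter on it, as an SQL `WHERE record_id > ? ORDER BY record_id` clause would; the file name `dataflow_checkpoint.json` and all function names are illustrative.)

```python
# Hypothetical sketch of resume-from-checkpoint batch reading.
# Assumption: records carry a monotonically increasing key ("record_id")
# that a custom reader can filter on -- a plain Dataset source does not
# expose such a filter, which is exactly the limitation described above.
import json
import os

CHECKPOINT_FILE = "dataflow_checkpoint.json"  # illustrative name

def load_checkpoint():
    """Return the last successfully processed record id (0 if none)."""
    if os.path.exists(CHECKPOINT_FILE):
        with open(CHECKPOINT_FILE) as f:
            return json.load(f)["last_id"]
    return 0

def save_checkpoint(last_id):
    """Persist progress so a restart skips already-processed rows."""
    with open(CHECKPOINT_FILE, "w") as f:
        json.dump({"last_id": last_id}, f)

def fetch_batch(records, after_id, batch_size=100):
    """Stand-in for: SELECT ... WHERE record_id > ? ORDER BY record_id LIMIT ?"""
    return [r for r in records if r["record_id"] > after_id][:batch_size]

def process(records):
    """Process all records newer than the saved checkpoint, batch by batch."""
    last_id = load_checkpoint()
    while True:
        batch = fetch_batch(records, last_id)
        if not batch:
            break
        for rec in batch:
            pass  # real per-record work goes here
        last_id = batch[-1]["record_id"]
        save_checkpoint(last_id)  # commit progress after each batch
    return last_id
```

After a crash, re-running `process` picks up from the saved checkpoint instead of record 1, because the fetch filters on `record_id` greater than the last committed id. The key design choice is committing the checkpoint only after a batch fully succeeds, so a failure mid-batch re-reads at most one batch rather than the whole table.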

Pega Platform Data Integration Decision Management
The Moderation Team has archived this post. This thread is closed to future replies, and content and links will no longer be updated. If you have the same or a similar question, please write a new question.