Question
2 Replies, 406 Views

Siva Jyothi PamiReddy (Sivajyothi), DHL (IN), member since 2015, 12 posts
Posted: 25 Jun 2019 8:05 EDT
Last activity: 25 Jun 2019 23:58 EDT
Closed
Kafka stream data flow consumes duplicate data when the real-time data flow restarts

Hi,

We are consuming data from a Kafka topic: a real-time data flow is configured with a Kafka stream data set as its source.

While testing, I noticed that when I stop the real-time data flow and restart it, it reprocesses all the records it had already processed before the stop.

Example:

The Kafka topic has 80 records, and the real-time data flow reads all 80 of them (for testing we log the data). The data flow is then stopped.

When I restart the data flow, it reads the same 80 records again, even though they were already processed.
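
For what it's worth, this matches plain Kafka consumer-group semantics: if the group's offsets are never committed, every restart falls back to auto.offset.reset and replays the topic from the beginning. Below is a minimal stand-alone Java sketch of that behavior; the broker address, group id, and topic name are placeholders, and Pega manages its own consumer internally, so this only illustrates the underlying mechanism, not what the data flow actually does:

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class ReplayDemo {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumed broker address
        props.put("group.id", "dataflow-test");           // hypothetical group id
        // With auto-commit disabled and no manual commit, the group's stored
        // offset never advances, so every restart falls back to
        // auto.offset.reset and rereads the topic from the beginning.
        props.put("enable.auto.commit", "false");
        props.put("auto.offset.reset", "earliest");
        props.put("key.deserializer",
                  "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                  "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("my-topic")); // hypothetical topic
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
            for (ConsumerRecord<String, String> r : records) {
                System.out.printf("offset=%d value=%s%n", r.offset(), r.value());
            }
            // Because nothing is committed here, running this program again
            // delivers the same records a second time.
        }
    }
}

Run this twice against a topic holding 80 records and both runs print all 80, which is exactly the symptom described above.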

Is there any setup or configuration on the Pega side so that the data flow does not read the same data again after a restart?
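
For contrast, here is a sketch of the behavior we want, again in plain Kafka terms (same placeholder names as above; whether the Pega Kafka data set exposes an equivalent commit or checkpoint setting is exactly what I'm asking): once the group commits its offsets, a restarted consumer resumes after the last committed record instead of replaying everything.

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class ResumeDemo {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumed broker address
        props.put("group.id", "dataflow-test");           // same hypothetical group id
        props.put("enable.auto.commit", "false");
        props.put("auto.offset.reset", "earliest");
        props.put("key.deserializer",
                  "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                  "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("my-topic")); // hypothetical topic
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
            records.forEach(r ->
                System.out.printf("offset=%d value=%s%n", r.offset(), r.value()));
            // Committing advances the group's stored offset; the next run of
            // this program starts after the last committed record instead of
            // replaying the already-processed ones.
            consumer.commitSync();
        }
    }
}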

Thanks.

Pega Platform · Low-Code App Development · Data Integration · Decision Management
The Moderation Team has archived this post. This thread is closed to future replies; content and links will no longer be updated. If you have the same or a similar question, please post a new question.