Sivajyothi Member since 2015 11 posts
HSBC Software Solution
Posted: 1 year ago
Last activity: 1 year 2 months ago

Kafka data flow consumes duplicate data when the real-time processing dataflow restarts

Hi,

We are consuming data from a Kafka topic. A real-time dataflow is configured with a Kafka stream data set as its source.

While testing, I noticed that when I stop the real-time dataflow and restart it, it reprocesses all the records it had already processed before the stop.


The Kafka topic has 80 records, and the real-time dataflow reads all 80 of them (for testing, we log the data). The dataflow is then stopped.

When I restart the dataflow, it reads the same 80 records again, even though they were already processed.

Is there any setup or configuration needed on the Pega side so that it does not read the same data again?
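To make the behaviour concrete, here is a minimal Python simulation of the underlying Kafka consumer-group offset mechanic (illustrative only; this is not Pega code, and the `Broker`/`poll_all` names are invented for the sketch). A consumer group that commits its offsets resumes from the last committed position after a restart; a consumer that never commits, or that starts under a fresh group id, begins from the start and re-reads everything:

```python
# Simulation of Kafka consumer-group offset commits (illustrative only;
# not Pega or kafka-client code). Committed offsets are what let a
# restarted consumer skip records it has already processed.

class Broker:
    def __init__(self, records):
        self.records = records   # the topic's log of records
        self.committed = {}      # group id -> last committed offset

def poll_all(broker, group_id, commit=True):
    """Read from the group's committed offset to the end of the log,
    then (optionally) commit the new offset back to the broker."""
    start = broker.committed.get(group_id, 0)
    batch = broker.records[start:]
    if commit:
        broker.committed[group_id] = len(broker.records)
    return batch

broker = Broker(records=[f"rec-{i}" for i in range(80)])

first = poll_all(broker, "dataflow-group")   # initial run: all 80 records
second = poll_all(broker, "dataflow-group")  # "restart": resumes, 0 records
fresh = poll_all(broker, "new-group-id")     # fresh group id: re-reads all 80

print(len(first), len(second), len(fresh))   # 80 0 80
```

So duplicate reads on restart typically mean the consumer is either not committing offsets before shutdown or is reconnecting with a different consumer-group identity (and an "earliest" reset policy), which is worth checking in the dataflow/data set configuration.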


Pega Platform Low-Code App Development Data Integration Decision Management