@vamsikrishnaM Could you please elaborate a bit more on this? Currently I have a data flow whose destination is the Kafka data set. Are you suggesting creating a separate rule? Would that require Pega to have consumer privileges on Kafka as well?
@ReshmiN8 Yes, if you want to read messages from the topic, you need read access configured on the Kafka side. If you just want to read messages standalone for debugging purposes, you can run the data set from the Actions menu in the data set rule form. If you want to do this programmatically, you can write an activity and use the DataSet-Execute method to read the messages and store them in a page list.
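Outside of Pega, the consume-and-record-offsets step can be sketched with plain Kafka client logic. The snippet below is a minimal Python illustration using a stand-in record type instead of a live broker; the name `read_offsets` and the `orders` topic are illustrative assumptions, not part of the Pega rule set. With a real cluster you would iterate records from a `kafka.KafkaConsumer` instead, each of which exposes the same `partition` and `offset` fields.

```python
from collections import namedtuple

# Stand-in for a consumed Kafka record; with a live broker you would iterate
# kafka.KafkaConsumer("orders", bootstrap_servers=...) and get ConsumerRecord
# objects carrying the same fields.
Record = namedtuple("Record", ["topic", "partition", "offset", "value"])

def read_offsets(records):
    """Collect (partition, offset) pairs from consumed records,
    mirroring what an activity storing them in a page list would do."""
    return [(r.partition, r.offset) for r in records]

# Stubbed records, so the sketch runs without a broker:
msgs = [Record("orders", 0, 41, b"a"), Record("orders", 1, 7, b"b")]
print(read_offsets(msgs))  # [(0, 41), (1, 7)]
```

The point of the sketch is only to show that the offset and partition are metadata on each consumed record, so a second read pass is needed purely to surface them.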
@vamsikrishnaM Thanks for your response. Yes, both options you mentioned will work and have been considered. However, imagine a batch scenario where you are sending large volumes of data and a separate activity then reads that data again just to store the offset and partition values.
From a producer's perspective, wouldn't that be a performance overhead? I was trying to check whether there is an easier option, but it looks like this is the only way forward.