Question · 9 Replies · 1252 Views
Nikhil Bhandari (NIKUBHAN), Pega Architect, PayPal, IN
Member since 2009 · 15 posts
Posted: December 17, 2018
Last activity: October 28, 2019
Closed
Solved

Error in Kafka Save Operation: Caught exception: com.pega.dsm.dnode.api.BatchRecordException

I am doing a POC to connect to Kafka from my Pega 7.3.1 application.

I created the Data Set and the Kafka configuration instance, and the Test Connection succeeded for both.

I then created an activity and used the DataSet-Execute method to perform a Save operation against an existing Kafka topic. I checked the "Save list of pages defined in a named page" checkbox and provided the name of a Code-Pega-List page (with one record in pxResults(1) containing a few properties).

When I execute the activity, I get the error stack trace below:

Caught exception: com.pega.dsm.dnode.api.BatchRecordException
com.pega.dsm.dnode.api.BatchRecordException$Builder.build(BatchRecordException.java:66)
com.pega.dsm.dnode.impl.dataset.kafka.KafkaSaveOperation$2.emit(KafkaSaveOperation.java:198)
com.pega.dsm.dnode.impl.stream.DataObservableImpl$SafeDataSubscriber.subscribe(DataObservableImpl.java:335)
com.pega.dsm.dnode.impl.stream.DataObservableImpl.subscribe(DataObservableImpl.java:53)
com.pega.dsm.dnode.impl.stream.DataObservableImpl.await(DataObservableImpl.java:99)
com.pega.dsm.dnode.impl.stream.DataObservableImpl.await(DataObservableImpl.java:88)
com.pegarules.generated.activity.sh_action_kafkaconnector_5a47491e6a6d65ed69fa439626fbfd98.step6_circum0(sh_action_kafkaconnector_5a47491e6a6d65ed69fa439626fbfd98.java:455)
com.pegarules.generated.activity.sh_action_kafkaconnector_5a47491e6a6d65ed69fa439626fbfd98.perform(sh_action_kafkaconnector_5a47491e6a6d65ed69fa439626fbfd98.java:165)
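
For comparison, the sketch below is a minimal standalone Java producer check that can be run outside Pega against the same topic; the broker address, topic name, and sample payload are placeholders, not values taken from my Data Set or Kafka configuration instance. The per-record send callback surfaces the underlying delivery error directly, which is the kind of per-record failure that appears to be wrapped in the BatchRecordException above.

// Minimal standalone producer check (a sketch; broker address, topic name,
// and payload below are placeholders, not values from the Pega Data Set).
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class KafkaTopicCheck {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker-host:9092"); // placeholder broker
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // One JSON-style record, roughly comparable to a single pxResults(1) page.
            ProducerRecord<String, String> record =
                new ProducerRecord<>("my-topic", "{\"SampleProperty\":\"value\"}"); // placeholder topic
            producer.send(record, (metadata, exception) -> {
                if (exception != null) {
                    // The underlying per-record send failure is reported here.
                    exception.printStackTrace();
                } else {
                    System.out.println("Delivered to " + metadata.topic()
                        + " partition " + metadata.partition()
                        + " offset " + metadata.offset());
                }
            });
            producer.flush();
        }
    }
}

If this standalone send succeeds, the broker and topic are reachable and the problem is more likely in how the Pega record pages are mapped to Kafka messages; if it fails, the callback output should show the broker-side cause.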

***Edited by Moderator Marissa to update SR Details***

Data Integration Support Case Parallel
The Moderation Team has archived this post. This thread is closed to future replies; content and links will no longer be updated. If you have the same or a similar question, please write a new question.