Question

29 Views
AishwaryaBadhe Member since 2016 3 posts
John Deere
Posted: May 28, 2020
Last activity: June 11, 2020

Issue fetching 100M records from DB2 using a data set

We are trying to fetch more than 100M records from an on-premises DB2 database table and store them as a .CSV file on AWS S3 (our Pega instance runs on AWS). We run a data flow with one data set to fetch the DB2 records and another data set to write the data to S3. The data flow runs for about an hour and then fails with the error below. Can someone please help resolve this issue?

Error on Screen - com.pega.dsm.dnode.impl.dataflow.resilience.policy.MaxFailedRecordsPerRunPolicy$ErrorThresholdReachedException: The run failed, because it exceeds the maximum number of failed record, which is currently set to 0


Error in Log-

Caused by: com.pega.pegarules.pub.database.DatabaseException: Unable to query the database: code: -905 SQLState: 57014 Message: DB2 SQL Error: SQLCODE=-905, SQLSTATE=57014, SQLERRMC=ASUTIME;000000000041;000002000000;SYSIBM.DSNRLST01, DRIVER=4.19.26 

DatabaseException caused by prior exception: com.ibm.db2.jcc.am.SqlException: DB2 SQL Error: SQLCODE=-905, SQLSTATE=57014, SQLERRMC=ASUTIME;000000000041;000002000000;SYSIBM.DSNRLST01, DRIVER=4.19.26 

| SQL Code: -905 | SQL State: 57014 
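For context on the log above: SQLCODE -905 (SQLSTATE 57014) means DB2's Resource Limit Facility (RLF) cancelled the statement because it exceeded the ASUTIME (CPU service unit) limit defined in the resource limit table named in the message, SYSIBM.DSNRLST01. A sketch of how a DBA might inspect that limit is below; the column names follow the standard DSNRLSTxx layout and should be verified against your own catalog before running:

```sql
-- Sketch only: inspect the DB2 resource limit table cited in the -905 error.
-- Column names assume the standard DSNRLSTxx layout; verify before use.
SELECT RLFFUNC,   -- type of limit row
       AUTHID,    -- authorization ID the limit applies to (blank = all)
       PLANNAME,  -- plan the limit applies to (blank = all)
       ASUTIME    -- CPU limit in service units (the limit that was hit)
  FROM SYSIBM.DSNRLST01;
```

If the limit is intentional, the alternative is to reduce the cost of each query, e.g. by configuring the source data set or data flow to read in smaller batches rather than one long-running scan.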


***Edited by Moderator Marissa to update Platform Capability tags****

Pega Platform 8.3.1 Data Integration