Question
3 Replies
77 Views

Pranoop Mutha (Pranoop)
Infosys, US
Member since 2019, 3 posts
Posted: 30 Dec 2019 16:18 EST
Last activity: 31 Dec 2019 10:11 EST
Closed
Archival And Deletion of Large Data from External Database

Hi Everyone,

We have a requirement to copy data older than two months from the main table into an archival table, and then delete it from the main table. This should run every week, which we can schedule through an agent. The problem is that one week of data contains roughly 1-2 million records, and the same archival has to happen for 4 different tables (one for link attachments, one for Data-WorkAttach, and 2 more). So, my questions are:

I did try simply writing an SQL query, but some say that Pega cannot handle data volumes this large and that it may slow the application down.

1) Can Pega handle this many records at a time? If so, is it advisable to do so?

2) If not, how can we achieve this through Pega?
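Whatever tool runs the job, a common way to avoid locking or slowing down the main table is to archive and delete in small committed batches rather than one giant statement. Below is a minimal, hypothetical sketch of that pattern using Python's sqlite3 with an in-memory database; the table names (`work_table`, `work_table_archive`), the `create_time` column, and the batch size are all assumptions for illustration, not Pega APIs or your actual schema.

```python
import sqlite3
from datetime import datetime, timedelta

# Hypothetical schema: a main table and an archive table with identical columns.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE work_table (id INTEGER PRIMARY KEY, create_time TEXT, payload TEXT)")
conn.execute("CREATE TABLE work_table_archive (id INTEGER PRIMARY KEY, create_time TEXT, payload TEXT)")

# Seed sample rows: two recent, three older than the 2-month (60-day) cutoff.
now = datetime(2019, 12, 30)
cutoff = (now - timedelta(days=60)).isoformat()
rows = [(i, (now - timedelta(days=d)).isoformat(), f"case-{i}")
        for i, d in enumerate([10, 20, 70, 80, 90])]
conn.executemany("INSERT INTO work_table VALUES (?, ?, ?)", rows)

BATCH = 2  # illustrative; a real job might commit every 10,000-50,000 rows

def archive_batch(conn, cutoff, batch):
    """Move up to `batch` rows older than `cutoff` to the archive, then delete them."""
    ids = [r[0] for r in conn.execute(
        "SELECT id FROM work_table WHERE create_time < ? LIMIT ?", (cutoff, batch))]
    if not ids:
        return 0
    placeholders = ",".join("?" * len(ids))
    conn.execute(f"INSERT INTO work_table_archive "
                 f"SELECT * FROM work_table WHERE id IN ({placeholders})", ids)
    conn.execute(f"DELETE FROM work_table WHERE id IN ({placeholders})", ids)
    conn.commit()  # small transactions keep locks short on the live table
    return len(ids)

total = 0
while (moved := archive_batch(conn, cutoff, BATCH)):
    total += moved
print(total)  # 3 — the three rows older than the cutoff were archived
```

Because each batch commits independently, the weekly agent can be rerun safely after an interruption: rows already moved are gone from the main table, and the loop simply continues with whatever old rows remain.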

Data Integration Java and Activities System Administration
The Moderation Team has archived this post. This thread is closed to future replies, and content and links will no longer be updated. If you have the same or a similar question, please post a new question.