Pranoop Member since 2019 3 posts
Posted: 8 months ago
Last activity: 8 months 4 weeks ago

Archival and Deletion of Large Data from an External Database

Hi Everyone,

We have a requirement to copy data that is more than 2 months old from the main table into an archival table, and then delete it from the main table. This should happen every week, which we can do through an agent. My problem is that one week of data comes to almost 1-2 million records, and the same archival has to happen for 4 different kinds (one for link attachments, one for Data-WorkAttach, and 2 more tables). So, my questions are:

I did try simply writing an SQL query, but some say that Pega cannot handle that much data at once and that it may slow the application down.
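For context, the kind of batched copy-then-delete I had in mind looks roughly like this. This is only a sketch: SQLite stands in for the external database, and the table names (`main_table`, `archive_table`), column names, cutoff date, and batch size are all placeholders, not our real schema.

```python
import sqlite3
from datetime import datetime, timedelta

# In-memory SQLite as a stand-in for the external database.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE main_table (id INTEGER PRIMARY KEY, created TEXT, payload TEXT)")
cur.execute("CREATE TABLE archive_table (id INTEGER PRIMARY KEY, created TEXT, payload TEXT)")

# Seed demo data: even ids are older than the cutoff, odd ids are recent.
cutoff = datetime(2024, 3, 1)
old = (cutoff - timedelta(days=10)).isoformat()
new = (cutoff + timedelta(days=10)).isoformat()
rows = [(i, old if i % 2 == 0 else new, f"row-{i}") for i in range(10_000)]
cur.executemany("INSERT INTO main_table VALUES (?, ?, ?)", rows)
conn.commit()

BATCH = 500  # small batches keep each transaction short so the main table is never locked for long

def archive_batch(cur, cutoff_iso, batch):
    """Copy up to `batch` expired rows into the archive, then delete them
    from the main table. Returns the number of rows moved."""
    ids = [r[0] for r in cur.execute(
        "SELECT id FROM main_table WHERE created < ? LIMIT ?", (cutoff_iso, batch))]
    if not ids:
        return 0
    marks = ",".join("?" * len(ids))
    cur.execute(f"INSERT INTO archive_table SELECT * FROM main_table WHERE id IN ({marks})", ids)
    cur.execute(f"DELETE FROM main_table WHERE id IN ({marks})", ids)
    return len(ids)

moved = 0
while True:
    n = archive_batch(cur, cutoff.isoformat(), BATCH)
    conn.commit()  # commit per batch, not once for the whole run
    if n == 0:
        break
    moved += n

print(moved)  # total rows moved to the archive
```

The idea is that instead of one huge INSERT/DELETE over 1-2 million rows, the job loops in small committed batches, so a failure mid-run loses at most one batch and the main table stays available to the application.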

1) Can Pega handle this many records at a time? If so, is it advisable to do so?

2) If not, how can we achieve this through Pega?

Data Integration Java and Activities Data Integration System Administration