Question

4 Replies · 1942 Views
Rajendra Prasad Adepalli (RajendraPrasadA3245)
United Health Group
US
Posted: March 30, 2017
Last activity: June 26, 2017
Closed
Solved

How to purge a huge amount of data from Pega work tables?

Our application is planning to purge data older than 2 years from 3 work tables, and also to implement a weekly periodic purge.

Number of rows to be purged from each work table:

1) 54,884,222
2) 25,436,271
3) 121,947

Approaches:

1) PURGE/ARCHIVE TOOL
Is this approach ideal for purging 70 million rows? I also read an article stating that it is only suitable for applications with fewer than 100K records; the Pega default batch size is 10K, and the default interval between runs is 30 days. How often can we run this activity, and how many records can we purge in a single run?

We want to know the constraints, as well as the necessary steps to follow for implementing this approach.

2) Run delete queries with frequent commits from a SQL/Oracle client.

Delete queries can be run with a commit after deleting a certain number of rows. This ensures that the UNDO tablespace, which must hold the deleted records in case the user rolls back the transaction, is not over-utilized.
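In Oracle, this batched delete-and-commit pattern can be sketched as a PL/SQL loop. This is a minimal illustration only: the table name PC_MYAPP_WORK, the batch size, and the use of the exposed pxCreateDateTime column are assumptions to be replaced with your actual work table names and retention rules.

```sql
-- Hypothetical sketch: batched delete with a commit per batch on Oracle.
-- PC_MYAPP_WORK and the 10,000-row batch size are illustrative assumptions.
DECLARE
  v_rows_deleted PLS_INTEGER;
BEGIN
  LOOP
    DELETE FROM PC_MYAPP_WORK
     WHERE PXCREATEDATETIME < ADD_MONTHS(SYSDATE, -24)  -- older than 2 years
       AND ROWNUM <= 10000;                             -- cap rows per batch
    v_rows_deleted := SQL%ROWCOUNT;  -- capture count before COMMIT resets it
    COMMIT;                          -- release undo held by this batch
    EXIT WHEN v_rows_deleted = 0;    -- stop once nothing is left to delete
  END LOOP;
END;
/
```

Committing per batch bounds the UNDO usage to roughly one batch at a time, at the cost of more commits and a longer overall run.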

3) Using a Pega agent

Create a standard Pega agent that runs periodically to execute a delete query. However, writing any Pega process or agent means creating threads within the WebSphere JVM process; DB-purging agents will eat into the heap space and other resources allotted to the JVM and to the rest of the processes.
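If such an agent is used, the per-run statement could be bounded so each cycle does a fixed amount of work. A hypothetical example, again assuming an illustrative table name PC_MYAPP_WORK and the exposed pxCreateDateTime column:

```sql
-- Hypothetical per-run statement for a weekly purge agent; table name,
-- batch cap, and retention window are illustrative assumptions.
DELETE FROM PC_MYAPP_WORK
 WHERE PXCREATEDATETIME < ADD_MONTHS(SYSDATE, -24)
   AND ROWNUM <= 10000;
COMMIT;
```

Capping the rows per run keeps each agent invocation short and predictable, rather than holding JVM threads and database undo for one very long delete.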

Please let us know the best approach to delete this huge amount of data and to implement the periodic purge.

System Administration Data Integration
Moderation Team has archived this post. This thread is closed to future replies. Content and links will no longer be updated. If you have the same/similar question, please write a new question.