We have a requirement to store all data from the Pega DB in an external DB. Most of the data falls under pyWork and includes both optimized and non-optimized properties, for cases older than 30 days. Tables matching the Pega screens, with the related columns, have already been created in the external database. I would like to know the best approach to back up the data from Pega to the external DB using an agent.
Below are different approaches that you can look into:
1. You can create a BIX extract with filter criteria (select work objects older than 30 days). In the BIX extract you can set the target to the external DB; this will move the data from your Pega DB to the target DB.
2. To back up the data to the file system, Pega provides the Purge and Archive feature, which creates a zip file in the file system containing all the data of the selected work pool. The only limitation of this feature is that it only purges work objects with status "Resolved-*".
3. You can follow the below approach from the DB side:
- Create a new table and insert into it the rows from the source table that match your filter condition (older than 30 days).
- Create a dump of this new table.
- Use the dump to restore the data into your external DB.
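The three DB-side steps above can be sketched generically. This is only a minimal sketch using SQLite standing in for both the source and external databases; the table and column names (pc_work, pxCreateDateTime, pzInsKey) follow common Pega conventions but are illustrative assumptions, not your actual schema, and in practice the "dump and restore" step would be done with your database's native dump utility rather than row copying.

```python
import sqlite3
from datetime import datetime, timedelta

# Source DB standing in for the Pega work table (schema is illustrative).
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE pc_work (pzInsKey TEXT, pxCreateDateTime TEXT, pyStatusWork TEXT)")
now = datetime(2024, 6, 1)
src.executemany("INSERT INTO pc_work VALUES (?, ?, ?)", [
    ("CASE-1", (now - timedelta(days=45)).isoformat(), "Resolved-Completed"),
    ("CASE-2", (now - timedelta(days=10)).isoformat(), "Open"),
])

# Step 1: stage the rows matching the filter (older than 30 days).
cutoff = (now - timedelta(days=30)).isoformat()
src.execute("CREATE TABLE pc_work_archive AS SELECT * FROM pc_work WHERE 0")
src.execute(
    "INSERT INTO pc_work_archive SELECT * FROM pc_work WHERE pxCreateDateTime < ?",
    (cutoff,),
)

# Steps 2-3: "dump" the staged table and restore it in the external DB
# (here a second connection plays the role of the external database).
ext = sqlite3.connect(":memory:")
ext.execute("CREATE TABLE pc_work_archive (pzInsKey TEXT, pxCreateDateTime TEXT, pyStatusWork TEXT)")
ext.executemany(
    "INSERT INTO pc_work_archive VALUES (?, ?, ?)",
    src.execute("SELECT * FROM pc_work_archive"),
)
archived = ext.execute("SELECT pzInsKey FROM pc_work_archive").fetchall()
print(archived)  # only CASE-1 is older than 30 days
```

Staging into a separate table first keeps the filter logic in one place, so the dump/restore step itself needs no WHERE clause.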
If it's only about backing up the whole Pega database, this operation should be performed at the DB level rather than making PRPC responsible for it. This avoids the additional processing the PRPC server would otherwise have to do to back up the data.
I think you need to work with your database administrator to set up database replication, which uses a mirroring technique to keep the external DB in sync.