I have a requirement to connect to an external database that is used by other systems in the organization; its data needs to be fed into Pega daily, during non-business hours. We have created the DB connection by creating an instance of the Data-Admin-DB-Name class, and the connection is successful.
My approach to the requirement was to write an agent that would hit the external DB at 11:00 PM and populate the corresponding tables in the Pega schema, which Pega can refer to later.
What is the best approach or practice for the above requirement?
What if the requirement changes in the future to fetch the external data more than once a day? In that case, I guess my approach would not be ideal from a performance standpoint.
Please help with the suggestions and guidance.
Thanks and Regards,
Hey Subhajit, if it's a purely DB-related activity where we just need the data refreshed, we can use DB tools such as SQL*Loader, Informatica, or a scheduled stored procedure to do it. A pure lift-and-shift can be done that way.
Unless you have specific business rules that need to be applied to the data being loaded, I don't think you would need to involve Pega.
The requirement is a bit tricky. They want to keep room for a future change: applying business rules during the daily data-transfer process, which is not currently the case. Initially we had also recommended something other than Pega to move the data as a batch process.
Based on the requirement and the conditions that they mentioned, it felt necessary to provide the solution in Pega.
Since you need a solution in Pega, you can keep a periodic agent running every 20 or 30 minutes. In the first step of the agent activity, validate the time (keep Dynamic System Settings for the start time and end time): if the current time is as expected (i.e. between the start and end values from the DSS), continue with the activity; otherwise skip it. If you need to process data quickly, you can run the agent on different nodes, and if different behaviour is required per node, you can base some logic or parameters on the agent node ID. Controlling the time window through DSS makes it easy to tweak during external-system downtime, or to run the agent whenever additional capacity is available.
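The time-window guard described above can be sketched as a small Java check. This is illustrative only: the class and method names are hypothetical (not Pega APIs), and it assumes the DSS start/end values have already been read elsewhere and parsed as `HH:mm` times. It also handles a window that wraps past midnight, which matters for a non-business-hours schedule like 23:00-05:00.

```java
import java.time.LocalTime;

// Hypothetical sketch of the agent's first-step time check.
// In Pega this logic would live in an activity step reading DSS values;
// here it is a plain pure function for clarity.
public class AgentTimeWindow {

    // Returns true when 'now' falls inside the [start, end) window.
    // Supports windows that cross midnight (e.g. 23:00 to 05:00).
    public static boolean isWithinWindow(LocalTime now, LocalTime start, LocalTime end) {
        if (start.isBefore(end)) {
            // Simple same-day window
            return !now.isBefore(start) && now.isBefore(end);
        }
        // Window wraps past midnight: inside if after start OR before end
        return !now.isBefore(start) || now.isBefore(end);
    }

    public static void main(String[] args) {
        LocalTime start = LocalTime.parse("23:00");
        LocalTime end = LocalTime.parse("05:00");
        System.out.println(isWithinWindow(LocalTime.parse("23:30"), start, end)); // true
        System.out.println(isWithinWindow(LocalTime.parse("12:00"), start, end)); // false
    }
}
```

If the check fails, the agent activity would simply exit without doing any work, so the periodic wake-ups outside the window stay cheap.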
I also went ahead with a similar approach of configuring an agent and running it based on condition checks, but I wanted to see whether this solution was the best one or whether there was a better alternative, since periodic agent processing will affect the performance of the application.
Hi, have you considered accessing the data via Data Pages sourced from an activity that fetches the data using RDB methods? You can control the refresh strategy of the Data Page and make your architecture closer to event-driven rather than batch, which I personally don't like.
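The key idea behind a Data Page refresh strategy is lazy reloading: the external DB is queried only when the cached copy is older than a configured age, rather than on a fixed batch schedule. A minimal sketch of that staleness check, with hypothetical names (Pega manages this internally via the Data Page's "Reload if older than" setting):

```java
import java.time.Duration;
import java.time.Instant;

// Illustrative staleness check, assuming we track when the cached
// data was last loaded. Not a Pega API; names are made up.
public class RefreshPolicy {

    // Returns true when the cached data should be reloaded from the
    // external source, i.e. it is older than the allowed maximum age.
    public static boolean needsReload(Instant loadedAt, Instant now, Duration maxAge) {
        return Duration.between(loadedAt, now).compareTo(maxAge) > 0;
    }

    public static void main(String[] args) {
        Instant loadedAt = Instant.parse("2024-01-01T00:00:00Z");
        Instant now = Instant.parse("2024-01-01T01:00:00Z");
        System.out.println(needsReload(loadedAt, now, Duration.ofMinutes(30))); // true
        System.out.println(needsReload(loadedAt, now, Duration.ofMinutes(90))); // false
    }
}
```

With this model, a request arriving during business hours serves the cached copy, and only the first request after expiry pays the cost of hitting the external DB, which avoids the constant background polling of a periodic agent.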