We need to achieve the same using Pega 7.1.5. Our scenario: we have a CSV file with thousands of rows, and for each row we need to create a work object. Since the data volume is large, this could put a heavy load on the system, so how should we handle it? Is there an option within the file listener itself, or do we need to handle it in the listener activity?
Yes, it is definitely possible to do this in Pega. You can do it using multiple batch processes running on a dedicated node. Configure one batch process (call it the master batch process) to allocate and queue work to the other batch processes based on a particular attribute, such as the number of records to allocate to each batch process.
Once records are allocated, the other batch processes pick them up from the queue and run all validations for each record. The number of batch processes can be sized based on the number of records and the time available to complete processing.
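Outside of the Pega rules themselves, the master/worker split described above can be sketched in plain Python. This is only an illustration of the pattern; the queue, batch size, and worker count are hypothetical values, not Pega settings.

```python
import csv
import io
import queue
import threading

BATCH_SIZE = 500    # records handed to each worker at a time (illustrative)
NUM_WORKERS = 4     # sized by record count and target completion time

def master(reader, work_queue):
    """Master batch process: slice the CSV into fixed-size batches and queue them."""
    batch = []
    for row in reader:
        batch.append(row)
        if len(batch) == BATCH_SIZE:
            work_queue.put(batch)
            batch = []
    if batch:
        work_queue.put(batch)
    for _ in range(NUM_WORKERS):
        work_queue.put(None)  # sentinel: tells one worker there is no more work

def worker(work_queue, results):
    """Worker batch process: validate and create a work object per record."""
    while True:
        batch = work_queue.get()
        if batch is None:
            break
        for row in batch:
            # per-record validations and work-object creation would go here
            results.append(row[0])

def run(csv_text):
    work_queue = queue.Queue()
    results = []
    workers = [threading.Thread(target=worker, args=(work_queue, results))
               for _ in range(NUM_WORKERS)]
    for w in workers:
        w.start()
    master(csv.reader(io.StringIO(csv_text)), work_queue)
    for w in workers:
        w.join()
    return results
```

The key design point is that the master never processes records itself; it only partitions the file, so worker throughput scales with the number of workers you configure.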
You should have a dedicated node for these batch processes, and users should not log in to this node.
Please share the requirements in detail if you need more input on the design.
The ideal approach is to break the single large transaction into smaller ones. Have the file listener service parse the CSV and store the parsed details in a staging table by creating a data object, then schedule a standard agent to pick up each staged object and create your application-specific work object from it. Configure this agent to run every 30 seconds or 1 minute, or according to your requirements. This way you do not have to worry about locking and queuing mechanisms, as the standard agent takes care of those issues.
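As a rough illustration of this staging-table-plus-agent pattern, here is a Python/SQLite sketch. The table name, columns, and chunk size are made up for the example; they are not Pega's actual staging or agent APIs.

```python
import csv
import io
import sqlite3

def stage_requests(conn, csv_text):
    """Listener side: parse the CSV once and persist each row to a staging table."""
    conn.execute("""CREATE TABLE IF NOT EXISTS staging (
                        id INTEGER PRIMARY KEY,
                        payload TEXT,
                        status TEXT DEFAULT 'NEW')""")
    rows = [(",".join(r),) for r in csv.reader(io.StringIO(csv_text))]
    conn.executemany("INSERT INTO staging (payload) VALUES (?)", rows)
    conn.commit()

def agent_pass(conn, chunk=100):
    """Agent side: each scheduled run picks up a chunk of NEW rows, creates the
    work objects, and marks the rows processed. Returns how many were handled."""
    cur = conn.execute(
        "SELECT id, payload FROM staging WHERE status = 'NEW' LIMIT ?", (chunk,))
    picked = cur.fetchall()
    for row_id, payload in picked:
        # create the application-specific work object from `payload` here
        conn.execute("UPDATE staging SET status = 'DONE' WHERE id = ?", (row_id,))
    conn.commit()
    return len(picked)
```

Because each agent run commits a small chunk, a failure mid-file only affects that chunk, and the listener's own transaction stays short regardless of file size.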
How exactly do we parse the whole set of records into a DB table at once? All I could find in several articles is the 'record at a time' option in Service File, but that operation seems costly given the large number of records (e.g. 100,000) in the CSV file.
Assuming the data you read record by record is mapped to predefined properties in your application, you can call an activity from the file listener that does an Obj-Save for each record's data.
That would still be record at a time, right? If we have 100,000 (1 lakh) records and we process them record by record, calling an activity for each one (whether Obj-Save or anything else), it has to repeat that 100,000 times, which seems costly!
All I wanted to know is whether we can use a single parse rule to get all records into a page list; then I would look at updating those in some more efficient way.
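The parse-everything-then-bulk-write idea can be sketched outside Pega like this, again with SQLite standing in for the database and an illustrative table name and chunk size. The point is the contrast with one save-and-commit per record: the file is parsed in a single pass into an in-memory list (the rough analogue of a page list), then written in large chunks.

```python
import csv
import io
import sqlite3

def load_bulk(conn, csv_text, chunk=1000):
    """Parse the whole file into memory first (the 'page list'), then write
    in chunks with executemany instead of one INSERT + commit per record."""
    records = list(csv.reader(io.StringIO(csv_text)))   # single parse pass
    conn.execute("CREATE TABLE IF NOT EXISTS requests (payload TEXT)")
    for i in range(0, len(records), chunk):
        conn.executemany("INSERT INTO requests (payload) VALUES (?)",
                         [(",".join(r),) for r in records[i:i + chunk]])
    conn.commit()
    return len(records)
```

The trade-off to weigh is memory: holding 100,000 parsed records in a single page list may itself strain the node, which is one reason the chunked staging-table approach above is often preferred for very large files.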