Best practice for loading 70K to 100K records in a nightly batch process
I have a requirement to load 70K to 100K records every night from an external data table and create cases based on initial filter conditions; cases will be created for the majority of the records.
Once the cases are created, each case will fetch multiple data elements from multiple external data tables.
Please let me know the best practices for loading such a large volume of data in a nightly batch run. It would also be helpful if you could share your previous experience working on similar tasks.
You can try using multiple batch processes running on a dedicated node. Configure one batch process (call it the master batch process) to allocate and queue the work to the other batch processes based on a particular attribute, such as the number of records each batch process should handle.
You should have a dedicated node for these batch processes, and users should not log in to this node.
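Outside of any specific platform, the master/worker idea above can be sketched in plain Java: the "master" splits the nightly record set into chunks by record count and hands each chunk to a worker. This is a minimal sketch, not a production implementation; the class name `NightlyBatchSketch` and the `processRecord` stub (where filtering and case creation would happen) are hypothetical.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.atomic.AtomicInteger;

public class NightlyBatchSketch {
    static final int WORKERS = 4;                       // number of worker batch processes
    static final AtomicInteger processed = new AtomicInteger();

    // Hypothetical per-record work: apply the initial filter and create a case.
    static void processRecord(int recordId) {
        processed.incrementAndGet();
    }

    public static void main(String[] args) throws Exception {
        // Stand-in for the nightly extract of 100K record IDs.
        List<Integer> recordIds = new ArrayList<>();
        for (int i = 0; i < 100_000; i++) recordIds.add(i);

        // "Master": partition by record count and queue each chunk to a worker.
        ExecutorService workers = Executors.newFixedThreadPool(WORKERS);
        int chunkSize = (recordIds.size() + WORKERS - 1) / WORKERS;
        List<Future<?>> futures = new ArrayList<>();
        for (int start = 0; start < recordIds.size(); start += chunkSize) {
            final List<Integer> chunk =
                recordIds.subList(start, Math.min(start + chunkSize, recordIds.size()));
            futures.add(workers.submit(
                () -> chunk.forEach(NightlyBatchSketch::processRecord)));
        }
        for (Future<?> f : futures) f.get();            // wait for all workers to finish
        workers.shutdown();
        System.out.println("processed=" + processed.get());
    }
}
```

The same partitioning scheme applies whether the workers are threads, separate agents, or separate batch requestors; the key point is that only the master decides the allocation, so chunks never overlap.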
Below are also a few points regarding load testing that might be of help to you.