Hi folks, I have been working on a requirement where I have to process a large number of records, say a PriceList, and the volume can go up to 25K records. After developing the solution, while unit testing to check the threshold of the code, I noticed that some records were missing from PriceList.pxResults beyond a certain point. On closer inspection I realized that pxResults stops holding records once it reaches a count of 10K.
Is there any way to accommodate all the results in the page list without exhausting or altering pxResults? I am open to changing the design as well. Right now I am fetching the data from an external database via RDB query (not in one go, obviously) and appending it to a clipboard page (PriceList.pxResults), and I must show these PriceList records on the UI for further processing.
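For context, the batched pull-and-append pattern described above can be sketched roughly as follows. This is a generic illustration, not Pega code: sqlite3 stands in for the external database, and the `price_list` table and column names are hypothetical.

```python
import sqlite3

def fetch_all_in_batches(conn, batch_size=1000):
    """Pull rows from a (hypothetical) price_list table in fixed-size
    batches and append them to one result list, mimicking repeated
    RDB queries whose results are appended to PriceList.pxResults."""
    results = []
    offset = 0
    while True:
        rows = conn.execute(
            "SELECT id, price FROM price_list ORDER BY id LIMIT ? OFFSET ?",
            (batch_size, offset),
        ).fetchall()
        if not rows:
            break
        results.extend(rows)
        offset += batch_size
    return results

# Demo with an in-memory database holding 2,500 rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE price_list (id INTEGER PRIMARY KEY, price REAL)")
conn.executemany(
    "INSERT INTO price_list (price) VALUES (?)",
    [(i * 0.5,) for i in range(2500)],
)
all_rows = fetch_all_in_batches(conn, batch_size=1000)
print(len(all_rows))  # 2500
```

The point of the loop is that each batch stays small while the accumulated list grows, which is exactly where a silent 10K cap on the accumulating structure would bite.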
To answer your question: no. I am using pagination to show records across multiple pages, but the records I display on the UI are based on the offers selected on a previous page, and only for those offers do I pull and prepare the price list page list.
Yes, there is functionality where a user may duplicate an individual row on the UI or, if they prefer, download the data to an Excel file, edit it offline, and then come back and upload it for further processing. Ultimately that data is stored in an S3 bucket for further use.
Posted: 12 Jul 2020 1:09 EDT
John Paul Raja Christu Raja (JohnPaulRaja,C)
Thanks, John. This is informative, and I am aware of this feature in Report Definition, but for my requirement I am not using an RD; I am making use of Obj-Browse.
Actually, it was a misunderstanding on my side; I hadn't done proper research. By default, the Obj-Browse method pulls a maximum of 10K records when you don't specify MaxRecords. This can be overridden by specifying a number, or, if you set it to '0', it will pull all qualifying records. So in my case, setting the MaxRecords parameter value to '0' resolved the issue.
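The MaxRecords semantics described here can be illustrated with a small sketch. This is a plain-Python analogy, not Pega's internal implementation; the `browse` helper is hypothetical, but the rule it encodes matches the post: a positive value caps the results, and 0 disables the cap.

```python
def browse(records, max_records=10000):
    """Mimic Obj-Browse's MaxRecords behaviour: a positive value caps
    the result set, while 0 means 'return every qualifying record'."""
    if max_records == 0:
        return list(records)
    return list(records)[:max_records]

data = range(25000)
capped = browse(data)                   # default cap of 10K applies
uncapped = browse(data, max_records=0)  # cap disabled, all rows returned
print(len(capped), len(uncapped))       # 10000 25000
```

With 25K qualifying records, the default silently drops everything past 10K, which is exactly the symptom seen in pxResults.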
@RaghaV, I had a similar requirement where end users were supposed to download almost 50K records in an Excel file. Processing it in real time was a hit on performance, so we went with the Job Scheduler: whenever we identified that the record count was more than 5K, we processed it through the job scheduler and pushed the file to the end user as an email attachment.
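The routing described above, inline processing for small pulls and a deferred scheduled job above a threshold, could be sketched like so. The 5K cutoff comes from the post; the helper and callback names are hypothetical stand-ins for the real activity and scheduler hooks.

```python
THRESHOLD = 5000  # record counts above this are deferred (per the post)

def handle_export(record_count, run_now, enqueue_job):
    """Route an export request: small result sets are processed inline,
    large ones are queued for a scheduled job that emails the file."""
    if record_count > THRESHOLD:
        enqueue_job(record_count)
        return "queued"
    run_now(record_count)
    return "done"

# Demo with stand-in callables recording what happened.
queued, processed = [], []
small = handle_export(1200, processed.append, queued.append)
large = handle_export(50000, processed.append, queued.append)
print(small, large)  # done queued
```

The design choice is simply to keep the interactive request fast and predictable: anything big enough to hurt response time is handed to the scheduler and delivered asynchronously.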