We have a reference table with 50K+ rows, and at any one time we need to look up only the single row matching a primary key. Currently we use a report definition to fetch the rows and a node-level data page to read from them. Is this an efficient approach? Would an activity with Obj-Browse be better? Is it OK to keep 50K+ records in memory? Also, how do we pass a parameter from the data page to the report definition so it fetches only one row?
***Edited by Moderator Marissa to update platform capability tags***
So, to confirm: you don't think holding such a large number of records in memory is an issue?
One concern is that this could impact performance. Why do you think it should be OK? Please let me know.
Secondly, the reference table stores the details of all locations, with LocationCode as the primary key. We receive this LocationCode in the incoming JSON request, so we can pass it as a parameter. Hope that helps answer your question.
When you say memory, do you mean the clipboard? If yes, then loading 50K records at once is not the preferred approach.
Wherever you need the location details, you can reference your data page as D_LocationDetails[LocationCode: <Value>]. In the report definition, add LocationCode as a parameter and use it in an "Equals" filter condition, so only the matching row is fetched.
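To make the pattern concrete, here is a rough sketch of how the pieces fit together. The property and parameter names (LocationCode, LocationName, param.LocationCode) are assumptions based on this thread, not actual rules in your application:

```
// Data page: D_LocationDetails
//   Structure: Page
//   Scope: Node
//   Parameters: LocationCode (String)
//   Source: report definition with filter
//       .LocationCode  Equals  param.LocationCode

// Referencing the keyed page, e.g. in a data transform or property expression:
D_LocationDetails[LocationCode: .LocationCode].LocationName

// Passing the value from the incoming JSON request explicitly:
D_LocationDetails[LocationCode: Param.LocationCode].LocationName
```

With the parameter wired into the report definition's filter, each data page load retrieves only the one matching row from the database, so the 50K-row table never has to sit on the clipboard.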