I am currently optimizing a DB query that takes forever. The objective is to retrieve 5,000 to 15,000 rows from the database via Obj-Browse in an activity. Measuring performance for 6,000 rows, I noticed that the Obj-Browse itself is not what takes the most time (~5-10 seconds); it is the following "Append and map to" step, which copies the retrieved data from the result page into the work object, that takes 20-30 minutes. Performance does not scale linearly: 1,500 rows can easily be processed in about 1.5 minutes.
My next idea is to "chunk" the data so the result page won't be as big as the target page list of the work object: run Obj-Browse for only 1,000 or 1,500 rows, append the result page to my work object, and repeat until all rows are processed. As a parameter in the activity I can specify a maximum number of records for the Obj-Browse step - but how do I proceed with the next iteration? How do I use pxNextKey to start the next browse at the 1,001st row?
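To make the idea concrete, here is a generic sketch of the chunked-browse loop in plain Java. This is not Pega API code: `browseChunk` is a hypothetical stand-in for one Obj-Browse call with a max-records limit, and `lastKey` plays the role that pxNextKey would play on the result page.

```java
import java.util.ArrayList;
import java.util.List;

public class ChunkedBrowse {

    // Stand-in for one Obj-Browse call: return up to maxRecords rows
    // whose key is greater than afterKey (rows must be sorted by key).
    static List<Integer> browseChunk(List<Integer> allRows, int afterKey, int maxRecords) {
        List<Integer> chunk = new ArrayList<>();
        for (int row : allRows) {
            if (row > afterKey) {
                chunk.add(row);
                if (chunk.size() == maxRecords) break;
            }
        }
        return chunk;
    }

    // Loop: append each chunk to the target list until a chunk comes
    // back smaller than maxRecords, i.e. there are no more rows.
    static List<Integer> browseAll(List<Integer> allRows, int maxRecords) {
        List<Integer> target = new ArrayList<>();
        int lastKey = Integer.MIN_VALUE;           // role of pxNextKey
        while (true) {
            List<Integer> chunk = browseChunk(allRows, lastKey, maxRecords);
            target.addAll(chunk);                  // the "Append and map to" step
            if (chunk.size() < maxRecords) break;  // last (possibly partial) chunk
            lastKey = chunk.get(chunk.size() - 1); // resume after this key
        }
        return target;
    }

    public static void main(String[] args) {
        List<Integer> rows = new ArrayList<>();
        for (int i = 1; i <= 6000; i++) rows.add(i);
        System.out.println(browseAll(rows, 1500).size()); // 6000
    }
}
```

The key point of this pattern is that each iteration filters on "key greater than the last key seen" rather than on a row offset, so the database can resume from an index position instead of re-scanning skipped rows.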
Thanks in advance and best wishes!
>>> but the following "Append and map to" step which copies the retrieved data from the result page into the workobject (20-30 minutes!).
I'm curious what's going on in this append-and-map step. Is it opening fifteen thousand blobs? Or hitting a cascading set of declaratives? Twenty minutes sounds like too much. Can you turn on ALL event types and rulesets in your Tracer settings, trace about ten thousand Tracer lines' worth of this step, click "Save" in the Tracer, zip the result (important - it makes the file MUCH smaller), and post the zip, assuming the result is under a megabyte or so? /Eric
Further to Eric's request: can you also take a look at the ALERTs for the time period in which you ran your test and see if there are any clues there as to what is taking the time? If you are able to post the ALERT file here, please do so, and we can take a look at that as well.
While tracing I see a lot of BLOB reads, which is expected, and after this there is an alert: "Package DB Results: Packaging of database query took more than threshold of 500 ms".
Does this "packaging" refer to the database query itself, or to the processing of the database's response?
After the Obj-Browse (and its BLOB reads), the data transform, which only does an Append and map to, is executed:
Between Begin and End there are no Declare Expressions or similar (I activated all events and all rulesets in the Tracer options). I traced this with only 2,250 rows because I didn't want to crash the Tracer. Maybe I can try the full operation (38 minutes without the Tracer!) with the Tracer enabled.