How to implement transaction timeout in datasets and dataflows in Pega Marketing
Is there a way to implement timeout in datasets/flows?
The use case I am working on is this: in Pega Marketing 7.13 (on Pega 7.2), the customer data object is constructed via the CustomerData data flow. The data flow accesses ~20 tables at run time to compile customer data. Most of the time, customer data can be collected in 300-400 ms, which is fine. However, due to database-related problems, Oracle is sometimes unable to respond to PM's requests; in other words, the transaction takes longer than usual. This causes an issue in real-time interactions: since PM cannot meet its SLA (e.g. 1 second), the channel application (e.g. Internet Banking) terminates its request and propositions are not shown, i.e. the customer will lose money.
Our Oracle database administrators are continuously improving their systems, but until an ultimate solution is found, the enterprise architecture team is eager to implement a timeout on the Pega side. The expectation is to be able to set a timeout property for each data set execution. Thus, if a data set does not return a value within e.g. 50 ms, the data set would terminate the transaction, the data flow would continue, and the related property would be set to null. That would still allow us to make decisions with the data that was fetched, and would not consume or hold resources in the application server's JDBC pools.
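To illustrate the behaviour we are asking for (not an existing Pega feature), here is a minimal Java sketch of a per-fetch timeout: each fetch runs under a hard millisecond deadline, and on timeout the result is simply null so downstream logic can continue. The `fetchWithTimeout` helper and its callables are hypothetical names for illustration.

```java
import java.util.concurrent.*;

public class TimedFetch {
    private static final ExecutorService pool = Executors.newCachedThreadPool();

    // Run a single data fetch with a hard deadline; return null if it
    // does not complete in time, so the rest of the flow can continue.
    static String fetchWithTimeout(Callable<String> fetch, long timeoutMs) {
        Future<String> future = pool.submit(fetch);
        try {
            return future.get(timeoutMs, TimeUnit.MILLISECONDS);
        } catch (TimeoutException e) {
            future.cancel(true); // interrupt the slow fetch
            return null;         // property stays null, flow continues
        } catch (Exception e) {
            return null;
        }
    }

    public static void main(String[] args) throws Exception {
        // Fast fetch completes; the slow one is cut off at 50 ms.
        String fast = fetchWithTimeout(() -> "customerRow", 50);
        String slow = fetchWithTimeout(() -> {
            Thread.sleep(500);   // simulates a stalled Oracle query
            return "neverSeen";
        }, 50);
        System.out.println(fast + "," + slow);
        pool.shutdownNow();
    }
}
```

Note the caveat this thread raises: `Future.cancel(true)` only interrupts the worker thread; a JDBC call blocked inside the driver may ignore the interrupt, so the connection is not necessarily returned to the pool.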
Could you please let me know if it is possible to implement this in Pega Marketing?
***Updated by Moderator: Marissa to add Feedback Item details to post***
Thank you for the suggestion. However, the "Maximum elapsed time in seconds" attribute expects an integer, whereas the timeout we would like to implement is around 50 ms for each data fetch. The minimum value I can enter here is unfortunately 1 second.
In addition, I have more than 20 data sets in the CustomerData data flow. Rather than creating RDs, I would like to set the timeout at the data set and data flow level. Moreover, do we know what exactly happens when the "Maximum elapsed time in seconds" attribute exceeds the given number? Does it terminate the thread in the application server and/or release the resource in the JDBC pool? If JDBC resources in the application server are not released, implementing the timeout would not help us.
Please let me know if you can think of a different way of doing it.
Am I correct to summarize this as "get as much data as you can within the SLA and execute the decision on what was fetched"? If that is the case, don't you want to know when all the data was fetched versus only some of it? This will very soon get complicated.
Or do you want a default decision when some data wasn't fetched? Would this default decision not take into account any of the fetched data?
If a data flow cannot compile customer data within the defined SLA, decision execution must not start at all. Once the SLA is exceeded, the data flow execution and all related threads (e.g. data sets) must stop, and the related step in the activity should return a message (e.g. "DataFlow couldn't meet its SLA") and/or give us the option to jump to a callback-strategy-like step. That would enable us to return a default message (e.g. NoOffer) to the calling application, in other words hide what is happening in the "CRM" system, and log what actually happened.
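The whole-run behaviour described above can be sketched in plain Java as well: run the entire customer-data compilation under the SLA, and if it misses the deadline, cancel the run and fall back to a default decision. `decideWithSla` and "NoOffer" follow this thread's terminology; this is an assumption about the desired semantics, not an existing Pega API.

```java
import java.util.concurrent.*;

public class SlaFallback {
    // Run the full customer-data compilation under an SLA; on a miss,
    // stop the run and return the default decision instead of failing.
    static String decideWithSla(Callable<String> compileCustomerData, long slaMs) {
        ExecutorService pool = Executors.newSingleThreadExecutor();
        Future<String> run = pool.submit(compileCustomerData);
        try {
            String data = run.get(slaMs, TimeUnit.MILLISECONDS);
            return "Offer based on " + data;   // decision uses complete data only
        } catch (TimeoutException e) {
            run.cancel(true);                  // stop the data flow thread
            return "NoOffer";                  // callback/default decision
        } catch (Exception e) {
            return "NoOffer";
        } finally {
            pool.shutdownNow();
        }
    }

    public static void main(String[] args) {
        // Within SLA: a real decision. Over SLA: the default message.
        System.out.println(decideWithSla(() -> "customerData", 1000));
        System.out.println(decideWithSla(() -> {
            Thread.sleep(2000);                // simulates a slow compilation
            return "customerData";
        }, 100));
    }
}
```

The calling application only ever sees "Offer based on …" or "NoOffer", which hides what happened inside the "CRM" system, while the timeout path is the place to log the SLA miss.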
In the OOTB ExecuteWebContainer REST API, there seem to be two activities related to this: ExecuteSingleCaseDF and pxRunSingleCaseDDF. I am not sure whether these are the activities that need to be modified, but if so, being able to stop the execution of these steps when the SLA is exceeded is what we expect.
If a data set cannot fetch data within the defined SLA, we see two options:
1. Data set execution stops, the data flow stops, and Feature #1 kicks in.
2. The data set that could not meet its SLA stops, while data flow execution continues, hoping that the DF will still fetch customer data within the defined SLA.