What are the Pega requirements for an interaction history row of data to be saved as model last outcomes?
I wonder if anybody can shed some light on this issue.
The problem is quite simple to describe: load an original interaction history impression row; change its pyOutcome to Accepted; write it again to interaction history AND to last model responses. Do all of that by running an interaction rule, with the checkboxes on the interaction rule ticked to write to: 1. clipboard; 2. interaction history; and 3. VBD.
The row does get written to 2. interaction history, but it never appears under Designer Studio > Decisioning > Predictive Analytics > Latest responses.
The funny thing is that the original interaction history impression row is right there in "Latest responses", and it was saved by a similar interaction rule, although that row was mocked via a data transform.
I realize that if I do not clear the pxFactID from the loaded original row, it will not save to 2. interaction history; but it seems I am then missing some property value that must be cleared or set for it to be saved to the latest responses; or the latest responses view is not working (I doubt it).
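To make my working theory concrete: I suspect a response only shows up in "latest responses" when it can be joined back to a recorded impression on some set of key properties, and that clearing pxFactID (or a related key) on the re-saved row breaks that join. The sketch below is purely conceptual, NOT Pega API; the join keys (pxInteractionID, pySubjectID, pyName) are my assumption of what the matching might look like.

```python
# Conceptual sketch only -- not Pega code. Models the theory that a
# response reaches "latest responses" only if it joins back to a
# recorded impression on a set of key properties.

IMPRESSION_KEYS = ("pxInteractionID", "pySubjectID", "pyName")  # assumed join keys

def key_of(row):
    """Build the join key; a missing/cleared key property breaks the match."""
    return tuple(row.get(k) for k in IMPRESSION_KEYS)

def latest_responses(impressions, responses):
    """Keep only responses that join back to a known impression."""
    known = {key_of(r) for r in impressions}
    return [r for r in responses if key_of(r) in known]

impression = {"pxInteractionID": "1001", "pySubjectID": "CUST-1",
              "pyName": "OfferA", "pyOutcome": "Impression"}
good = dict(impression, pyOutcome="Accepted")
bad = dict(good)
bad["pxInteractionID"] = None  # e.g. a key cleared along with pxFactID on re-save

print(latest_responses([impression], [good, bad]))  # only the matching row survives
```

If this theory is right, the checklist would be: compare the re-saved row against the original property by property and confirm every identifier the join needs is still populated.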
Hence the question: what are the Pega requirements for an SR row to be saved to the latest responses? What must be set? What must not be set? Has anybody run a similar prototype that could shed some light here?
I found a recurring pattern in this issue that may point to the cause of why some responses are not being written to the "adaptive model last responses" with Accept/Reject.
The responses that are not written to the "adaptive model last responses" are always responses generated for original impressions that came from one particular interaction rule. If the original interaction/impression was generated by that interaction rule, then, no matter what, the response will not be stored. It is not stored even if I respond using data flows that worked for responses before, and it certainly does not work with interaction rules. Both response methods had been tested to work before! So it points to the original rows/interactions/offers.
I looked at the rows and they do not seem to have anything wrong with them. I highlight them (the original rows that cannot be responded to) in the attachment below. I also notice that an adaptive model was indeed running when these impressions were generated.
Actually, I notice these rows are for the web channel, a channel which has not yet been registered in Pega. Could it be that adaptive models are picky about this? Let me investigate further. Any hints are welcome; again, if I knew the requirements I could run through a troubleshooting checklist.
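For clarity, the channel hypothesis would behave like the filter below. This is a sketch of the hypothesis only, not confirmed Pega behaviour; the registered channel names are made up for illustration.

```python
# Hypothesis sketch only -- not confirmed Pega behaviour. If "latest
# responses" silently drops rows for unregistered channels, the effect
# would look like this filter:

REGISTERED_CHANNELS = {"Email", "CallCenter"}  # assumed; "Web" not yet registered

def visible_in_latest_responses(rows):
    """Drop rows whose pyChannel is not a registered channel."""
    return [r for r in rows if r.get("pyChannel") in REGISTERED_CHANNELS]

rows = [{"pyChannel": "Email", "pyOutcome": "Accepted"},
        {"pyChannel": "Web", "pyOutcome": "Accepted"}]
print(visible_in_latest_responses(rows))  # the Web row would be dropped
```

If that is what is happening, registering the web channel (or re-testing with a registered channel) should make the responses appear.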
As most people don't have Mesh accounts, do you mind sharing the correct answer here so we can mark your reply as such? This will help those who have the same question in the future and don't have access to the Mesh.
As of December 2017 the recommendation is to use data flows for everything (in the context of Pega decisioning), i.e. for both inbound and outbound. The answer from Iolanda points to examples of these integrations using data flows, and to how to save data as "model last outcomes":
Iolanda's answer, which was the best answer:
The Implementing Adaptive Decisioning course on Pega Academy focuses on adaptive learning in batch and real time using data flows. The decisioning reference application, DMSample, also has a number of examples, mostly real-time; you can explore them by adding DMSample:Administrators to your operator's access group list and switching to the DMSample application. I can provide more pointers if necessary, but to start with: 1) look at the Recommendation and Results stage of the Top Offers case type, where the process flow uses activities to trigger data flows, and 2) the headless decisioning example (artifacts under DMOrg-DMSample-Int-Decision and DMOrg-DMSample-Int-Response; the example is explained in this PDN article).
What exactly is your concern about data flows running asynchronously? There may indeed be cases where you need to check the status of the data flow (chained processing), but in general that will not be necessary. Where you do need to put some thought into designing your solution is the decision/response lifecycle: how long you need to store the responses, the origin of the response data, etc.
Regards, Andre' Cesta (text above from Iolanda da Cost Martins).