I have set up four data flows that need to run in a specific order, each starting only when the previous one has ended.
I had no problem setting up the activities to monitor, start, and stop data flows, but when I run them (on Batch nodes) I see that the first two start immediately, while the third and fourth are queued.
Is there a best practice method to have them executed as described above?
I have already thought of some workarounds, but I would like to go with a more robust and tested solution.
You can use a combination of Pega's case/process flows and their utility shapes, filters, SLAs, and wait/queue steps to build your own synchronous data flow execution.
I guess you might have used the APIs under the Data-Decision-DDF-RunOptions class to start and stop your flows. In addition to those, there are helper APIs (activities) under the Data-Decision-DDF-Progress page. The one of particular use here is pxLoadProgress, which gets the current state of the data flow you started, so you can check it before triggering the next one. See the referencing rules of those activities for sample implementations.
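To make the polling pattern concrete, here is a minimal plain-Java sketch of the idea (for illustration only; `DataFlowClient`, `start`, `loadProgress`, and the status strings are hypothetical stand-ins for the RunOptions and pxLoadProgress activities, which in practice you would invoke from utility shapes or activity steps, not from standalone Java):

```java
// Minimal sketch of "poll until the data flow run finishes".
// DataFlowClient is a hypothetical abstraction over the Pega
// activities mentioned above, not a real Pega API.
public final class DataFlowPoller {

    /** Hypothetical stand-in for starting/checking a data flow run. */
    interface DataFlowClient {
        void start(String runId);
        /** Assumed to return a status such as "In Progress", "Completed", or "Failed". */
        String loadProgress(String runId);
    }

    /** Poll the run's progress until it reaches a terminal state. */
    static String waitForCompletion(DataFlowClient client, String runId,
                                    long pollIntervalMillis) throws InterruptedException {
        while (true) {
            String status = client.loadProgress(runId);
            if ("Completed".equals(status) || "Failed".equals(status)) {
                return status;
            }
            Thread.sleep(pollIntervalMillis);
        }
    }
}
```

In the Pega solution itself, the equivalent of the sleep-and-recheck loop is the SLA-driven re-check described in step 3 below, which avoids tying up a requestor thread.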
So in summary:
1. Define a Pega work object for the end-to-end execution orchestration.
2. Use utility shapes to start your data flows (with the methods you may already be familiar with under the RunOptions class mentioned above).
3. Check the status of the kicked-off data flow using the pxLoadProgress API; if it has not yet completed, let an SLA re-check it later. Alternatively, use the data flow's post-processing activity to advance the flow action in the Pega workflow and kick off the next data flow in the sequence, which saves you from polling for the previous data flow's completion. (A plain-Java sketch of this sequencing follows this list.)
4. The work object gives you a better picture of where you are in the execution.
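As a sketch of the sequencing in step 3, building on the hypothetical `DataFlowPoller` above: each flow starts only after the previous run reports completion, which is exactly the behaviour the work object orchestration enforces.

```java
import java.util.List;

// Hypothetical end-to-end sequence, reusing the DataFlowClient and
// waitForCompletion sketch above (same package assumed).
public final class SequentialRunner {

    static void runInOrder(DataFlowPoller.DataFlowClient client,
                           List<String> runIds) throws InterruptedException {
        for (String runId : runIds) {
            client.start(runId);
            // Block until this run finishes before starting the next one.
            String status = DataFlowPoller.waitForCompletion(client, runId, 30_000L);
            if (!"Completed".equals(status)) {
                // Stop the chain on failure, mirroring the work object
                // moving to a problem/error stage.
                throw new IllegalStateException(runId + " ended with status " + status);
            }
        }
    }
}

// Usage (run IDs are illustrative):
//   SequentialRunner.runInOrder(client, List.of("DF-1", "DF-2", "DF-3", "DF-4"));
```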