Currently we use the BIX utility to send data to our data lake in batch mode, since BIX is not real time. (This is not the optimal solution, as GoldenGate can be used for real-time replication, but since the BLOB is obfuscated and only BIX can read it, we have limited choice.) We are looking to use Kafka for similar use cases:
1. Stream data out of the production database into the data lake (big data / Hadoop).
Can we use Kafka for this use case, assuming we move to 7.3, which is scheduled for next year or so?
If so, at what point in time do the Kafka data sets get created? Do they get created before the Pega engine hands control to Oracle to write the data to the database?
I haven't understood the context of creating Kafka data sets as explained below. The documentation does not explain any use case; it only gives the steps to create Kafka data sets.
What if you want to stream data out of the database using Kafka? Is this possible, considering that by the time the data is in the database it is already in the BLOB and obfuscated? In this scenario, how could we push data from the database into Kafka topics, and why does Pega obfuscate the data?
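To make the intent concrete, here is a minimal sketch of the producer side of the pipeline I have in mind, using the plain Apache Kafka Java client. The broker address, topic name, and record contents are hypothetical, and it assumes the record has already been de-obfuscated upstream (e.g., by a BIX extract), since the BLOB cannot be decoded outside Pega:

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class CaseStreamProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "kafka-broker:9092"); // hypothetical broker address
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Hypothetical payload: one JSON record per case, already extracted
            // from the obfuscated BLOB by BIX (or equivalent), keyed by case ID
            // so Hadoop-side consumers can partition by case.
            String caseId = "CASE-1001";
            String json = "{\"caseId\":\"CASE-1001\",\"status\":\"Resolved\"}";
            producer.send(new ProducerRecord<>("pega-case-events", caseId, json));
            producer.flush();
        }
    }
}
```

This is only to illustrate the shape of what we want; the open question is where in the Pega/Oracle write path such records could be produced.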
Thanks in advance, and I appreciate your time. I hope to get the answers I'm looking for.