In cloud environments, we constantly need to provision new nodes to roll out patches, configuration changes, and so on. While we want to automate node provisioning end to end, we are stuck on Stream node decommissioning, which is still a manual process.
An API to decommission Stream nodes would let us automate the entire process.
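To illustrate the ask, here is a minimal sketch of what such an automation step might look like. The host, endpoint path, and node ID are entirely hypothetical, since no such REST API exists in Pega Platform today:

```python
import urllib.parse

# Hypothetical base URL -- placeholder host; a real deployment would use
# its own Pega Platform URL.
BASE_URL = "https://pega.example.com/prweb/api/v1"

def build_decommission_request(node_id: str) -> str:
    """Compose the URL for a hypothetical Stream node decommission call.

    This endpoint does not exist today; it only sketches what an
    automatable decommission step could look like.
    """
    return f"{BASE_URL}/nodes/{urllib.parse.quote(node_id)}/stream/decommission"

# A provisioning pipeline could POST to this URL for each node being retired.
print(build_decommission_request("stream-node-01"))
```

With an endpoint like this, the provisioning pipeline could retire old Stream nodes immediately after the replacement nodes join the cluster, with no manual step.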
***Edited by Moderator Marissa to update Platform Capability tags***
Thank you for sharing your idea here in the Pega Collaboration Center (PCC)!
I have submitted this idea on your behalf in our internal system for feature enhancements and updated your post with the associated FDBK-ID. You can take this ID to your Account Executive for next steps.
@Kayla_M: Thank you for raising the enhancement request. This will help overcome this limitation of Pega Cloud.
Pega Cloud currently supports Kafka either as a server managed internally by Pega or as a managed service provided to Pega Platform. In both cases, Kafka data is not copied during the cloning operation; instead, the Kafka topics created by Stream data sets are re-created in the cloned environment on first use.