With Pega Deployment Manager 02.01.03 we are delivering a package of two products: a rule JAR (a list of all rulesets) and a data-instance archive (all data instances such as Database, DB Table, Class, etc.).
1 - When we import the rule JAR and the archive contains any aged updates, the pipeline fails with the status message (pyStatusMessage) "There are aged updates in the archive. Exiting the import process. List of aged updates".
Our current workaround is to save the list of aged rules in the development environment and then re-import using the same utility tool; the import then succeeds. Is there an alternative way to handle this?
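To script the workaround above, the first step is pulling the list of aged rules out of the failure message. This is a minimal sketch, assuming the rule keys are appended after the "List of aged updates" header, one per line; the exact message layout and the sample rule keys are assumptions, not confirmed Deployment Manager behavior.

```python
# Hypothetical helper: extract the aged-rule keys from a Deployment Manager
# pyStatusMessage so the save/re-import workaround can be scripted.
# Assumption: the keys follow the "List of aged updates" header, one per line.

def aged_updates(status_message: str) -> list[str]:
    """Return the rule keys listed after the aged-updates header, if any."""
    _, sep, tail = status_message.partition("List of aged updates")
    if not sep:
        return []  # not an aged-updates failure
    # Keep non-empty lines, dropping a bare ":" left over from the header.
    return [line.strip() for line in tail.splitlines() if line.strip().strip(":")]

# Illustrative message with made-up rule keys:
msg = (
    "There are aged updates in the archive. Exiting the import process. "
    "List of aged updates:\n"
    "RULE-OBJ-ACTIVITY MYCO-DATA VALIDATEID\n"
    "RULE-OBJ-WHEN MYCO-DATA ISVALID"
)
print(aged_updates(msg))
# prints ['RULE-OBJ-ACTIVITY MYCO-DATA VALIDATEID', 'RULE-OBJ-WHEN MYCO-DATA ISVALID']
```

The extracted keys could then be fed to whatever export step you use to save those rules in the development environment before re-importing.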
2 - When we import the data instances, we get a screen with checkboxes for Automatic and Manual. We import via Manual, skipping the automatic schema update, but Deployment Manager takes Automatic by default and reports the status as failed. Is there an alternative way to handle this?
The DSS value for "AutoDBSchemaChanges" is enabled.
The pySchemaSQL value returned with the failure is:

--execute on the database specified by Data-Admin-DB-Name PegaRULES:
ALTER TABLE MCOM_SCHEMA.pc_BGC_BE_IDSync_MCOM_Work ADD "MYFIRSTNAME" VARCHAR2 (50)
;

--execute on the database specified by Data-Admin-DB-Name PegaRULES:
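For a DBA review before applying anything manually, the pySchemaSQL payload can be split into its individual DDL statements. This is a sketch of a hypothetical helper (not part of Deployment Manager), assuming the "--execute on ..." comment lines act as separators between statements, as in the payload shown above.

```python
# Hypothetical helper: split a Deployment Manager pySchemaSQL payload into
# individual DDL statements for manual review. Assumption: each statement is
# preceded by an "--execute on the database ..." comment line, as seen above.

EXEC_MARKER = "--execute on the database specified by Data-Admin-DB-Name"

def split_schema_sql(schema_sql: str) -> list[str]:
    """Return the non-empty DDL statements found in a pySchemaSQL string."""
    statements = []
    for chunk in schema_sql.split(EXEC_MARKER):
        # Drop the remainder of the marker line (e.g. " PegaRULES:") and
        # keep only the SQL that follows it.
        _, _, body = chunk.partition("\n")
        body = body.strip().rstrip(";").strip()
        if body:
            statements.append(body)
    return statements

# The payload from this thread; the trailing marker with no SQL after it
# yields no extra statement.
payload = (
    "--execute on the database specified by Data-Admin-DB-Name PegaRULES:\n"
    'ALTER TABLE MCOM_SCHEMA.pc_BGC_BE_IDSync_MCOM_Work ADD "MYFIRSTNAME" VARCHAR2 (50)\n'
    ";\n\n"
    "--execute on the database specified by Data-Admin-DB-Name PegaRULES:\n"
)
print(split_schema_sql(payload))
```

Splitting on the comment marker (rather than on ";") avoids mis-splitting DDL that legitimately contains semicolons inside quoted strings.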
If you are getting aged updates, I would recommend trying to avoid them: avoid packaging operators, data instances, or configuration that is updated per environment. This is often a sign that too much environment-specific information is being packaged with the application.
This approach is feasible only when we handle a single release; when multiple releases are handled and deployed from the same development environment, it becomes a challenge.
AutoDBSchemaChange was set to false on our orchestration server, but that didn't help either. Pega still reports: "Schema changes need to be applied before this build on the Production stage. Once you are done, please complete the “Schema Changes” task on the Production stage to progress this build further."
Some of the SQL changes arrive as blockers. Even when we do not wish to deploy them, by design of Pega Deployment Manager the pipeline will never succeed until we apply them manually.
Please suggest if you think there is an alternative.
I am a little unclear about what the exact issue is. Are you trying to avoid having the schema applied automatically? If so, follow the instructions in Deployment Manager for disabling automatic schema application by turning off AutoDBSchemaChange; the pipeline should then simply pause, waiting for someone to manually apply the schema, after which you can resume the step. It behaves like a manual step.
If this is not what you are looking for, it would be helpful to clarify your use case.
You can choose to ignore applying the schema change and just approve the manual step. Just make sure that your application won't fail if the schema isn't applied, or that it is being taken care of in some other way.
One other thing to double-check: the status of the AutoDBSchemaChange DSS on the candidate systems. Make sure that it is set to false on the production environment as well.