Posted: 27 Apr 2017 6:07 EDT Last activity: 5 May 2017 5:53 EDT
Upgrading from single schema to split schema - Single Database Approach
We have a few questions related to this process.
We currently have a single schema that contains both rules and data.
During the upgrade we will migrate all the rules to the new rules schema. In this process, will the rule data be removed from the original schema automatically, or do we need to remove it manually after the upgrade using the Optimize Schema wizard, as mentioned in the Pega 7.2.2 Upgrade Guide?
Is this a dev environment that you can take down while the upgrade is happening? I am just wondering whether you need to get the existing environment running again after the schema name change, as some manual changes will be needed if so.
Identifying and moving just the data tables to a new schema, as in approach one, seems like more work than necessary. I would go with approach two: use a database tool to export the existing schema, then restore it under the new data schema name. Then proceed with the upgrade: migrate the rules tables, upgrade the rules, migrate/generate the rules schema objects and the links between the rules and data schemas, and upgrade the data schema. Do the cleanup of the rules and data schemas afterwards.
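The export/restore step above can be sketched as follows. This is a minimal, hypothetical example using Oracle Data Pump; the schema names (PEGARULES, PEGADATA), credentials, and the directory object are assumptions to be replaced with your own, and other databases have equivalent tools (e.g. pg_dump/pg_restore for PostgreSQL).

```shell
# Export the existing single schema (assumed name: PEGARULES).
expdp system/password DIRECTORY=dp_dir DUMPFILE=pegarules.dmp SCHEMAS=PEGARULES

# Import it back under the new data schema name (assumed: PEGADATA),
# leaving the original schema untouched for the rules-side upgrade.
impdp system/password DIRECTORY=dp_dir DUMPFILE=pegarules.dmp \
    REMAP_SCHEMA=PEGARULES:PEGADATA
```

After the copy exists under the new name, the upgrade tooling can treat one schema as rules and the other as data, and the cleanup of leftover tables on each side happens at the end.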
As mentioned above, we are changing the schema first and then proceeding with the upgrade. As part of changing the schema first, we are generating DDLs from the source schema and applying them to the target schema.
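The generate-and-apply step described above can be sketched roughly as follows. The script name (generateddl) comes from this thread; the distribution path, the properties file name, and the generated-output location are assumptions, so check your media before running anything.

```shell
# Sketch: generate DDL from the source schema using the Pega-supplied
# script (assumed to read connection settings from setupDatabase.properties).
cd /path/to/distribution/scripts    # path is an assumption
./generateddl.sh

# Review the generated SQL, then apply it to the target schema with your
# database's native client, e.g. SQL*Plus on Oracle (file name assumed):
sqlplus target_owner/password @generated_schema.sql
```

Note that, per the replies below in this thread, the generateddl output covers schema changes only; the Java-backed UDFs are handled separately.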
In this process we are having an issue while copying Java classes. Here is the syntax we used to generate the DDL for the Java class:
I am looking at this guide, and in Appendix C step 2 I do not see the lines you are referring to about a jar file. It seems you are trying to move the UDF functions from the generateddl script output. The generateddl script is for the schema changes only. You can add the UDFs to the new schema as part of the upgrade, or by using the generateudf script after the upgrade has completed.
OK, so I guess the guide is giving you some general guidelines for manually installing the UDF functions. You do not need to have the UDF functions installed in order to proceed with the upgrade. There is a separate generateudf script that you can use to generate and apply the UDF functions. If you need a DBA to install them instead of using the script, the jar file that goes along with the function signatures is in the archives/udf directory of the media.
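The two installation routes described above might look like this. Only the script name (generateudf) and the archives/udf location come from this thread; the script flags, the jar file name, and the Oracle loadjava invocation are assumptions for illustration.

```shell
# Option 1 (sketch): run the Pega-supplied generateudf script after the
# upgrade completes (assumed to read setupDatabase.properties; flags are
# an assumption -- consult your distribution's documentation).
cd /path/to/distribution/scripts    # path is an assumption
./generateudf.sh

# Option 2 (sketch): manual DBA install on Oracle -- load the UDF jar
# from the media into the rules schema (jar name is an assumption;
# check the archives/udf directory of the media for the actual file).
loadjava -user rules_owner/password -resolve /path/to/media/archives/udf/udf.jar
```

Either way, the UDFs can be added after the rest of the upgrade has finished, so their absence should not block the schema migration itself.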