Store all test cases, test suites and any test data for your application in a dedicated test ruleset
This ensures that test assets are maintained separately and can be managed independently of the application rulesets.
Avoid moving test assets to production
Because test cases are designed to generate dummy data and dummy objects, avoid running them in production. Production data is sensitive, and test cases must not tamper with it.
Configure to ignore test rulesets for guardrail score calculation
This ensures that test rules are not counted toward your application's guardrail score.
Go to Application -> Quality -> Settings and check “Ignore test rulesets when calculating guardrail score”
Test case names and descriptions should be relevant and explain the purpose of the test case
This makes it easy to understand what a test case does without opening it.
Follow a consistent naming convention for your test cases
This makes it easy to filter test cases for execution or modification during regression.
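Pega test cases are named in Dev Studio rather than in code, but the benefit of a convention is the same everywhere: selection becomes mechanical. A minimal sketch, assuming a hypothetical `<Feature>_<Scenario>_<Expected>` convention:

```python
# Hypothetical test-case names following a "<Feature>_<Scenario>_<Expected>" convention.
test_cases = [
    "Claim_SmallAmount_Approved",
    "Claim_LargeAmount_Referred",
    "Login_BadPassword_Rejected",
]

# With a consistent prefix, picking the regression set for one feature is a one-liner.
claim_regression = [name for name in test_cases if name.startswith("Claim_")]
print(claim_regression)  # the two Claim_* cases
```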
Use comments on the validations of your test case wherever required
This makes the validations easier to maintain.
Keep each test independent by using the setup and clean-up options of the test case
This ensures the test case can be executed in any environment without any changes to the test data.
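Pega's setup and clean-up steps are configured in the test case form, but the principle mirrors setUp/tearDown in code-based frameworks. A sketch of the idea in Python `unittest`, using a hypothetical in-memory store as a stand-in for environment data:

```python
import unittest

# Hypothetical in-memory store standing in for environment-specific data.
STORE = {}

class AccountTest(unittest.TestCase):
    """The test creates the data it needs and removes it afterwards,
    so it runs on any environment, in any order, with no manual prep."""

    def setUp(self):
        # Setup step: create the test data this test depends on.
        STORE["ACC-1"] = {"balance": 100}

    def tearDown(self):
        # Clean-up step: remove the test data, leaving the store as found.
        STORE.pop("ACC-1", None)

    def test_deposit_increases_balance(self):
        STORE["ACC-1"]["balance"] += 50
        self.assertEqual(STORE["ACC-1"]["balance"], 150)

suite = unittest.defaultTestLoader.loadTestsFromTestCase(AccountTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Because the test owns both creation and removal of its data, nothing it needs has to pre-exist in the environment.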
Rule- instances created by tests must be cleaned up explicitly
These are not handled by the automatic clean-up option.
When the automatic clean-up option is disabled, explicitly remove the data pages loaded as part of the test.
Example: automatic clean-up fails when there are dependent items, such as a child that must be cleaned before its parent. In such cases, disable the automatic clean-up option, write clean-up steps that handle the dependency, and also remove the data pages.
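The child-before-parent ordering described above can be sketched outside Pega as well. In Python `unittest`, `addCleanup` callbacks run in last-in, first-out order, so registering the parent's deletion first guarantees the child is removed first (all names below are hypothetical):

```python
import unittest

DB = {}  # hypothetical store: id -> record

def create(record_id, parent=None):
    if parent is not None and parent not in DB:
        raise KeyError("parent missing")
    DB[record_id] = {"parent": parent}

def delete(record_id):
    # Deleting a parent that still has children is an error,
    # mimicking the dependency that breaks automatic clean-up.
    if any(r["parent"] == record_id for r in DB.values()):
        raise RuntimeError(record_id + " still has children")
    del DB[record_id]

class DependentCleanupTest(unittest.TestCase):
    def test_child_then_parent(self):
        create("parent-1")
        self.addCleanup(delete, "parent-1")   # registered first, runs last
        create("child-1", parent="parent-1")
        self.addCleanup(delete, "child-1")    # registered last, runs first
        self.assertEqual(DB["child-1"]["parent"], "parent-1")

suite = unittest.defaultTestLoader.loadTestsFromTestCase(DependentCleanupTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Reversing the two `addCleanup` registrations would make the clean-up fail, which is exactly the trap the automatic option falls into with dependent items.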
Avoid modifying Dynamic System Settings and other settings in the prconfig file
These affect the entire system and are shared across the test systems.
Avoid updating common pages like system/global pages as part of the test case. If you have to modify them for some purpose, take a back-up using "Setup" steps and restore them using "Clean-up" steps
Because these pages are shared across the system, tests that modify them may produce invalid results.
Automatic back-up and restore of some commonly used system pages (Operator, AccessGroup, pxRequestor, and application) is handled by the framework.
Even if you back up and restore such pages, parallel execution may still produce invalid results.
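The back-up/restore pattern for a shared page can be sketched as follows (hypothetical global settings dict standing in for a system page):

```python
import copy
import unittest

# Hypothetical shared "global page" used by every test in the system.
GLOBAL_SETTINGS = {"locale": "en_US", "currency": "USD"}

class LocaleOverrideTest(unittest.TestCase):
    def setUp(self):
        # "Setup" step: take a deep back-up of the shared page before touching it.
        self._backup = copy.deepcopy(GLOBAL_SETTINGS)
        GLOBAL_SETTINGS["locale"] = "fr_FR"

    def tearDown(self):
        # "Clean-up" step: restore the shared page exactly as it was found.
        GLOBAL_SETTINGS.clear()
        GLOBAL_SETTINGS.update(self._backup)

    def test_uses_overridden_locale(self):
        self.assertEqual(GLOBAL_SETTINGS["locale"], "fr_FR")

suite = unittest.defaultTestLoader.loadTestsFromTestCase(LocaleOverrideTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Note the caveat from the text still applies: if two such tests run in parallel against the same shared state, the back-up/restore windows can interleave and produce invalid results.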
Write tests and configure genuine validations to ensure proper test coverage. Avoid dummy validations added merely to meet compliance requirements.
Consider all positive and negative cases, boundary values, and multiple paths.
Examples of invalid assertions:
A property assertion that checks only pxObjClass and adds no other value
Validating only the activity status and nothing else
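The difference between a dummy validation and genuine coverage can be illustrated with a sketch. Assuming a hypothetical `resolve_claim` function, the first test below always passes and proves nothing, while the remaining three cover the positive case, the negative case, and the boundary:

```python
import unittest

def resolve_claim(amount):
    """Hypothetical business rule: approves small claims, refers large ones."""
    status = "Approved" if amount <= 1000 else "Referred"
    return {"pxObjClass": "Work-Claim", "status": status, "amount": amount}

class ClaimResolutionTest(unittest.TestCase):
    # Dummy validation: asserts only the class name, so it passes
    # even if the business logic is completely wrong.
    def test_weak_assertion(self):
        self.assertEqual(resolve_claim(500)["pxObjClass"], "Work-Claim")

    # Genuine validations: positive case, negative case, and the boundary value.
    def test_small_claim_is_approved(self):
        self.assertEqual(resolve_claim(500)["status"], "Approved")

    def test_large_claim_is_referred(self):
        self.assertEqual(resolve_claim(1500)["status"], "Referred")

    def test_boundary_amount_is_approved(self):
        self.assertEqual(resolve_claim(1000)["status"], "Approved")

suite = unittest.defaultTestLoader.loadTestsFromTestCase(ClaimResolutionTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```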
Tests must be simple and validate specific things. Instead of creating one complex test that validates many things, create multiple focused tests.
This makes the tests easier to maintain.
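One focused test per behaviour also makes failures self-explanatory: when a step regresses, only the test for that step fails. A sketch with hypothetical case-lifecycle helpers:

```python
import unittest

# Hypothetical case lifecycle helpers.
def create_case():
    return {"status": "New"}

def submit(case):
    case["status"] = "Open"
    return case

def resolve(case):
    case["status"] = "Resolved"
    return case

# One small test per behaviour, rather than one test validating everything.
# If resolve() regresses, only test_resolve fails, pointing at the broken step.
class CaseLifecycleTest(unittest.TestCase):
    def test_create(self):
        self.assertEqual(create_case()["status"], "New")

    def test_submit(self):
        self.assertEqual(submit(create_case())["status"], "Open")

    def test_resolve(self):
        self.assertEqual(resolve(submit(create_case()))["status"], "Resolved")

suite = unittest.defaultTestLoader.loadTestsFromTestCase(CaseLifecycleTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```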
Prioritize your tests for automation
Give higher priority to automating the following:
Functionality which has predictable results
Functionality which doesn’t change frequently
Tests which need to be executed frequently