Defining Pega Scenario Tests

From PegaWiki
Revision as of 21:48, 26 August 2020 by BEAUM (talk | contribs) (added design pattern document)


Prerequisites

  1. Make sure the privilege pxScenarioTestAutomation is part of your access roles.
  2. Set the dynamic system setting (DSS) pzPegaSUT to true.
  3. An unlocked ruleset must be available in the application stack or branch; recorded test cases are saved to this ruleset.
  4. To display scenario test results in the Application Quality dashboard, the test ruleset must be available in the application stack as defined in the quality settings.
  5. Mark the ruleset that stores the test cases as a test ruleset: toggle on the Test Automation settings on the Security tab of the ruleset.
  6. Make sure all your UI components have unique data-test-ids. Applications created in a recent Pega version using out-of-the-box components generate these by default.
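Data-test-id uniqueness (item 6) can be spot-checked outside Pega by scanning the rendered portal markup for duplicates. The sketch below uses only the Python standard library; the HTML snippet and the id values are invented for illustration and are not produced by any Pega API.

```python
from html.parser import HTMLParser
from collections import Counter

class TestIdCollector(HTMLParser):
    """Collects every data-test-id attribute value found in the markup."""
    def __init__(self):
        super().__init__()
        self.ids = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name == "data-test-id" and value:
                self.ids.append(value)

def duplicate_test_ids(markup: str) -> list:
    """Return data-test-id values that occur more than once in the markup."""
    collector = TestIdCollector()
    collector.feed(markup)
    return sorted(v for v, n in Counter(collector.ids).items() if n > 1)

# Illustrative markup: two buttons accidentally share one id after a copy-paste.
page = """
<button data-test-id="20200826-submit">Submit</button>
<button data-test-id="20200826-submit">Cancel</button>
<input data-test-id="20200826-name">
"""
print(duplicate_test_ids(page))  # -> ['20200826-submit']
```

Any id this check reports would not be a unique selector at recording time, so the corresponding element would need its test ID regenerated before the scenario test is recorded.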

Best practices

  1. Scenario tests can be recorded only in the context of an application (end-user) portal, not from Dev Studio, App Studio, or any other development portal.
  2. Build a test application on top of your actual application, and add to it all the test rulesets that contain your scenario tests, PegaUnit tests, and related test data.
  3. Record slowly. Before recording the next step, wait until any page or section refresh has completed.
  4. Avoid rapid consecutive clicks: if a click triggers a refresh, wait for the refresh to complete before performing the next click.
  5. Add explicit assertions by clicking the “+” icon inside the orange highlight. An overlay then lists all available assertions, which you select from drop-downs.
  6. Create purpose-specific scenario test suites, such as a smoke test suite or a regression test suite.
  7. Disable individual scenario tests for an application so that they are not executed when the test suite runs.
  8. Keep data-test-ids unique: if you copy and paste a section, regenerate the data-test-ids in the newly created section so they remain unique.
  9. Capture customized UI controls by registering each control as a custom component in Pega-UI-Automation-Extension.
  10. Supported elements are highlighted during recording; an element that is not highlighted is not supported and will not be recorded. Unsupported out-of-the-box controls include Multiselect and Timeline view.
  11. Overridden properties: out-of-the-box controls that are re-saved under the same name to perform additional actions are highlighted during recording but do not work. Register them as custom components in Pega-UI-Automation-Extension.
  12. Debugging scenario tests:
    • Turn on debug mode and execute; you can edit the test during debugging to refine it.
    • Beyond the UI steps, review the runtime console logs, which state which steps passed or failed and why.
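When triaging a failed run, it helps to tally passed and failed steps from the console log before reading it line by line. The log line format below is invented for illustration; adapt the pattern to whatever format your runtime actually emits.

```python
import re

def summarize_steps(log_text: str) -> dict:
    """Tally PASSED/FAILED step lines in a test run log.
    The 'step N ... PASSED|FAILED' line format is an assumption, not a Pega format."""
    counts = {"PASSED": 0, "FAILED": 0}
    for line in log_text.splitlines():
        match = re.search(r"step \d+ .*\b(PASSED|FAILED)\b", line)
        if match:
            counts[match.group(1)] += 1
    return counts

# Hypothetical excerpt from a scenario test run log:
log = """\
step 1 click submit-btn PASSED
step 2 assert case-status FAILED
step 3 type name-input PASSED
"""
print(summarize_steps(log))  # -> {'PASSED': 2, 'FAILED': 1}
```

A nonzero FAILED count points you at the specific steps to replay in debug mode.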

Do’s

  1. Wait until each step appears in the recording panel on the right.
  2. If something goes wrong while recording, cancel and re-record the steps.
  3. After you finish recording, close the work item tab in the interaction portal, or record closing the tab as a step. Reset the portal by closing the interaction or the case items created.
  4. Log in and create case types manually before executing tests so that pages are cached and render quickly. This is needed after the server cache is cleared or the server is restarted.
  5. Collapse the right panel while recording if an element you need to record is hidden behind it; expand the panel again afterward so you can see the recorded steps.
  6. If hovering over an element shows that it has no unique selector, its data-test-id is not unique. Regenerating the test ID should make the element recordable.

Don’ts

  1. Do not rush through elements that involve AJAX calls, such as cascading drop-downs and refresh actions; wait for the UI to update and for the right panel to show the step.
  2. Do not use auto-fill to enter data in forms.
  3. Do not change the data-test-id of any element in an existing section; doing so fails the existing scenario tests. If a data-test-id must change, recreate or update the affected test cases.
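The risk behind the last point can be made concrete with a small check: compare the data-test-ids a test was recorded against with the ids present in the current markup, and any missing id is a step that will fail. The step lists below are invented for illustration and do not reflect how Pega actually stores recorded steps.

```python
def missing_test_ids(recorded_ids, current_ids):
    """Return recorded data-test-ids that no longer exist on the page,
    i.e. recorded steps whose selectors will fail. Inputs are plain id lists
    (an assumed representation, not Pega's storage format)."""
    return sorted(set(recorded_ids) - set(current_ids))

# Ids the test was recorded against when the section was first built:
recorded = ["case-create", "name-input", "submit-btn"]
# Ids present after someone regenerated the section's test ids:
current = ["case-create", "name-input-1", "submit-btn-1"]

print(missing_test_ids(recorded, current))  # -> ['name-input', 'submit-btn']
```

Two of the three recorded steps would break here, which is why a regenerated data-test-id always means recreating or updating the affected test cases.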

Limitations

  1. Scenario tests run only in the requester context; that is, a user must log in to Pega to run them.
  2. Unlike Selenium tests, scenario tests cannot run under different personas or logins. Because the test is tightly coupled to the requester, the run ends when the user logs out, and the same test cannot continue with a different operator.
  3. A scenario test cannot include another scenario test, unlike other test frameworks that let atomic test cases be composed into a main test case.
  4. Scenario tests are portal-dependent: once recorded on a portal, they cannot run from a different portal. They must run on the portal where they were recorded.
  5. CSS styles on hover cannot currently be captured (for example, on-hover assertions); hover actions are not supported during recording.
  6. Unlike Selenium, file upload and download are not supported, because they require interaction with the operating system.
  7. The Multiselect control and Timeline view are not supported by scenario tests.
  8. Support for reading and writing data dynamically is limited.
  9. Scenario tests do not support setup or cleanup of test data.
  10. For any application without UI Kit in the application stack, some out-of-the-box controls are not recorded or executed.
