Defining Pega scenario tests

Description: Best practices for defining scenario tests
Version: as of 8.4
Application: Pega Platform
Capability/Industry Area: DevOps



Scenario tests in the test pyramid

Pega scenario tests allow application developers and testers to create automated end-to-end test cases that exercise Infinity applications through the user interface. Scenario tests fit at the top of the test pyramid, where you start to trade off increasing complexity and time invested against the overall coverage achieved by the test. Scenario tests are a great way to quickly create acceptance tests for Pega Infinity applications, as a complement to a good suite of PegaUnit tests and functional tests.

Starting with 8.5.2, scenario testing is available as a separately releasable Pega Test Automation Kit on Pega Marketplace. All new enhancements to scenario testing are delivered through this component.


Prerequisites

  • Make sure the privilege pxScenarioTestAutomation is part of your access roles.
  • Make sure the DSS setting pzPegaSUT is set to true.
  • For scenario test creation, an unlocked ruleset must be available in the application stack or branch; the recorded test cases are saved in this ruleset.
  • To display the scenario test results in the Application Quality Dashboard, the test ruleset must be available in the application stack as defined in the quality settings.
  • While saving the test cases in a ruleset, make sure that the ruleset is marked as a test ruleset. To do this, switch on the Test Automation setting on the Security tab of the created ruleset.
  • Make sure that all UI components have unique data-test-ids generated. These are generated by default when an application is built on the latest version of Pega using OOTB components.
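The uniqueness requirement above can be checked mechanically against the rendered markup. A minimal sketch in Python (not a Pega API; the sample markup, IDs, and function name are illustrative) that scans an HTML fragment for duplicate data-test-id values:

```python
from collections import Counter
from html.parser import HTMLParser

class DataTestIdCollector(HTMLParser):
    """Collects every data-test-id attribute found in an HTML document."""
    def __init__(self):
        super().__init__()
        self.test_ids = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name == "data-test-id" and value:
                self.test_ids.append(value)

def find_duplicate_test_ids(html: str) -> list:
    """Return data-test-id values that appear more than once."""
    collector = DataTestIdCollector()
    collector.feed(html)
    counts = Counter(collector.test_ids)
    return sorted(tid for tid, n in counts.items() if n > 1)

# Illustrative markup: a section copied without regenerating its test IDs.
page = """
<section><input data-test-id="20190101-name" /></section>
<section><input data-test-id="20190101-name" /></section>
<section><input data-test-id="20190102-email" /></section>
"""
print(find_duplicate_test_ids(page))  # duplicates make elements unrecordable
```

Running such a check against a copied section before recording catches the copy-paste pitfall described in the best practices below.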

Best Practices

  • Scenario tests can only be captured in the context of the application portal. Scenario tests cannot be recorded from Dev Studio, App Studio, or any of the development portals.
  • Start the recording slowly. Wait until the page or section refreshes completely before recording the next step.
  • Wait for any activity associated with a user-click action to refresh before performing a subsequent click action.
  • Users can add explicit assertions by clicking the + icon within an orange highlighter. Once the user clicks the + icon, an overlay displays all the possible assertions, which are available from drop-down lists.
  • Create scenario test suites as part of the purpose-specific tests, such as smoke tests suite, regression tests suite, etc.
  • Users can disable individual scenario tests for an application so that they are not executed when the test suite runs.
  • Data-test-ids should be unique. If you copy and paste a section, make sure the data-test-ids are regenerated for the newly created section so that they remain unique.
  • Capture the custom UI controls by registering a specific control as a custom component to Pega-UI-Automation-Extension.
  • Supported elements are highlighted during the recording. If an element is not highlighted, it is not supported and will not be recorded. Unsupported OOTB controls include Multiselect and Timeline view.
  • Existing OOTB controls that are re-saved with the same name to perform additional actions are highlighted in the recording, but they do not work. To resolve this issue, register them as custom components in Pega-UI-Automation-Extension.
  • Debugging scenario tests:
    • Turn on debug mode and execute the test; you can edit steps while debugging to refine the test.
    • In addition to the UI steps, review the run-time console logs, which list the steps and detail which of them passed or failed.
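When a long run fails, the console logs above are easier to act on once summarized. A hedged sketch, assuming a hypothetical line format like `STEP 3: Assert banner ... FAILED` (the actual run-time log layout may differ; adjust the pattern accordingly):

```python
import re

# Hypothetical log line format; the real run-time console layout may differ.
STEP_PATTERN = re.compile(r"STEP (\d+): (.+?) \.\.\. (PASSED|FAILED)")

def summarize_steps(log_text: str) -> dict:
    """Bucket scenario-test steps by their recorded outcome."""
    summary = {"PASSED": [], "FAILED": []}
    for line in log_text.splitlines():
        m = STEP_PATTERN.search(line)
        if m:
            step_no, description, status = m.groups()
            summary[status].append(f"{step_no}: {description}")
    return summary

sample_log = """\
STEP 1: Open case type ... PASSED
STEP 2: Enter customer name ... PASSED
STEP 3: Assert confirmation banner ... FAILED
"""
result = summarize_steps(sample_log)
print(result["FAILED"])  # the steps to debug first
```

Starting from the first failed step narrows debugging to the page or section refresh that went wrong.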

Dos and Don'ts


Dos

  • Wait until the step updates in the right recording panel.
  • If something goes wrong while recording, cancel and rerecord steps.
  • Close the work item tab in the interaction portal after you are done with the test recording or record the closing of the tab as part of the step. Reset the portal by closing the interaction or the case items created.
  • Log in and create case types manually before executing tests so that pages are cached and render quickly. This is needed if the server cache has been cleared or the server has been restarted.
  • Collapse the right panel while recording if any element that needs to be recorded is behind the panel. After recording, expand the right panel again so that the recorded steps are visible.
  • If hovering over an element shows that it has no unique selector, the data-test-id for that element is not unique. Regenerating the test ID should make that element recordable.


Don'ts

  • Do not rush through elements that involve AJAX calls, such as cascading drop-down lists and refresh actions. Wait for the UI to update and for the right panel to refresh with the step.
  • Do not use autofill to enter data in forms.
  • Do not update data-test-ids for any element in an existing section, because doing so breaks the existing scenario tests. If you must update a data-test-id, recreate or update the affected test cases.


Limitations

  • Scenario testing can only be run in the requestor context. That is, a user has to log in to Pega Platform and run the scenario tests.
  • Unlike Selenium tests, scenario tests cannot be run with different personas or logins. Because a test is tightly coupled to the requestor, the scenario test ends when the user logs out, and the same test cannot continue with a different operator.
  • A scenario test cannot be included in another scenario test, which is generally the case with other test frameworks that allow atomic-level test cases to be included in the main test case.
  • Scenario tests are portal-dependent. That is, once recorded on a portal, they cannot be run from a different portal. They must run in the same portal on which they are recorded.
  • CSS styles on hover, such as on-hover assertions, cannot currently be captured. The on-hover action is not supported while recording a test case.
  • Unlike Selenium, file upload and download are not supported, because they require interaction with the operating system.
  • Multiselect control and Timeline view are not supported by scenario tests.
  • Support for reading and writing data dynamically is limited.
  • Scenario tests do not have support for setup or cleanup of test data.
  • For any application without a UI Kit in the application stack, some of the OOTB controls will not be recorded or executed.
  • Scenario tests can only be executed in a pipeline by using a third-party test runner such as Selenium Grid, or a supported third-party testing service such as SauceLabs, BrowserStack, or CrossBrowserTesting.
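As a rough illustration of the pipeline setup above, here is a hedged sketch of the kind of new-session payload a runner sends to a Selenium Grid hub. The capability field names follow the W3C WebDriver specification; the grid URL is a placeholder, and all Pega-specific wiring is omitted:

```python
import json

# Placeholder endpoint: substitute your own Selenium Grid hub address.
GRID_URL = "http://selenium-hub.example.internal:4444/wd/hub"

def new_session_payload(browser: str, headless: bool = True) -> str:
    """Build a W3C WebDriver new-session payload for a remote grid node."""
    always_match = {"browserName": browser}
    if browser == "chrome" and headless:
        # Vendor-prefixed option defined by ChromeDriver.
        always_match["goog:chromeOptions"] = {"args": ["--headless=new"]}
    payload = {"capabilities": {"alwaysMatch": always_match, "firstMatch": [{}]}}
    return json.dumps(payload, indent=2)

print(new_session_payload("chrome"))
# POST this body to the hub's /session endpoint to open a browser for the run.
```

The pipeline step then drives the scenario-test execution against the session the grid opens; consult your runner's documentation for the exact integration.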