Manual Test Runs

A Test Run is a collection of Test Case instances, together with the information needed to execute them: who is handling each Test Case at a given time and the state it is in, which can be untested, passed, failed, blocked, skipped, or to be retested.
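
As a rough mental model, a Test Run can be pictured as a named list of per-run Test Case instances, each with its own assignee and state. The sketch below is purely illustrative; the class and field names are hypothetical and do not reflect the product's internal data model.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import Optional


class CaseState(Enum):
    """States a Test Case instance can hold within a Test Run."""
    UNTESTED = "untested"
    PASSED = "passed"
    FAILED = "failed"
    BLOCKED = "blocked"
    SKIPPED = "skipped"
    RETEST = "to be retested"


@dataclass
class TestCaseInstance:
    """One Test Case as it exists inside a particular Test Run."""
    case_id: str
    title: str
    assignee: Optional[str] = None            # who is handling it at the moment
    state: CaseState = CaseState.UNTESTED


@dataclass
class TestRun:
    """A Test Run: a named collection of Test Case instances."""
    name: str
    cases: list[TestCaseInstance] = field(default_factory=list)
```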

Create manual Test Runs

Follow these steps to create a Test Run manually:

  1. Navigate to the project and click Test Runs on the left navigation panel.
  2. Click Create Test Run. The Create New Test Run dialog box appears, with the current date appended to the Test Run Name field.
  3. Click Update in the Test Cases field to select the Test Cases to add to the Test Run.
  4. Select the checkboxes against the required Test Cases, or select the Select All checkbox to add all the Test Cases to the Test Run.
  5. Apply the State, Owner, Priority, Tags, Test Case Type, and Automation Status filters to narrow down the Test Cases. Enable Link Tests Dynamically for dynamic selection of Test Cases.
  6. Click Select <number> Test Cases.
  7. Enter the Configurations, Description, Assign Rule, Tags, State, and Jira Issues parameters as necessary.
  8. Click Create Run.

The new Test Run appears in the Test Runs list view.

Add status

  1. Click the ID or TITLE of the Test Run to open it.
  2. From the Status dropdown menu, select the Test Run status.
  3. To add or modify a Test Case, or to link it to a Jira issue, click the Test Case ID or TITLE and perform the required action in the slide-over menu.

Add a result for each step in a Test Case

If you use the steps template for a Test Case, you can add the result of each step. This lets you track the exact step that caused the Test Case to fail.

Follow these steps to add test results to a Test Case in a Test Run:

  1. Navigate to the relevant Test Run.
  2. Locate and click the Test Case with a steps template to which you want to add step results. The Test Case details view appears.
  3. Based on the result of a step, assign the appropriate status to it. The options are:

    • Pass: The step was executed successfully without issues.
    • Fail: The step encountered issues and did not execute as expected.
    • Skip: The step was intentionally not executed.
    • Blocked: The step could not be executed due to external factors or dependencies.
    • Retest: The step needs to be executed again, typically after a failure or other issue.

You can also assign Untested to a step to reset its result.

Automatic Test Case status assignment

The status of the Test Case is automatically determined based on the outcomes of the individual steps according to a predefined result mapping logic.

| Step result | Test Case status |
| --- | --- |
| All steps are set to Pass | Passed |
| One or more steps are set to Fail | Failed |
| None of the steps have failed, but one or more are set to Blocked | Blocked |
| None of the steps have failed or are blocked, but one or more are set to Skip or Retest | In Progress |
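
The following is a minimal sketch of that mapping logic, written in Python purely for illustration; the function name and status strings are ours, not the product's internals.

```python
def derive_case_status(step_results: list[str]) -> str:
    """Illustrative mapping of step results to an overall Test Case status,
    following the precedence documented above: Fail > Blocked > Skip/Retest > Pass."""
    results = {r.lower() for r in step_results}

    if "fail" in results:
        return "Failed"
    if "blocked" in results:
        return "Blocked"
    if results & {"skip", "retest"}:
        return "In Progress"
    if results and results <= {"pass"}:
        return "Passed"
    # Combinations not covered by the documented mapping (e.g. steps still
    # untested) are treated as in progress here; this is an assumption.
    return "In Progress"


# Example: a single failed step fails the whole Test Case.
print(derive_case_status(["pass", "fail", "skip"]))  # -> Failed
```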

Add notes and attachments to Test Case result

You can add notes and attachments to the Test Case while adding the result.

  1. Select a Test Case in Test Runs.
  2. Click Add Result under the Results tab in the Test Case DETAILS slide-over.
  3. Enter the notes in Add Notes and upload a file in Attachments, then click Add Result to add the result, notes, and attachments to the Test Case.

Test Run state

You can check the state of the selected Test Run, which can be New, In Progress, Under Review, Rejected, Done, or Closed.

Dynamic selection

Dynamic Selection enhances the efficiency and comprehensiveness of creating Test Runs. When you enable Link Tests Dynamically in a Test Run, it automatically includes relevant Test Cases based on filter criteria. All future Test Cases that match the predefined filter criteria of a Test Run will be added automatically.

Dynamic Selection ensures that no relevant Test Case is skipped. This saves time and effort by eliminating the need to add each Test Case manually to a Test Run.

  • If you update a Test Case that was automatically added to a Test Run through Dynamic Selection, it remains in the Test Run even if it no longer matches the Dynamic Selection criteria.
  • You can manually add Test Cases to a Test Run even when Dynamic Selection is enabled.
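
Conceptually, Link Tests Dynamically behaves like a saved filter that is re-evaluated whenever a new Test Case is created. The sketch below illustrates that idea only; the field and function names are hypothetical, not the product's API.

```python
def matches(test_case: dict, criteria: dict) -> bool:
    """True when a Test Case satisfies every saved filter criterion."""
    return all(test_case.get(field) == value for field, value in criteria.items())


def on_test_case_created(test_case: dict, test_runs: list[dict]) -> None:
    """Add a newly created Test Case to every Test Run that links tests
    dynamically and whose saved filter criteria the case matches."""
    for run in test_runs:
        if run["link_tests_dynamically"] and matches(test_case, run["criteria"]):
            run["cases"].append(test_case)


# Example: a new Test Case tagged "smoke" is picked up by a matching run.
runs = [{"name": "Smoke run", "link_tests_dynamically": True,
         "criteria": {"tag": "smoke"}, "cases": []}]
on_test_case_created({"id": "TC-101", "tag": "smoke"}, runs)
print(len(runs[0]["cases"]))  # -> 1
```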

Create a Test Run with dynamic selection

  1. Create a new Test Run or edit the existing Test Run.
  2. Click Update in the Test Cases field.

  3. Specify the filter criteria for including Test Cases in the Test Run.

  4. Click Apply Filters.

  5. Enable Link Tests Dynamically.

  6. Click Select Test Case.

This enables the automatic inclusion of Test Cases in a Test Run based on the applied filter criteria.

When Dynamic Selection is enabled in a Test Run, the icon showing the number of selected Test Cases changes to a dynamic notification.

Clone Test Runs

To reuse an existing active or closed Test Run, clone it.

To clone a Test Run:

  1. Click the kebab menu of the Test Run you want to clone, and select Clone Run.

  2. Enter the Test Run Name.

  3. Select the test cases to include in the clone:

    • All test cases
      Shows the number of test cases in the original Test Run. By default, all of these test cases are added to the cloned Test Run.

    • By test case results
      Lets you select any one or a combination of passed, failed, retested, blocked, skipped, or untested test cases, based on their status. If you do not select any test case, an empty clone of the Test Run is created without any test cases.

  4. Select from the following options if you want to include information from the original Test Run (see the sketch after this procedure):

    • Copy Test Case Assignee to new test run
      Copies the assignees of the test cases selected for the clone. If you leave this unchecked, all test cases in the cloned Test Run are set as unassigned.

    • Copy Tags to new test run
      Copies the tags associated with the original Test Run to the clone.

    • Copy Linked Issues to the new test run
      Copies the issues linked to the original Test Run to the clone.

  5. Click Clone.

A pop-up appears, indicating that the Test Run is cloned. The cloned Test Run appears under Active Runs in the Test Runs dashboard, irrespective of whether the original Test Run is Active or Closed.
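
The copy options in step 4 can be thought of as simple flags applied while duplicating the run. The following sketch is illustrative only, uses hypothetical names, and assumes a dictionary-based representation of Test Runs.

```python
import copy


def clone_test_run(original: dict, name: str, selected_cases: list[dict],
                   copy_assignees: bool = False, copy_tags: bool = False,
                   copy_linked_issues: bool = False) -> dict:
    """Build a new Test Run from an existing one, honouring the copy options."""
    cloned_cases = []
    for case in selected_cases:
        case = copy.deepcopy(case)
        if not copy_assignees:
            case["assignee"] = None          # option unchecked -> case is unassigned
        cloned_cases.append(case)

    return {
        "name": name,
        "state": "Active",                   # clones appear under Active Runs
        "cases": cloned_cases,               # empty selection -> empty clone
        "tags": list(original.get("tags", [])) if copy_tags else [],
        "linked_issues": list(original.get("linked_issues", [])) if copy_linked_issues else [],
    }
```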

Search and filter Test Cases in Test Runs

You can search the list of test cases displayed on the dashboard using either the Test Case ID or the Title as the search parameter.

Filter Test Cases in Test Runs

Filtering for Test Cases in a Test Run allows you to narrow down Test Cases based on specific criteria. It helps you identify relevant Test Cases quickly.

Follow these steps to filter the Test Cases in a Test Run:

  1. Select a Test Run.
  2. Click Filter.
  3. Select the necessary filter criteria, such as Status, Configurations, Priority, Assignee, and Test Case Type.
  4. Click Apply.

You can see the Test Cases in the Test Run that meet your filter criteria.

You can also perform bulk actions on selected Test Cases, as sketched below:

  • Add Result to add the same result to the selected Test Cases.
  • Assign to to assign the selected Test Cases to a different user.
  • Remove from Run to remove the selected tests from the particular Test Run.
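
Each bulk action amounts to applying one change across every selected Test Case. Here is a minimal illustrative sketch, assuming a dictionary-based Test Run and hypothetical function names; this is not the product's API.

```python
from typing import Optional


def bulk_update(run: dict, selected_ids: set, *,
                result: Optional[str] = None,
                assignee: Optional[str] = None,
                remove: bool = False) -> None:
    """Apply one change (result, assignee, or removal) to all selected Test Cases."""
    remaining = []
    for case in run["cases"]:
        if case["id"] in selected_ids:
            if remove:
                continue                     # Remove from Run: drop the case
            if result is not None:
                case["status"] = result      # Add Result: same result for all
            if assignee is not None:
                case["assignee"] = assignee  # Assign to: reassign in one go
        remaining.append(case)
    run["cases"] = remaining
```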

Search and filter Test Runs

You can search the list of test runs displayed on the dashboard using either the Test Run ID or the Title as the search parameter.

Filter Test Runs in the dashboard

Applying a filter to test runs allows you to narrow down test runs based on specific criteria. It helps you identify relevant test runs quickly.

Follow these steps to filter the test runs:

  1. Select the Test Runs tab in the project dashboard.

  2. Click Filter at the top right of the screen.

  3. In the Filter Active Test Runs view, select the filter criteria, such as Created, Status of Test Case, Assigned To, and Type of Run, to narrow your search.
  4. Click Apply.

You can see the test runs that meet your filter criteria.

When you apply filters to test runs from the dashboard and the Status of Test Case criterion is used, it is also applied within individual test runs.

View Test Cases in a Test Run detail page

You can view test cases in a test run in two formats:

  • Folder view
  • Consolidated view

Folder view

The folder view is the default view and has a nested hierarchy of the main folder and sub-folders. To view the test cases linked to the test run, navigate to the individual parent folder and sub-folders. If you apply the filter, the display is refined to show only those folders containing test cases that meet the specified filter criteria.

When you search for Test Cases in a Test Run, the folder view automatically converts to a consolidated view and displays the results.

Consolidated view

The consolidated view offers a comprehensive list of all test cases from across all the folders linked to the test run.
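
In other words, the consolidated view flattens the folder hierarchy into a single list. Below is a small sketch of that flattening, assuming a simple nested dictionary structure for folders; the names are illustrative only.

```python
def consolidated_view(folder: dict) -> list[dict]:
    """Flatten a folder and all its sub-folders into one list of test cases."""
    cases = list(folder.get("cases", []))
    for sub_folder in folder.get("subfolders", []):
        cases.extend(consolidated_view(sub_folder))
    return cases


# Example: cases from a parent folder and a sub-folder are listed together.
root = {"cases": [{"id": "TC-1"}],
        "subfolders": [{"cases": [{"id": "TC-2"}], "subfolders": []}]}
print([c["id"] for c in consolidated_view(root)])  # -> ['TC-1', 'TC-2']
```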
