
Unique Test Cases widget in custom dashboards

Monitor the number of unique test cases in your test suites.

This widget tracks the growth in the number of unique test cases being run. An upward-trending graph indicates that more automation scenarios are being added and that a more complete test suite is being built.

The Unique Test Cases widget is a collection of one or more line charts in which the X-axis represents time and the Y-axis represents the number of unique test cases. Each line is a different segment that you can configure to compare different projects, builds, users, etc.

Line chart with dates on the X axis and the number of test cases on the Y axis

In the sample above, there are two segments: Segment A (yellow line) and Segment B (blue line). The number of unique test cases is much higher in Segment A most of the time, which indicates that Segment A probably covers more test case scenarios than Segment B. However, the number of unique test cases in both segments remains flat over the long term, which indicates that more automation scenarios are not being added. The spike in the number of unique test cases in Segment B on 21 January, though, warrants further analysis.

The insights from such a comparative analysis of the number of unique test cases in different segments could help improve the coverage and effectiveness of your test suites.
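To make the metric concrete, the sketch below shows one way the data behind each line could be derived: for every segment, count the distinct test case identifiers seen on each day. The record fields, segment labels, and values are hypothetical and only illustrate the idea; they are not the dashboard's internal implementation.

```python
from collections import defaultdict
from datetime import date

# Hypothetical test-run records: (run date, segment label, test case id).
# In the widget, the segment label comes from the filters you configure.
runs = [
    (date(2024, 1, 20), "Segment A", "login_test"),
    (date(2024, 1, 20), "Segment A", "checkout_test"),
    (date(2024, 1, 20), "Segment B", "login_test"),
    (date(2024, 1, 21), "Segment B", "login_test"),
    (date(2024, 1, 21), "Segment B", "search_test"),  # a new scenario produces a spike
]

def daily_unique_test_cases(records):
    """Return {segment: {day: number of distinct test cases run that day}}."""
    seen = defaultdict(lambda: defaultdict(set))
    for day, segment, test_case_id in records:
        seen[segment][day].add(test_case_id)
    return {
        segment: {day: len(cases) for day, cases in days.items()}
        for segment, days in seen.items()
    }

print(daily_unique_test_cases(runs))
# e.g. Segment A -> {20 Jan: 2}; Segment B -> {20 Jan: 1, 21 Jan: 2}
```

A repeated test case counts only once per day, so a segment's line rises only when genuinely new scenarios appear in it.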

Widget configuration - Unique Test Cases

You can configure the following options in the Unique Test Cases widget; a sketch of how these options fit together follows the list:

  • Widget name: A suitable name to easily identify the purpose of the widget.
  • Description: An optional widget description that explains the purpose in detail. A user can view this description by hovering over the info icon on the widget and gain useful context from it.
  • Chart Summary: A toggle to show or hide the chart summary, a concise banner that displays summarized information on your Unique Test Cases widget. By default, the chart summary shows the total number of unique test cases. The chart summary is available only on widgets with a single segment.
  • Segments: Add up to five segments in the Unique Test Cases widget using the Add segment option. These segments appear as separate line charts in the widget. Segments should be used along with filters. You can use various filters in each segment to compare different projects, builds, users, etc.
  • Filter: You can add a filter to include only the data you want in a particular segment. The parameters by which you can filter data are Projects, Unique Build Names, Users, Build Tags, Test Tags, Hooks Visibility, Host Names, Folder names, Device, OS, and Browser.
    You can also import filters from other widgets to avoid duplicate efforts.
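The options above can be pictured as a single configuration object. The sketch below is purely illustrative; the field names and filter keys are assumptions made for readability, not the widget's actual schema. It shows how a name, a description, the chart summary toggle, and up to five filtered segments fit together.

```python
from dataclasses import dataclass, field

@dataclass
class Segment:
    label: str
    # Hypothetical filter keys mirroring the filter parameters listed above.
    filters: dict = field(default_factory=dict)

@dataclass
class UniqueTestCasesWidget:
    name: str
    description: str = ""
    show_chart_summary: bool = True  # the summary applies only to single-segment widgets
    segments: list = field(default_factory=list)

    def add_segment(self, segment: Segment) -> None:
        if len(self.segments) >= 5:
            raise ValueError("A widget supports at most five segments")
        self.segments.append(segment)
        if len(self.segments) > 1:
            self.show_chart_summary = False  # chart summary is unavailable with multiple segments

widget = UniqueTestCasesWidget(
    name="Unique test cases per team",
    description="Compares automation coverage across two teams",
)
widget.add_segment(Segment("Team Payments", {"users": ["alice"], "build_tags": ["payments"]}))
widget.add_segment(Segment("Team Search", {"users": ["bob"], "build_tags": ["search"]}))
```

Each Segment here corresponds to one line chart in the widget, and its filters determine which test runs contribute to that line.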

Sample use cases

You can use the Unique Test Cases widget to track and compare the number of unique test cases across different parts of your testing organization. Here are a few sample use cases to get you started:

Track unique test cases in different modules or by different teams

You can configure separate segments for different modules or teams in your test suites. You can use segments in combination with the following filters to identify modules and teams:

  • Unique build names filter to identify build names that belong to a particular module or team.
  • Users filter to differentiate between team members who triggered the build.
  • Folder names filter to identify modules based on folders in your test repository.
  • Build Tags and Test Tags filters that represent team or module information.

Consider the following example in which the number of unique test cases in three modules is compared.

Unique Test Cases widget to compare modules

Here, the three line charts represent Module A (purple line), Module B (blue line), and Module C (yellow line) in a test suite. Such a graph can quickly tell you that Module A has a relatively high number of unique test cases, Module B tends to have the fewest, and Module C often has a moderate number. You can also see that while Modules A and C vary a lot in the number of unique test cases over time, Module B has a relatively stable number. In many cases, you can apply the best practices followed by top-performing teams to improve the number of unique test cases in other modules.

To create the widget above, the following sample configuration applies a different Folder names filter to each of the three segments that define Module A, Module B, and Module C.

Configuration of unique test cases widget to compare modules
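As a rough sketch of what this configuration expresses, each segment pairs a label with a Folder names filter, and only runs from matching folders count toward that segment's line. The folder paths, record fields, and matching rule (prefix matching) below are assumptions for illustration, not the dashboard's actual behavior.

```python
# Hypothetical segment definitions: one Folder names filter per module.
module_segments = {
    "Module A": {"folder_names": ["tests/module_a"]},
    "Module B": {"folder_names": ["tests/module_b"]},
    "Module C": {"folder_names": ["tests/module_c"]},
}

# Hypothetical run records: (folder containing the test, test case id).
runs = [
    ("tests/module_a/smoke", "login_test"),
    ("tests/module_a/regression", "checkout_test"),
    ("tests/module_b/smoke", "login_test"),
    ("tests/module_c/api", "rates_test"),
]

def unique_cases_per_segment(records, segments):
    """Count distinct test cases whose folder falls under each segment's filtered paths."""
    counts = {}
    for label, filters in segments.items():
        matched = {
            test_id
            for folder, test_id in records
            if any(folder.startswith(prefix) for prefix in filters["folder_names"])
        }
        counts[label] = len(matched)
    return counts

print(unique_cases_per_segment(runs, module_segments))
# {'Module A': 2, 'Module B': 1, 'Module C': 1}
```

In this sketch, login_test appears in two modules and contributes once to each segment's count, because each line is computed independently of the others.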

Track unique test cases on different platforms

You can measure the number of unique test cases across multiple device, OS, and browser combinations using the Unique Test Cases widget. To do so, configure a separate segment for each OS-device-browser combination that you want to track.
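For instance, continuing the hypothetical segment structure sketched earlier, each platform segment could carry a Device, OS, and Browser filter; only the filter keys change, and everything else about the widget stays the same. The platform values and matching rule below are illustrative assumptions.

```python
# Hypothetical segments: one Device/OS/Browser filter combination per line chart.
platform_segments = {
    "Chrome on Windows": {"os": "Windows 11", "device": "Desktop", "browser": "Chrome"},
    "Safari on macOS": {"os": "macOS 14", "device": "Desktop", "browser": "Safari"},
    "Chrome on Pixel 8": {"os": "Android 14", "device": "Pixel 8", "browser": "Chrome"},
}

def matches(run, filters):
    """True if a run record (a dict of platform fields) satisfies every filter of a segment."""
    return all(run.get(key) == value for key, value in filters.items())

run = {"os": "macOS 14", "device": "Desktop", "browser": "Safari", "test_case": "login_test"}
print([label for label, f in platform_segments.items() if matches(run, f)])
# ['Safari on macOS']
```

Counting unique test cases per platform segment then follows the same pattern as the module sketch above.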

In the following example, the number of unique test cases run on three different browsers is compared.

Unique Test Cases widget to compare browsers

Here, the three line charts represent the unique test cases in tests run on Browser A (purple line), Browser B (yellow line), and Browser C (blue line). This graph shows that the number of unique test cases run on Browser C is relatively high, while it is low on Browser B; Browser A tends to have a moderate number. Also, the number of test cases varies widely on Browsers A and C, while it remains more or less constant on Browser B. Using these insights, you can concentrate on improving the number of unique test cases on Browser B and reducing the variance on Browsers A and C.
