
Automated test runs

An automated test run is the process of recording test results from the report generated when software tools execute automated test cases in a repetitive, data-intensive manner. Automation testing is triggered manually or scheduled on events and actions. The linked test case results are then updated in the BrowserStack Test Management tool.

Supported tools for automation test runs

Test Management supports integrating automated test run results from:

JUnit-XML or BDD-JSON based report upload

Test Management supports uploading JUnit-XML or BDD-JSON based reports generated by multiple frameworks.

You can use CLI commands to upload these reports so that the results appear automatically in the Test Runs section.
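For illustration, a minimal JUnit-XML report has roughly the following shape; the suite, class, and test names here are hypothetical:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical JUnit-XML report: one suite with a passing and a failing test -->
<testsuites>
  <testsuite name="LoginSuite" tests="2" failures="1" time="3.21">
    <!-- A passing test case -->
    <testcase classname="tests.LoginTest" name="valid_login" time="1.10"/>
    <!-- A failing test case; the failure element carries the error details -->
    <testcase classname="tests.LoginTest" name="invalid_password" time="2.11">
      <failure message="Expected error banner was not shown">AssertionError: banner not visible</failure>
    </testcase>
  </testsuite>
</testsuites>
```

Each testcase element corresponds to one test result, and failure details travel in its failure element, so a report of this shape carries the per-test status information shown in the Test Runs section.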

Test Observability

BrowserStack Test Observability currently supports multiple automation test frameworks; the supported frameworks are listed below.

Test Observability’s native integration with Test Management significantly enhances the efficiency and effectiveness of test management processes. This native integration is achieved through the BrowserStack SDK and supports a wide range of testing frameworks and environments, including local devices, browsers, CI/CD pipelines, and cloud platforms.

Test Observability empowers Test Management as follows:

  • Integration with Various Testing Frameworks

    Test Observability supports automated test runs for numerous frameworks, including Node.js WebdriverIO, Java TestNG, JavaScript Cypress, Jest, Mocha, Playwright, Nightwatch.js, Java Serenity, C# NUnit, Python Pytest, and JUnit Reports. This compatibility ensures Test Management can read and display Test Observability’s build run details and the test cases executed with any of these frameworks.

  • Setup and Configuration

    The process begins with configuring the project repository. This involves verifying and updating the pom.xml file with the latest BrowserStack SDK, installing the SDK, and configuring the browserstack.yml file with details such as your BrowserStack username and access key, the build name, the project name, and a flag that enables Test Observability (see the configuration sketch after this list).

  • Accessing Test Observability data

    To access Test Observability data, log in to BrowserStack Test Management and select the project to which the test report was exported. Then, navigate to the Test Runs section and click the graph icon in the Type of Run column to view detailed observability data.


  • Integration with CI/CD for Report Generation

    To generate and export Test Observability reports, first create a CI/CD pipeline using a tool such as Jenkins, CircleCI, or Travis CI. After setting up the pipeline and pushing the codebase to a version control system such as GitHub or Bitbucket, initiate a test run through the CI/CD tool. Once the test run build is complete, the generated report can be checked in Test Management. For detailed debugging and analytics, you can navigate from the Test Runs page in Test Management to the corresponding build run in Test Observability. A minimal pipeline sketch follows this list.
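As a reference for the setup step above, here is a minimal browserstack.yml sketch; the values are placeholders, and you should confirm the exact keys against the current BrowserStack SDK documentation for your framework:

```yaml
# Minimal browserstack.yml sketch (placeholder values)
userName: YOUR_BROWSERSTACK_USERNAME
accessKey: YOUR_BROWSERSTACK_ACCESS_KEY
projectName: Demo Project   # project the results are reported under
buildName: demo-build       # build name shown in Test Runs
testObservability: true     # enable Test Observability reporting
```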
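And as a sketch of the CI/CD step described above, a declarative Jenkins pipeline for a Maven project can be as simple as the following; the stage layout and the mvn goal are illustrative assumptions, not a prescribed setup:

```groovy
// Illustrative declarative Jenkinsfile for a Maven project
pipeline {
    agent any
    stages {
        stage('Checkout') {
            steps {
                checkout scm  // pull the codebase from GitHub or Bitbucket
            }
        }
        stage('Test') {
            steps {
                // Run the tests; the BrowserStack SDK configured via
                // browserstack.yml reports results to Test Observability
                sh 'mvn test'
            }
        }
    }
}
```

Once this build completes, the run appears under Test Runs in Test Management, as described above.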

Integrating Test Observability into Test Management provides a comprehensive view of the testing process, enabling teams to track, analyze, and optimize their testing efforts more effectively.

Test Observability to Test Management mapping

Because of the native integration between Test Observability and Test Management, the folder structure and associated test cases in Test Observability are copied to Test Management. Each test case is copied along with its title, folder, execution status, execution time, configuration, and, in case of failures, the stack trace. The stack trace or error information is recorded in the response field of the corresponding test case in Test Management.

An example of the mapping between Test Observability and Test Management:

Test Observability: [Image: folder structure and test cases in Test Observability]

Test Management: [Image: the same folder structure and test cases mirrored in Test Management]
