View Test Results

View your Cypress test results on the Automate Dashboard.

Protip: In CLI v1.6.0 and later, if you run tests in sync mode, test results are automatically saved to the results folder as HTML and JSON files. You can attach these files to your CI runs.
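For example, a sync-mode run with the BrowserStack Cypress CLI (a minimal sketch, assuming the CLI is installed and a browserstack.json config already exists in your project) might look like this:

```shell
# Install the BrowserStack Cypress CLI (v1.6.0 or later)
npm install -g browserstack-cypress-cli

# Run tests in sync mode: the CLI waits for the build to finish
# and saves HTML and JSON reports to the results folder
browserstack-cypress run --sync
```

You can then upload the results folder as a build artifact using your CI system's artifact-upload step.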

Overview

After running your Cypress tests on BrowserStack, you can view the test results on the Automate dashboard.

The tests are grouped by build name and further organized into the test spec and browser combinations (OS, OS version, browser, and browser version) that you selected to run your tests on. The Automate dashboard displays test results, a video recording, and, for failed tests, a screenshot of the error.

You can view the results of your tests and debug them using the various logs accessible from the dashboard.

Build view in Automate dashboard

The Automate dashboard displays a list of builds that have run through your organization’s account, as shown in the following screenshot:

[Screenshot: Automate Dashboard build view]

  1. Sidebar: Enables quick access to all your builds. The sidebar also displays meta-information such as number of sessions, status of sessions, and project name.

  2. Builds filter: Filters your builds by user, project, framework, or status. As you apply filters, the dashboard URL changes, enabling you to share filtered views with other team members.

  3. Global search: Searches for projects, builds, and sessions by their name or IDs.

  4. Build name: Shows the build name and project name.

  5. Build level filter: Filters the sessions in a build by spec status, OS, or Browser.

  6. Build meta: Provides useful information about the build, such as build status, build duration, number of parallels used, build ID, and the name of the user who ran the build.

  7. Duration: Represents the total time taken for the Cypress build to finish on BrowserStack. Hovering over Duration displays a detailed breakdown of the elapsed time, as shown in the Build Duration screenshot. The time distribution is normalized to one parallel/thread when the build runs on multiple parallels. These details include:
    • setup_time: Time spent in setting up the machine, such as installing the required npm packages, downloading specs, setting up local connection, etc.
    • test_run_time: Time spent in running your tests.
    • misc_time: Time spent on tasks other than setup and test runs. This includes time spent in queue due to parallel unavailability, and time when parallels were idle despite being available, which happens when some threads take longer to complete while others sit idle.
  8. Parallels and queued sessions: Displays the total number of used parallels and queued sessions.

  9. Build level search: Searches specific sessions in a build using the build-level search. You can search for a session by its name or ID.

Session view in Automate dashboard

After you click a particular session in the build view, you are taken to the session details view of the Automate dashboard, as shown in the following screenshot:

[Screenshot: Automate Dashboard session view]

  1. Session name: Displays the spec name. Click Build to return to the builds view.

  2. Session meta: Provides helpful information about the session, such as Browser, OS, Duration, Local testing on/off, Session ID, timestamp, and the user’s name who executed the session.

  3. Logs: Select Text Logs, Console Logs, or Screenshots to view the steps executed in the test, the browser’s JavaScript console output, or the error screenshot captured when the test failed.

  4. Session Video: Captures a recording of the test as it runs in the session. Use this recording to jump to the precise point at which an error occurred and debug it.
    Set video to true in the cypress.json file (for Cypress 9 and earlier) or the cypress.config.js file (for Cypress 10 and later) to enable videos for your Cypress tests.
    Some points in the progress bar of the recording may be marked red or yellow, indicating issues during the test run. Each color represents the following:
    • Red:
      • BrowserStack-specific errors such as SO_Timeout
      • Console annotator marked as Error
      • Session marked as failed using JavaScript executor, API, or SDK
    • Yellow:
      • Any log that is tagged as a Warning
      • Console annotator marked as warn
      • Selenium exceptions
  5. Input Capabilities: Shows a well-formatted view of the input capabilities that you supplied. These are visible by default and are searchable using the browser’s default find feature.
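To enable the session video described in step 4, the video option must be set in your Cypress configuration. A minimal sketch for Cypress 10 and later (merge into your existing cypress.config.js; the e2e settings shown are placeholders):

```javascript
// cypress.config.js (Cypress 10 and later)
const { defineConfig } = require('cypress');

module.exports = defineConfig({
  video: true, // record a video of each spec run
  e2e: {
    // your existing e2e settings (baseUrl, specPattern, etc.) go here
  },
});
```

For Cypress 9 and earlier, set `"video": true` in cypress.json instead.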
