
Run TestCafe tests in parallel

Speed up testing by running TestCafe tests in parallel

To run your tests in different environments, add a comma-separated list of the desired environments to your command. The syntax for executing parallel tests is as follows:

testcafe "<environment-1>","<environment-2>" <test-script-name>

Following is an example of executing the test functions in test.js on two browsers in parallel:

testcafe "browserstack:chrome@84.0:Windows 10","browserstack:firefox@78.0:OS X Catalina" test.js

In the invocation above, both Chrome 84 on Windows 10 and Firefox 78 on OS X Catalina are launched at the same time. All the test functions in test.js are executed in both environments. In this case, you'd need a BrowserStack plan that supports running two or more automated tests in parallel.
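Before an invocation like this can reach BrowserStack, the browser provider needs your credentials. A minimal sketch, assuming the provider reads the standard BrowserStack environment variables (the values below are placeholders):

```shell
# Placeholders: substitute the credentials from your BrowserStack account page.
export BROWSERSTACK_USERNAME="your_username"
export BROWSERSTACK_ACCESS_KEY="your_access_key"

# With the variables exported, the parallel invocation from above runs as-is:
# testcafe "browserstack:chrome@84.0:Windows 10","browserstack:firefox@78.0:OS X Catalina" test.js
```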

Run different test functions in parallel

In the previous section, we covered running in multiple environments in parallel. However, the test functions themselves were still executed one after the other. This section shows how to run different test functions in parallel.

TestCafe provides a concurrency flag, -c n, which spawns n instances of each specified environment. These instances form a pool of browsers against which tests run concurrently: each test function runs in the first free instance, which speeds up execution. The concurrency value can be calculated using the following formula:

Concurrency = No. of parallels / browser combinations

Here, No. of parallels is the parallel limit permitted by your account; you can find it in your account profile. For example, if you have 5 parallels and want to test on Chrome 84 with Windows 10 and Firefox 78 with OS X Catalina, the concurrency value would be 5/2, rounded down to 2.
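As a quick sanity check, the division above can be done with shell arithmetic. The numbers here are the hypothetical 5-parallel plan from the example; substitute your own limits:

```shell
# Hypothetical values from the example above; substitute your own plan limits.
PARALLELS=5        # parallel limit shown in your BrowserStack account profile
COMBINATIONS=2     # e.g. Chrome 84 / Windows 10 and Firefox 78 / OS X Catalina
CONCURRENCY=$(( PARALLELS / COMBINATIONS ))  # shell integer division rounds down
echo "$CONCURRENCY"
```

The result is the value to pass to -c.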

Following is an example of concurrent testing, where 2 is the concurrency value:

testcafe "browserstack:chrome@84.0:Windows 10","browserstack:firefox@78.0:OS X Catalina" -c 2 test.js

In this case, tests are distributed across two Chrome instances, and the same tests also run across two Firefox instances.

Protip:
  • TestCafe is resource intensive. If your machine slows down (chokes up) while running TestCafe, upgrade your machine's specifications
  • A single TestCafe instance cannot handle a large number of parallel sessions (number of browsers * concurrency)
  • If your tests fail with Browser Disconnected or Session aborted errors, run multiple TestCafe instances in parallel using a custom script like the one shown below:
    testcafe "browserstack:chrome@84.0:Windows 10" test_1.js &
    testcafe "browserstack:chrome@84.0:Windows 10" test_2.js &
    testcafe "browserstack:chrome@84.0:Windows 10" test_3.js &
    testcafe "browserstack:firefox@78.0:Windows 10" test_1.js &
    testcafe "browserstack:firefox@78.0:Windows 10" test_2.js &
    testcafe "browserstack:firefox@78.0:Windows 10" test_3.js &
    testcafe "browserstack:safari@13.1:OS X Catalina" test_1.js &
    testcafe "browserstack:safari@13.1:OS X Catalina" test_2.js &
    testcafe "browserstack:safari@13.1:OS X Catalina" test_3.js &
    
  • In the example above, the test functions from test.js have been divided into 3 separate files, and 3 separate TestCafe instances invoke 3 separate Chrome 84 browsers on Windows 10, rather than using -c 3
  • The example also invokes 3 different browsers, but each in its own TestCafe instance rather than as a comma-separated list, as was shown earlier in this document
  • Running separate parallel processes with different TestCafe instances has shown better test stability, because it ensures that TestCafe itself does not become a bottleneck for your tests
  • Also, make sure that all the tests above share a single BrowserStackLocal tunnel connection, as described in the section on user-managed tunnel connections, for better test stability
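The protips above can be combined into a small wrapper script. This is only a sketch: the BrowserStackLocal daemon flags and the shared-tunnel setup follow the tunnel section referenced above, the access key is a placeholder, and RUN defaults to echo so the sketch prints the commands instead of executing them.

```shell
#!/bin/sh
# Sketch only: in real use, unset RUN so the commands actually execute.
RUN="${RUN:-echo}"

# One shared BrowserStackLocal tunnel for every process
# (see the user-managed tunnel connection section); the key is a placeholder.
$RUN ./BrowserStackLocal --key "$BROWSERSTACK_ACCESS_KEY" --daemon start

# One TestCafe instance per spec file; & backgrounds each process.
for spec in test_1.js test_2.js test_3.js; do
  $RUN testcafe "browserstack:chrome@84.0:Windows 10" "$spec" &
done
wait  # block until every backgrounded TestCafe process has exited

$RUN ./BrowserStackLocal --key "$BROWSERSTACK_ACCESS_KEY" --daemon stop
```

Repeating the loop for the Firefox and Safari environments reproduces the nine-process example above.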

Parallel testing using test scheduling

Test scheduling is a functionality supported by BrowserStack; it is not part of the standard TestCafe package. Read more about using the test-scheduling flag before proceeding.

The syntax for executing parallel tests using the --test-scheduling flag is:

./node_modules/.bin/testcafe "<environment-1>","<environment-2>" <test-script-name> --test-scheduling

Following is an example of executing TestCafe tests on Chrome 84 and Firefox 78 using test scheduling:

./node_modules/.bin/testcafe "@browserStack/browserstack:chrome@84.0:Windows 10","@browserStack/browserstack:firefox@78.0:OS X Catalina" test.js --test-scheduling

The preceding command executes each test function of test.js as a separate session on the specified BrowserStack environments (Chrome 84 and Firefox 78). In this case, you'd need a BrowserStack plan that supports running two or more automated tests in parallel.

Parallel testing using test scheduling and concurrency

Similar to the above section, you can also combine the concurrency flag with --test-scheduling as shown below:

./node_modules/.bin/testcafe "@browserStack/browserstack:chrome@84.0:Windows 10","@browserStack/browserstack:firefox@78.0:OS X Catalina" test.js -c 5 --test-scheduling

The above sample invocation will do the following:

  • Create 5 sessions each of Chrome 84 and Firefox 78, i.e. 10 sessions in total
  • Each test function in test.js will get executed once in Chrome 84 and once in Firefox 78
  • Each Chrome/Firefox session will contain only one test function’s execution (due to --test-scheduling flag)
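The session count in the first bullet is just the number of environments multiplied by the concurrency value, which can be checked with shell arithmetic:

```shell
ENVIRONMENTS=2   # Chrome 84 and Firefox 78
CONCURRENCY=5    # value passed via -c
TOTAL_SESSIONS=$(( ENVIRONMENTS * CONCURRENCY ))
echo "$TOTAL_SESSIONS sessions"
```

Keep this total within the parallel limit of your BrowserStack plan.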
