Data-driven testing
Run a test against multiple scenarios
The data-driven testing feature allows you to run a single test against multiple scenarios using different test data combinations. This is achieved by creating and using test datasets.
Examples:
- Testing a login form with various usernames and passwords to verify authentication.
- Testing a search feature with diverse keywords to validate results and handling of invalid inputs.
- Testing a registration form with varying user inputs to ensure data validation and acceptance criteria.
- Testing a shopping cart with different product combinations and quantities to assess calculations and error handling.
- Testing a checkout process with diverse payment methods and shipping addresses to verify functionality and edge cases.
Create test dataset
You can create a test dataset in one of two ways: by uploading a CSV file or by connecting to your database.
Choose the option that best aligns with your workflow and data requirements.
Upload a CSV file
- Click Test dataset from the desktop app or web page.
- Click Add Test Dataset to upload a CSV file.
- Select a CSV file containing your test dataset.
- Ensure the file meets the following requirements (see the sample file after these steps):
  - CSV format
  - Consistent number of columns in each row
  - Values not exceeding 1000 characters
  - Maximum of 100 rows and 40 columns
  - At least one row of data in addition to the header row
- Give the dataset a unique name.
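For reference, a minimal file that satisfies these requirements might look like the following. The column names and values are illustrative only; use whatever columns your test needs.

```csv
username,password,expected_result
standard_user,Secret123!,login_success
locked_user,Secret123!,account_locked
invalid_user,wrong_pass,invalid_credentials
```

The first row is the header, each subsequent row is one scenario, and every row has the same number of columns.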
Connect to your database
You can create test datasets directly from your database connection, allowing you to execute tests with dynamic data combinations.
To create a test dataset from a database:
- Click Test dataset on the left pane.
- Click + Add Test Dataset and select Create from database.
- Select a database connection from the list and click + Connect.
- Alternatively, you can click + Add new connection to configure a new one. You can also create new connections from the Database section of the Low Code Automation. For more information, refer to the Run database query document.
- Enter your SQL query to fetch the required test data (see the example query after these steps). Click Run Query to preview the query results.
- The query results display in a table. Enter a unique name and description for your dataset, then click Save dataset.
Run data-driven tests using real-time data from your database to ensure your test scenarios are always up to date.
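As an illustration, a query like the one below could populate a login-test dataset. The table and column names are hypothetical; adapt them to your own schema.

```sql
-- Hypothetical table and columns; replace with names from your schema.
SELECT username, password, expected_result
FROM qa_test_users
WHERE is_active = 1
LIMIT 100;  -- keep the result set small so the preview stays readable
```

Each row returned by the query becomes one row in the test dataset, and each selected column becomes a dataset column you can import into tests.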
Review the test dataset
Once the dataset is created successfully, you can verify the data by looking at the data table.

Click the Refresh icon to update the dataset with the latest data from your database.

To view additional information about the query, click Query details.
Optionally map a scenario name column
- Once a test dataset is created successfully, you can map any column from the dataset as the scenario name column.
- This column is used in build reports and other places as a label that represents a particular row.
- We recommend mapping the scenario name column to a unique identifier or any other column that uniquely identifies each row in your dataset.
Using test dataset in tests
You can import any column from the dataset, similar to importing variables.
A test can only import variables from a single test dataset.
- Within a test, navigate to the step details.
- Click the + button and choose Import a value from test data.
- Select or link a dataset from the list. If you do not have any dataset created, you can create a new test dataset here by uploading a CSV file.
- Choose the column you want to use from the test dataset you selected in the previous step.
Test execution with test dataset
Local execution
Local replay automatically uses the first row (excluding the header) from the dataset.
Cloud execution
- The test runs against each row of data in the test dataset.
- Build reports list each execution individually.
- Multiple executions of a test are grouped in the report and labeled with the scenario name column data previously mapped to the test.
- For datasets sourced from a database, the query executes at the start of the build to fetch the latest data for execution.