Generate test cases using AI
Learn how AI can assist you with creating test cases in BrowserStack Test Management.
Manually creating test cases can be complex and time-consuming. To simplify this task, BrowserStack Test Management offers AI-powered test case generation. Now, you can leverage generative AI to swiftly generate meaningful test cases based on the context you provide, accelerating your testing process and ensuring comprehensive coverage.
Key features
- Save time: Quickly generate test cases without the manual effort of authoring them.
- Enhanced coverage: Use AI suggestions to help ensure all scenarios are tested.
- Flexible input options: Provide context for generation through prompts, requirement files, and Jira issues.
BrowserStack does not train its AI models on the data you provide.
How to generate AI-powered test cases
- Currently, only one JIRA ID can be submitted per request.
- Context from attachments is ignored when generating test cases from a JIRA issue.
Follow these steps to generate detailed test cases based on what you want to test.
- Navigate to the test cases list view in your project and select an existing test case folder or create a new one.
- Click Generate with AI.
- On the generation screen, provide input for the AI to generate test cases. You can do this by:
- Uploading a requirements file (See annotation 1):
- Upload a requirements document (for example, a PDF file) containing detailed requirements specifications.
- This file is attached to each generated test case.
- You can provide both a prompt and a requirements document for more comprehensive input.
- Using Jira issue context (See annotation 2):
- Click on Add Jira ID.
- Enter the Host, Project and issue ID to fetch context from the issue description.
For details on how to link Jira issues, see Link Test Cases with Jira Requirement Issues.
- Using a prompt (See annotation 3):
- Enter a detailed description of the test case scenarios you need. The prompt has a character limit of 30,000 characters.
- Choosing the destination folder where accepted AI-generated test cases will be saved (See annotation 4):
- Click the folder icon next to the folder name at the bottom of the modal window.
- From the displayed folder structure, select the desired folder.
- By default, the test cases will be saved in the currently selected folder. You can change the folder at any time before generating test cases.
- Click Generate Test Cases.
- The AI generates test cases across multiple scenarios, each with its own set of draft test cases. All test cases are selected by default and categorized by scenario.
- Expand a scenario: Click the arrow (>) next to a scenario to see its individual test cases.
- Review test cases: Deselect any test case you do not need by unchecking its box.
- View test case details: Click View beside a generated test case to view its details and make inline edits.
- Add individually: Click Add next to a specific case.
- Click Add Test Cases to include the selected test cases in your suite.
- To discard and start over, click Start Over and choose any method to regenerate test cases.
Test case details are generated asynchronously and may take some time.
BrowserStack AI performs best with PRDs that are well-structured, detailed, and unambiguous. For optimal performance, adhere to the following PRD constraints:
- Maximum pages: 25
- Maximum size: 15 MB
- Supported formats: PDF, JPEG, JPG, PNG, TXT
Files that exceed these limits are rejected and test case generation will not begin. Split or compress larger PRDs before uploading.
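The size and format limits above can be checked client-side before you upload, so an oversized or unsupported file is caught early. A minimal sketch in Python: `validate_prd` is an illustrative helper, not part of any BrowserStack SDK, and the 25-page check is deliberately omitted because counting PDF pages requires a PDF parsing library.

```python
import os

# Upload limits stated in the documentation above
MAX_SIZE_BYTES = 15 * 1024 * 1024          # 15 MB
ALLOWED_EXTENSIONS = {".pdf", ".jpeg", ".jpg", ".png", ".txt"}

def validate_prd(path: str) -> list[str]:
    """Return a list of problems; an empty list means the file looks uploadable.

    Note: the 25-page limit for PDFs is not checked here, since counting
    pages requires a PDF library (e.g. pypdf).
    """
    problems = []
    ext = os.path.splitext(path)[1].lower()
    if ext not in ALLOWED_EXTENSIONS:
        problems.append(f"unsupported format: {ext or 'no extension'}")
    if os.path.getsize(path) > MAX_SIZE_BYTES:
        problems.append("file exceeds 15 MB")
    return problems
```

Running this check before clicking Generate Test Cases avoids a rejected upload and a wasted round trip.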
You have now generated AI-powered test cases, complete with detailed test case information tailored to your specific requirements and context, and added them to your test case repository.
The following image shows an example of an AI-generated test case along with its details.
Autofill test case details
Test Management’s AI assistant helps you autofill additional details like steps and expected results for new test cases and edit existing ones quickly and accurately. As you type, the AI infers patterns from your previous test cases and suggests how to complete the remaining fields.
Whether you are creating, editing, or have just generated test cases, AI can help fill in the test case details based on your existing test cases. Follow the steps below to generate and autofill test case details.
- Enter a Title for your test case and click the Autofill Details icon beside it.
As you begin to fill in test case fields, the AI offers context-aware suggestions to complete the remaining fields.
- The AI assistant fills in details such as Description, Test Steps & Expected Results, Preconditions, Priority, and Type of Test Case.
- Click Create or Update to accept and save the AI-assisted changes.