
Test result variables

Accessing detailed test result data is essential for continuous integration, real-time reporting, and automated defect management. Our webhooks provide a rich set of test result variables that capture the granular outcome of each test execution. These variables can be used to power custom dashboards, trigger alerts based on failure patterns, or update external systems with the latest test status. This guide outlines the available variables, their descriptions, and typical example values to help you build robust integrations.

Events (Trigger conditions)

Webhooks for test result variables are primarily triggered when a test execution concludes and its results are recorded in the system. These events ensure you receive immediate notifications about the outcome of your tests, enabling prompt actions based on the results.

The following actions will trigger a webhook payload containing test result variables:

  • Test results added: Triggered when a test execution finishes and its result (e.g., Passed, Failed, Skipped, Error) is logged in the system.
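The exact JSON shape of the webhook payload is not documented on this page, so the structure below is an assumed sketch in which each variable from the table appears as a top-level field. It shows how a receiver might react to a "Test results added" event, for example by flagging failures that already have linked defects:

```python
import json

# Hypothetical "Test results added" payload. The field names mirror the
# variable table below, but the exact JSON shape is an assumption.
payload = json.loads("""
{
  "TEST_STATUS": "Failed",
  "TEST_NAME": "Login with valid credentials",
  "TEST_DEFECTS": ["BUG-123"],
  "TEST_DURATION": 12500
}
""")

# React to the outcome, e.g. raise an alert for failures with linked defects.
if payload["TEST_STATUS"] == "Failed" and payload["TEST_DEFECTS"]:
    print(f"Alert: '{payload['TEST_NAME']}' failed "
          f"with linked defects {payload['TEST_DEFECTS']}")
```

In a real integration this logic would run inside your webhook endpoint after verifying the request, with the payload parsed from the HTTP request body.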
| Variable name | Description / example |
| --- | --- |
| {{TEST_STATUS}} | Final execution status of the test. Example: Passed, Failed, Skipped, Blocked |
| {{TEST_RESULT_NOTES}} | Notes or comments attached to this test result. Example: "Login button unresponsive on slow network" |
| {{TEST_DEFECTS}} | List of defects or issues linked to this test result. Example: ["BUG-123", "UI-456"] |
| {{TEST_RESULT_CUSTOM_FIELDS}} | Object of custom fields and their values defined for this test result. Example if you have a custom field named Estimate: { "Estimate": "2 hours" } |
| {{TEST_CONFIGURATION_ID}} | ID of the specific test configuration used in this test run, if applicable. |
| {{TEST_TYPE}} | Category or type of the executed test. Example: Functional, UI, API, Performance |
| {{TEST_NAME}} | Name of the individual test that was executed. Example: "Login with valid credentials" |
| {{TEST_PLATFORM}} | Platform or environment where the test ran. Example: "Chrome on Windows 10", "iOS 16 on iPhone 13" |
| {{TEST_FILE_PATH}} | File path of the test script or definition. Example: tests/auth/login_test.py |
| {{TEST_URL}} | URL linking to the detailed report or execution log for this result. |
| {{SMART_TAGGED}} | Boolean indicating if the result was automatically tagged by an intelligent system. Example: true, false |
| {{IS_AUTO_ANALYZED}} | Boolean indicating if the result underwent automatic analysis (e.g., common failure pattern detection). Example: true, false |
| {{FAILURE_CATEGORY}} | Categorized reason for failure. Example: Application Bug, Environment Issue, Test Flakiness |
| {{IS_PERFORMANCE_ANOMALY}} | Boolean indicating a performance anomaly was detected. Example: true, false |
| {{IS_MUTED}} | Boolean indicating the result is muted/ignored due to a known or pending issue. Example: true, false |
| {{TEST_TAGS}} | Tags associated with this test result or run. Example: ["smoke", "login_flow", "critical"] |
| {{RUN_COUNT}} | Number of times this specific test has executed. Example: 5 |
| {{TEST_DURATION}} | Duration of the test execution in milliseconds. Example: 12500 (for 12.5 seconds) |
| {{TEST_FAILURE_LOG}} | Log content or link to the detailed log generated on failure. This can be extensive. |
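The `{{VARIABLE}}` placeholders above are typically expanded into a notification or request-body template before delivery. As a minimal sketch (the substitution mechanics here are an illustration, not this product's implementation), a template can be rendered like this:

```python
import re

# Expand {{VARIABLE}} placeholders in a template using values from a
# webhook payload. The variable names match the table above; the values
# here are illustrative.
def render(template: str, variables: dict) -> str:
    # Unknown placeholders are left untouched so missing variables
    # are easy to spot in the rendered output.
    return re.sub(
        r"\{\{(\w+)\}\}",
        lambda m: str(variables.get(m.group(1), m.group(0))),
        template,
    )

template = "Test {{TEST_NAME}} finished with status {{TEST_STATUS}} in {{TEST_DURATION}} ms"
result = render(template, {
    "TEST_NAME": "Login with valid credentials",
    "TEST_STATUS": "Passed",
    "TEST_DURATION": 12500,
})
print(result)
```

Leaving unresolved placeholders intact (rather than substituting an empty string) makes it obvious when a template references a variable the payload did not supply.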

By leveraging these test result variables, you can build powerful custom integrations that react dynamically to your test execution outcomes. Whether you are integrating with incident management systems, data analytics platforms, or custom reporting tools, these variables provide the data points needed for comprehensive automation. If you have any further questions or require additional variables, refer to our full API documentation or contact our support team.
