
Diagnose inconsistencies in issue counts

Issue counts in accessibility tests can vary between runs. This can happen for several reasons, including website changes, dynamic content, and rule updates. This doc explains how to understand, remediate, and avoid such inconsistencies.

Inconsistencies can stem from the following causes:

Website updates

Have you updated the code or content in your website?

Root cause and explanation

If you or your team has deployed any changes to your website, the issues detected on your website can change. This could happen if there are modifications to content, HTML, CSS, or scripts.

Which products show changes to issue count after website updates

All products in BrowserStack Accessibility Testing can show changes to issue count if there were updates to your websites.

These products are:

  • Workflow Analyzer
  • Website Scanner
  • Automated tests

Remediation

Any updates to your website can result in differences in its accessibility issue count. This is an expected outcome as changes might have fixed old issues or introduced new ones.

Retry the scan to check if the issue persists.

If the issue persists, follow these steps to avoid changes to issue count:

  • Check with your team for any recent deployments, CMS content updates, or frontend changes between scans.
  • Check whether any newly reported issues are in components that were recently changed.
  • Fix any new issues by following the instructions provided.
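To spot which issues a deployment fixed or introduced, you can diff the issue lists from two scan reports. A minimal sketch follows; the report format shown (a list of dicts with `rule` and `selector` keys) is an assumption for illustration, not BrowserStack's actual export schema, so adapt the keys to your exported report files.

```python
# Compare two exported issue lists to see which issues are new or fixed
# after a website deployment. The "rule"/"selector" keys are assumed,
# not an official export format.

def diff_issues(before, after):
    """Return (fixed, introduced) issue sets keyed by (rule, selector)."""
    key = lambda issue: (issue["rule"], issue["selector"])
    old = {key(i) for i in before}
    new = {key(i) for i in after}
    return old - new, new - old

before = [
    {"rule": "image-alt", "selector": "img.hero"},
    {"rule": "color-contrast", "selector": ".nav a"},
]
after = [
    {"rule": "color-contrast", "selector": ".nav a"},
    {"rule": "label", "selector": "#search"},  # issue in a recently changed component
]

fixed, introduced = diff_issues(before, after)
print("fixed:", fixed)            # {('image-alt', 'img.hero')}
print("introduced:", introduced)  # {('label', '#search')}
```

Issues that appear only in the newer report point you at the components your recent changes touched.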

Comparing results between tools

Are the differences observed between reports from different products?

Root cause and explanation

The Workflow Analyzer captures real-time DOM states, including dynamic and user-interacted content, so it may detect more accessibility issues than static scans such as those from Website Scanner. A difference in issue count is therefore expected if you compare scans between multiple products.

Which products show changes to issue count

You might see differences in issue count between Website Scanner reports and Workflow Analyzer reports.

Are there differences between same workflow scans?

Root cause and explanation

Differences in issue counts between two Workflow Analyzer reports of apparently the same workflow can happen due to the following reasons:

  • Workflow variations: Scans using the browser extension capture real-time DOM states including dynamic and user-interacted content. Minor differences in the workflow between two scans can cause differences in issue count.

  • Scans aborted due to new page state: When page states change, the scan can take some time to complete. This is marked by the prompt “Processing changes, stay on page”. Interacting with the page or navigating to another URL while this prompt is displayed can abort the ongoing scan and affect the number of issues reported.

Screenshot of the “Processing changes, stay on page” prompt

  • Scans completed abruptly: After you click “Save Report”, it can take up to 25 seconds for the extension to complete post-processing of the scan.

Which products show changes to issue count between different scans of the same workflow

Workflow Analyzer

Remediation

Retry the scan to check if the issue persists.

If the issue persists, follow these steps to avoid changes to issue count:

  • Workflow variations - When retesting, repeat the exact same workflow so results are comparable with previous scans.

  • Scans aborted due to new page state - Wait for the “Scanner on Standby” state to appear in the extension before interacting with the page.

Screenshot of the “Scanner on Standby” state

  • Scans completed abruptly - Wait up to 25 seconds after clicking “Save Report” for post-processing to complete.
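One way to eliminate workflow variations is to encode the workflow as an explicit, ordered list of steps and replay exactly that list on every rescan. The sketch below is illustrative: the `Step`/`replay` helpers are not part of any BrowserStack API, and in practice you would wire each step's `action` callable to your real automation (for example, Selenium clicks), here stubbed with log appends.

```python
# Encode a scan workflow as ordered, named steps so every rescan replays
# the exact same interactions. Step/replay are hypothetical helpers;
# the lambda actions stand in for real browser automation calls.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Step:
    name: str
    action: Callable[[], None]

def replay(steps: List[Step]) -> List[str]:
    """Run each step in order and return the executed step names for auditing."""
    executed = []
    for step in steps:
        step.action()
        executed.append(step.name)
    return executed

log = []
workflow = [
    Step("open login page", lambda: log.append("GET /login")),
    Step("submit credentials", lambda: log.append("POST /session")),
    Step("open dashboard", lambda: log.append("GET /dashboard")),
]

print(replay(workflow))  # ['open login page', 'submit credentials', 'open dashboard']
```

Keeping the returned step names alongside each report lets you confirm two scans really ran the same workflow before comparing their issue counts.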

Were scan options like WCAG version, Best Practices, Advanced Rules, and Needs Review configured the same way across scans?

Root cause and explanation

Changing scan settings for WCAG version, Advanced Rules, Best Practices, Needs Review, etc. can impact issue detection. If multiple scans run with different settings, a difference in issue count is expected.

Which products show inconsistencies in issue count between different scan options

All products in BrowserStack Accessibility Testing can show inconsistencies in issue count if you run scans with different settings.

These products are:

  • Workflow Analyzer
  • Website Scanner
  • Automated tests

Remediation

Retry the scan to check if the issue persists.

If the issue persists, follow these steps to avoid changes to issue count:

  • Compare scan settings/configuration between runs to ensure consistency.

  • For Website Scanner, also check local testing and authentication configurations.
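If you keep scan settings as structured data, a small diff makes mismatched configurations obvious. The setting names below (`wcag_version`, `best_practices`, `needs_review`) are assumed for illustration, not an official configuration schema.

```python
# Diff two scan configurations to confirm both runs used identical settings.
# The keys shown are illustrative names, not an official config schema.

def config_diff(a: dict, b: dict) -> dict:
    """Return {key: (value_in_a, value_in_b)} for every setting that differs."""
    return {
        k: (a.get(k), b.get(k))
        for k in set(a) | set(b)
        if a.get(k) != b.get(k)
    }

scan_1 = {"wcag_version": "2.1", "best_practices": True, "needs_review": False}
scan_2 = {"wcag_version": "2.2", "best_practices": True, "needs_review": False}

print(config_diff(scan_1, scan_2))  # {'wcag_version': ('2.1', '2.2')}
```

An empty diff means the settings match and any remaining difference in issue count has another cause.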


Hidden issues

Did you hide or unhide any issue recently?

Root cause and explanation

When you hide an issue, BrowserStack Accessibility Testing excludes it from the issue count. If you later unhide the issue, it is counted again. So, if you hide or unhide issues between two scans, the issue count differs.

Which products show changes to issue count due to hidden issues

All products in BrowserStack Accessibility Testing can show changes to issue count if you hid or unhid issues between two scans.

These products are:

  • Workflow Analyzer
  • Website Scanner
  • Automated tests

Remediation

Retry the scan to check if the issue persists.

If the issue persists, follow these steps to avoid changes to issue count:

  • Check if you have hidden issues that are legitimate accessibility issues. If so, consider unhiding them.
  • Check if you have unhidden an issue that was hidden earlier. Double-check whether such issues need to be hidden.

Page failure

Did any page fail to load during the scan?

Root cause and explanation

If any page failed to load during a scan due to 404 errors, timeouts, or similar issues, the issues detected can differ. Pages that fail to load are not scanned, which can reduce the number of issues reported.

Which products show changes to issue count due to page failure

The following products in BrowserStack Accessibility Testing can show changes to issue count if there were page failures:

  • Website Scanner
  • Automated tests

Remediation

Retry the scan to check if the issue persists.

If the issue persists, follow these steps to avoid changes to issue count:

  • Check for flaky pages on your website that frequently time out or return 404 errors.
  • Consider improving load times of your webpages by following best practices.
  • Fix 404 errors in your website by adding redirects.
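A pre-flight check can flag pages likely to be skipped by a scan before you run it. The sketch below injects the `fetch` callable so a stub can stand in for real HTTP calls; in practice you might pass something like `lambda u: urllib.request.urlopen(u, timeout=10).status`. The helper name and shape are assumptions, not part of any scanner API.

```python
# Flag URLs that return non-200 statuses or raise (e.g. timeouts) so you can
# fix them before scanning. `fetch` is injected; the stub below replaces
# real HTTP requests for illustration.

def find_flaky_pages(urls, fetch):
    """Return (url, status-or-'error') pairs for pages that fail to load."""
    flaky = []
    for url in urls:
        try:
            status = fetch(url)
        except Exception:          # timeouts, DNS failures, etc.
            flaky.append((url, "error"))
            continue
        if status != 200:
            flaky.append((url, status))
    return flaky

# Stubbed responses for illustration; swap in a real HTTP client.
responses = {"/": 200, "/old-page": 404}
fetch = lambda url: responses[url]

print(find_flaky_pages(["/", "/old-page"], fetch))  # [('/old-page', 404)]
```

Running this over your sitemap between scans helps separate genuine issue-count changes from pages that simply failed to load.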

Is there network flakiness on your side, or a VPN configured at your end?

Root cause and explanation

Network drops can cause website load failure and lead to differences in issues reported.

If your VPN does not allow external connections, or does not allow WebSocket connections, differences in issue counts might occur.

Which products show changes to issue count due to network failure

All products in BrowserStack Accessibility Testing may show changes to issue count due to network failure or the impact of a VPN.

These products are:

  • Workflow Analyzer
  • Website Scanner
  • Automated tests

Remediation

Retry the scan to check if the issue persists.

If the issue persists, follow these steps to avoid changes to issue count:

  • Perform accessibility scans on a stable network.
  • Consider reconfiguring your VPN when running accessibility scans.

Rule engine update

Did the Spectra Rule Engine change after your last run?

Root cause and explanation

The Spectra™ Rule Engine, which works under the hood of BrowserStack Accessibility Testing, is updated regularly to improve issue detection. There could be differences in the number of issues or type of issues detected if you compare scans run on different versions of the Spectra™ Rule Engine.

Which products show inconsistencies in issue count due to rule engine update

All products in BrowserStack Accessibility Testing can show inconsistencies in issue count if there were updates to the Spectra™ Rule Engine.

These products are:

  • Workflow Analyzer
  • Website Scanner
  • Automated tests

Remediation

Retry the scan to check if the issue persists.

If the issue persists, follow these steps to avoid changes to issue count due to an update to the Spectra™ Rule Engine:

  • Check the All Issues tab to see if any of the changes in issue counts are due to the issues marked as new rules.
  • Check the scanner version used for each scan.
  • Review the rule update release log.


Dynamic content

Is there dynamic content in your website?

Root cause and explanation

Dynamic content like ads, popups, and carousels can cause a website to behave differently at different times. Even cookie consent banners and network speed might cause changes in a website. Such differences during different scan runs can cause changes to issue count reported by BrowserStack Accessibility Testing.

Which products show changes to issue count due to dynamic content

All products in BrowserStack Accessibility Testing can show changes to issue count if dynamic content exists in your website.

These products are:

  • Workflow Analyzer
  • Website Scanner
  • Automated tests

Remediation

Retry the scan to check if the issue persists.

If the issue persists, follow these steps to avoid changes to issue count:

  • If you have a cookie consent banner on your website and you use Website Scanner, add Cookie Configuration settings when you create a new scan to avoid differences caused by cookie state.
    Screenshot of a button to add Cookie configuration
  • Manually trigger popups, modals, and dynamic sections before rescanning if needed.
  • If possible, lock dynamic content to a static state before scanning.
  • If the above steps don’t work, consider ignoring differences due to dynamic content.
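One way to lock dynamic content to a static state is to inject a CSS override that pauses animations and transitions before scanning. The sketch below builds the injector JavaScript as a string; running it through a real driver (for example, Selenium's `driver.execute_script(js)`) is noted in a comment, and the `build_injector_js` helper is a hypothetical name for illustration.

```python
# Freeze animated content before a scan by injecting a <style> tag that
# pauses CSS animations and disables transitions. build_injector_js is an
# illustrative helper, not part of any scanner or Selenium API.

FREEZE_CSS = """
*, *::before, *::after {
  animation-play-state: paused !important;
  transition: none !important;
}
"""

def build_injector_js(css: str) -> str:
    """Return JS that appends a <style> tag with the given CSS to the page."""
    escaped = css.replace("`", "\\`")
    return (
        "const s = document.createElement('style');"
        f"s.textContent = `{escaped}`;"
        "document.head.appendChild(s);"
    )

js = build_injector_js(FREEZE_CSS)
# With Selenium you would run: driver.execute_script(js)
print("appendChild" in js)  # True
```

Freezing carousels and transitions this way makes consecutive scans see the same DOM, so issue counts become comparable.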

Is there personalized content on your website, such as location-based content or A/B test variations?

Root cause and explanation

Geolocation-based content, A/B test variations, or session-based personalization can cause different pages or elements to load during scans. This can cause an inconsistency in the number of issues reported.

Which products show changes to issue count due to personalized content

All products in BrowserStack Accessibility Testing can show changes to issue count if personalized content exists in your website.

These products are:

  • Workflow Analyzer
  • Website Scanner
  • Automated tests

Remediation

Retry the scan to check if the issue persists.

If the issue persists, follow these steps to avoid changes to issue count:

  • Check if personalization scripts or testing frameworks were active during the scans.
  • Consider standardizing scan conditions or disabling A/B testing when you test accessibility.

Other causes

Do you want help investigating a real inconsistency after completing all above steps?

Root cause and explanation

If the issue does not appear to be due to any of the reasons listed above, it might need deeper analysis to find the root cause. If you think the mismatch in issue count has a cause not listed above, the support team may need to manually review the scans.

Remediation

Retry the scan to check if the issue persists.

If the issue persists, submit both scans that show changes to issue count for manual review through the support escalation process.

