Clari achieves 95% test stability with BrowserStack AI-powered Test Reporting & Analytics

“BrowserStack Test Reporting & Analytics dashboard was a breakthrough with historical patterns, top errors, flaky tests, everything surfaced without manual aggregation.”
Shiva Srinivasan, Principal Test Engineer, Clari
Industry: Software Development
Location: Sunnyvale, CA

Introduction

Clari, the leading AI-powered revenue orchestration platform, transforms how enterprise organizations manage their revenue processes. With a company philosophy deeply rooted in leveraging data, Clari was committed to applying the same data-driven rigor to its internal quality assurance process. However, the quality engineering team lacked the necessary visibility to effectively measure and improve their testing efforts. To standardize testing practices, accelerate releases, and gain visibility into their release quality, Clari partnered with BrowserStack, focusing primarily on its AI-Powered Test Reporting and Analytics product.

The challenge

Fragmented results, low stability, and slow releases

Before adopting BrowserStack, Clari’s quality engineering team faced several critical challenges stemming from a fragmented and undocumented testing ecosystem:

    • Absence of measurable metrics: Automated tests were running, but there was no data on their efficiency, speed, or quality. As Shiva Srinivasan, Principal Test Engineer, states, “There was no way for us to really understand how good our quality process was.”
    • Quality silos: Test results were reported to systems accessible only to quality engineers, making testing a “black box” for the wider development and leadership teams. This lack of transparency prevented the establishment of a clear quality “North Star.”
    • High flakiness: The lack of awareness and standardized processes resulted in chronic instability, with test stability rates hovering around the 60% mark.
    • Slow regression cycles: Running the full regression suite required 3 to 4 hours, severely limiting release velocity and making on-demand hotfixes resource-intensive.
“We had automated tests running, but there was no way for us to know how efficient they were, how fast they were running, or if there were bottlenecks. Without measurable data, finding out what existed and finding out where we wanted to go was not possible.”
Shiva Srinivasan, Principal Test Engineer at Clari
The solution

All-in-one test reporting, debugging & analytics

Clari assessed multiple vendors but chose BrowserStack for the superior and highly flexible data unification offered by its AI-Powered Test Reporting and Analytics (TRA) product. TRA provided a single pane of glass for all testing efforts, regardless of where they were executed:

    • Comprehensive test ingestion: TRA enabled the team to flow results from end-to-end UI tests (executed on BrowserStack Automate and App Automate) and API tests (executed on internal servers) into one unified dashboard.
    • AI-powered Root Cause Analysis: The platform immediately provided value by automating troubleshooting through pattern recognition. The dashboard highlights the “most common failures,” allowing the team to quickly identify and address root causes (e.g., “80% of tests fail because of X exception”).
    • Historical and traceability artifacts: TRA’s historical analysis allows engineers to pinpoint the culprit commit responsible for a regression by checking when the test last passed. The platform also provides critical debugging artifacts like recordings and network logs.
    • Strategic quality auditing: Clari leverages two key metrics from TRA to audit quality across internal application teams. The first, flakiness quotient, is tracked to ensure teams are consistently reducing unreliable tests, leading to automated suites that require less manual intervention. The second, unique test cases, monitors growth to ensure adequate test coverage is being maintained and scaled as new features are delivered.
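The flakiness quotient that Clari audits can be illustrated with a minimal sketch. TRA computes this natively; conceptually, a test is flaky when it both passes and fails against the same code revision, so its result flips without a code change. The `runs` structure and `flakiness_quotient` function below are illustrative assumptions, not BrowserStack's API:

```python
from collections import defaultdict

def flakiness_quotient(runs):
    """Fraction of tests whose result flips (both pass and fail)
    across runs of the same commit, i.e. failures not explained
    by a code change. `runs` is a list of (test, commit, status)."""
    outcomes = defaultdict(set)
    for test, commit, status in runs:
        outcomes[(test, commit)].add(status)
    tests = {test for test, _ in outcomes}
    flaky = {test for (test, _), seen in outcomes.items()
             if {"pass", "fail"} <= seen}
    return len(flaky) / len(tests) if tests else 0.0

runs = [
    ("test_login", "abc123", "pass"),
    ("test_login", "abc123", "fail"),     # flips on the same commit: flaky
    ("test_checkout", "abc123", "pass"),
    ("test_checkout", "abc123", "pass"),  # consistent: stable
]
print(flakiness_quotient(runs))  # 0.5
```

Tracking this number per team, as Clari does, turns "our tests feel unreliable" into a metric that can be driven down release over release.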

Clari is further extending its quality ecosystem by adopting Test Management to integrate manual testing effort and planning, with future plans to report Unit Test results directly into TRA. This provides a “full picture” of the quality journey from inception to production.

“No other vendor had anything even remotely similar. The unified dashboard was a breakthrough with historical patterns, top errors, flaky tests, everything surfaced without manual aggregation.”
Shiva Srinivasan, Principal Test Engineer at Clari
The impact

Improved automation stability and gated deployments

The implementation of BrowserStack transformed Clari’s QA process from a bottleneck to a business enabler:

    • Massive speed increase: Test execution time was slashed from 4 hours to 30-35 minutes, a nearly 90% reduction. This acceleration enabled Clari to run full regression suites for every patch or release, significantly increasing deployment confidence.
    • Elevated stability: Stability across the product suite soared from 60% to 95%, validating test results and making the suite a reliable gate for deployments.
    • Enforced gated deployments: Based on TRA’s real-time dashboards, Clari now enforces a mandatory quality gate, blocking deploys unless the most current builds pass. This was “impossible to enforce” before.
    • Faster, data-driven troubleshooting: The ability to see failure patterns and historical context reduced troubleshooting time by more than 50%, accelerating the overall time-to-market for new features.
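The mandatory quality gate described above follows a common CI pattern: the deploy stage asks the reporting system for the latest build's results and a non-zero exit code blocks the pipeline. This is a generic sketch, not Clari's implementation; the `REPORT_URL` endpoint and response fields are hypothetical stand-ins for whatever your reporting system exposes:

```python
import json
import sys
import urllib.request

# Hypothetical reporting endpoint; substitute your system's real API.
REPORT_URL = "https://reports.example.com/api/builds/latest"
MIN_PASS_RATE = 0.95  # mirrors the 95% stability bar in this story

def passes_gate(build, min_pass_rate=MIN_PASS_RATE):
    """True if the build's pass rate clears the quality bar.
    `build` is expected to carry `passed` and `total` test counts."""
    total = build["total"]
    rate = build["passed"] / total if total else 0.0
    return rate >= min_pass_rate

if __name__ == "__main__":
    # Run as the final step of the deploy job; exiting 1 blocks the deploy.
    with urllib.request.urlopen(REPORT_URL) as resp:
        build = json.load(resp)
    sys.exit(0 if passes_gate(build) else 1)
```

Because the gate reads from the same dashboard every team sees, a blocked deploy is never a surprise: the failing build and its top errors are already visible.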

The impact extends beyond metrics. For individual contributors, the visibility into network logs and pattern analysis means less time spent debugging. For managers, TRA provides a clear, measurable quality bar and a means to compare quality progress across similar teams. And for executives, it offers a unified dashboard view of the entire quality pipeline, ensuring high standards are maintained.

“Test Reporting & Analytics allows teams to be their own quality gatekeepers: they can see the data, they can see the metrics, and they can come up with their own plans for how to keep scaling the quality summit. It also allows all the execs to look at how different teams are doing, because there are ways you can compare very similarly placed teams to see how their quality journey is going.”

Shiva Srinivasan, Principal Test Engineer at Clari

What will your team do with BrowserStack?

Over 6M developers & 50K teams already test on BrowserStack. Join them.

View pricing