Needed to build and train a QA team and equip them with the proper tools to succeed.
When Ron Timoshenko joined ConsumerAffairs as a Front End Engineer five years ago, the Tulsa-based team was just 15 people. Today, they’re approaching 200.
Offering a full package of resources to help consumers and companies make better buying decisions, ConsumerAffairs is a far cry from where it began in 1998. The company has evolved from a simple review site into a full-blown marketplace, media property, and software-as-a-service platform.
That kind of dramatic growth brings new challenges. Shortly after hiring Timoshenko, ConsumerAffairs was in need of a product or project manager who could orient the team around the complete package they were now delivering.
Timoshenko was promoted to the company’s first product manager role. Once the product team was squared away, he returned to engineering, where he was tasked with organizing and streamlining the team.
“We needed to quickly establish a QA team. Our engineers were not really properly testing code,” says Timoshenko. “Engineers want to solve problems. They don’t really want to test 100 variations of that problem.”
With a limited initial budget, Timoshenko was forced to get by as cost-effectively as possible. Initially, he partnered with a company that allowed his team to write generic, loosely defined test cases for real human beings to run through. Essentially, the team was crowdsourcing their QA. And it was an approach that worked…for a while.
“Every time we would run these crowdsourced QA tasks, a large percentage of them would come back with false positives. While it served some of the needs we had at the time, it wasn’t a perfect solution,” says Timoshenko. “We wanted to run a lot of automated tests in parallel across different parts of the code base. We still couldn’t do that.”
Based on his vision of the team’s future, Timoshenko decided to develop and train an in-house QA team. To do so, he needed tools and applications that could support the team and its goal of automated parallel testing.
When other testing platforms proved to be inadequate, BrowserStack Automate checked off every requirement.
To develop his internal QA team, Timoshenko had to leverage his limited resources, along with the two people who had automation experience, in the most efficient way he could.
“I had one team member provide a framework that the rest of the QA team could learn from. From there we paired up to learn how to write automated test cases. Once people could immediately apply that knowledge, we began building out the framework and the test cases we needed,” he says.
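As a rough illustration of the kind of automated test case the team describes (not ConsumerAffairs’ actual code), a single Selenium-style case targeting BrowserStack Automate might look like the sketch below. The credentials are placeholder environment variables, and the page title check is assumed:

```python
# A minimal sketch of one automated test case against BrowserStack Automate.
# Credentials and the asserted title are placeholders, not real values.
import os

def build_capabilities(browser, browser_version, os_name, os_version):
    """Assemble a W3C-style capability dict for one browser/OS combination."""
    return {
        "browserName": browser,
        "browserVersion": browser_version,
        "bstack:options": {
            "os": os_name,
            "osVersion": os_version,
            "sessionName": f"{browser} {browser_version} on {os_name} {os_version}",
        },
    }

def run_test(caps):
    # Requires `pip install selenium` and real BrowserStack credentials;
    # sketched here only, never executed in this document.
    from selenium import webdriver
    user = os.environ["BROWSERSTACK_USERNAME"]   # placeholder env var
    key = os.environ["BROWSERSTACK_ACCESS_KEY"]  # placeholder env var
    options = webdriver.ChromeOptions()
    for name, value in caps.items():
        options.set_capability(name, value)
    driver = webdriver.Remote(
        command_executor=f"https://{user}:{key}@hub-cloud.browserstack.com/wd/hub",
        options=options,
    )
    try:
        driver.get("https://www.consumeraffairs.com/")
        assert "ConsumerAffairs" in driver.title
    finally:
        driver.quit()

if __name__ == "__main__":
    run_test(build_capabilities("Chrome", "latest", "Windows", "11"))
```

Pairing an experienced engineer with newcomers on cases like this is what let the rest of the team pick up the framework quickly.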
Once he’d oriented the team, and having experienced the difficulties of a limited QA budget, Timoshenko set his sights on future scalability. And an increased testing budget.
“Basically, we did the research. That was our first exposure to BrowserStack,” Timoshenko says. “Initially, we were relying on BrowserStack as a manual cross-browser testing tool, with limited usage. We weren’t utilizing it to its full potential, but we wanted to. Unfortunately, I couldn’t get the finances to pay for it, so we went with another company. It was terrible. The level of support, the maturity in the product, nothing was as good as BrowserStack.”
Months later, after struggling to develop the testing cadence he’d been dreaming of, Timoshenko set a meeting with his CTO.
“I told him, ‘We’ve wasted money with this exercise. I want to move everything over to BrowserStack. They meet all of our requirements.’”
His CTO agreed and Timoshenko’s team moved all testing operations to BrowserStack’s automation platform.
“The old process would take a couple of hours on average, going back and determining what was real versus fake, being able to pinpoint where the actual problem was, debugging, and then patching and releasing. Now that we have BrowserStack to automate tests, we have a lot more coverage. It’s night and day, really,” says Timoshenko.
It’s not all about the automated tests for Timoshenko either. He also likes the reports and seeing the instant results of his team’s tests.
“The UI BrowserStack has is just the best compared to any other product we’ve tried. I like the way information is presented on the dashboard and even through the terminal. It allows us to build a way better process,” says Timoshenko.
“We needed something that could run every test case in parallel, give us the results quickly, and enable us to work effectively on releases. We have that with BrowserStack.”
An empowered QA team, hundreds of nightly parallel tests, and more confident code releases.
Although his QA team has grown to four times its original size in just a couple of years, Timoshenko credits BrowserStack Automate for their ability to scale without much effort.
“We run hundreds of test cases overnight with Automate, so we’re constantly maxing out BrowserStack. Because of that, our QA team is really empowered to test code at any point in time,” Timoshenko says. “We find far more bugs prior to releasing to production, which helps us deliver higher quality code much more quickly than we did in the past.”
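The nightly fan-out Timoshenko describes can be sketched with Python’s standard `concurrent.futures` module. The test cases below are stand-ins, since the real ones drive remote browsers:

```python
# A hedged sketch of running many test cases in parallel, in the spirit of the
# nightly runs described above; the cases here are stand-ins, not browser tests.
from concurrent.futures import ThreadPoolExecutor, as_completed

def run_suite(test_cases, max_parallel=10):
    """Run each named test case concurrently and collect pass/fail results."""
    results = {}
    with ThreadPoolExecutor(max_workers=max_parallel) as pool:
        futures = {pool.submit(case): name for name, case in test_cases.items()}
        for future in as_completed(futures):
            name = futures[future]
            try:
                future.result()          # re-raises any AssertionError from the case
                results[name] = "passed"
            except AssertionError:
                results[name] = "failed"
    return results

def homepage_loads():
    assert 1 + 1 == 2                    # stand-in for a real page assertion

def broken_widget():
    assert "bug" == "fixed"              # deliberately failing stand-in

print(run_suite({"homepage_loads": homepage_loads, "broken_widget": broken_widget}))
```

Capping `max_parallel` mirrors the way a plan’s parallel-session limit bounds how many browsers can run at once.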
Timoshenko estimates that his team is now catching three times more bugs with BrowserStack than they did with any other automated testing platform. And they’re testing more browsers too.
“Before Automate, the regularity of our browser testing was certainly inconsistent,” Timoshenko notes. “Now, we look at fifteen browser combinations at a time. BrowserStack empowers us to do that.”
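A browser/OS matrix like the one Timoshenko describes can be enumerated as a filtered cross product. The specific browsers and platforms below are illustrative assumptions, not ConsumerAffairs’ actual fifteen combinations:

```python
# Illustrative sketch of enumerating a browser/OS test matrix; these browsers
# and platforms are assumptions, not ConsumerAffairs' real configuration.
from itertools import product

browsers = ["Chrome", "Firefox", "Edge", "Safari"]
platforms = ["Windows 11", "macOS Sonoma"]

# Skip combinations that do not exist, such as Safari on Windows.
matrix = [
    {"browser": b, "platform": p}
    for b, p in product(browsers, platforms)
    if not (b == "Safari" and p.startswith("Windows"))
]

for combo in matrix:
    print(combo["browser"], "on", combo["platform"])
```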
Timoshenko sees his team continuing to improve their tests with BrowserStack Automate. With greater efficiency, and selective and regression testing on the horizon, ConsumerAffairs is looking forward to their future.