Outstanding QA doesn’t happen by accident — it takes a special combination of experience, creativity, leadership and technical prowess to create a user experience that customers love.
In our recent webinar, How Lyft Drives QA at Scale, Testlio CEO Kristel Kruustük talked to Heather about what it takes to grow QA alongside product and engineering teams. These questions were submitted by webinar participants.
For more info on how Testlio works with Lyft, check out our case study. Take it away, Heather!

What percentage of Lyft’s regression is automated?

We’re still building up our automation frameworks, but we’ve automated about 10% of our regression testing so far.

How does Lyft measure the impact of QA?

Kristel and I have gone back and forth a lot on this topic. At Lyft we look at the number of bugs per release and how many test cases we’re running per release cycle. Rather than focusing solely on QA metrics, we also look at app quality metrics more broadly. Did we have a hotfix? Were there any late merges? What did we do that was outside the usual process?

We regularly hold meetings, called release retros, where we talk about what went well and what went wrong with each release. They help us identify the cause of any issues that arise.

How does your QA team incorporate user feedback?

We look at incoming customer support tickets all the time, and we use tools like Instabug for alpha/beta feedback. When a feature request goes into JIRA, we coordinate the work with the relevant project managers.

We definitely rely on our users for feature innovation — a lot of what we’re planning is based on their requests.

Is there an engineering team dedicated to bug fixes?

Each team (it varies by project) has what we call an “on-call engineer” who handles any requests or hotfixes during the week. We don’t assign any sprint planning work to that engineer.

What is the biggest challenge you face in scaling Lyft’s QA?

Definitely keeping up with all of the projects and changes that product engineers want to make in terms of new feature testing, like adding a new promotion or partnership. Keeping up with unplanned work is also a challenge. We don’t usually deal with one fire drill at a time; it’s more like five fire drills at once. You just take everything one project at a time.

Who writes your automation scripts: engineers or QAs?

It used to be all of QA writing those scripts. Once we retired the old framework and moved onto our new native framework, we arranged it so QA and developers work together on writing tests, which means developers can write tests in their own language.
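To make that collaboration concrete, here is a minimal sketch of the kind of shared regression test either a QA engineer or a developer could own. This is purely illustrative and not Lyft’s actual code: the `estimate_fare` function and its numbers are invented for the example, and it is written in Python for brevity (in a native mobile framework the same test would be written in the platform’s own language, such as Swift or Kotlin).

```python
# Hypothetical example: none of these names come from Lyft's codebase.

def estimate_fare(base: float, per_mile: float, miles: float) -> float:
    """Toy fare model, invented purely for illustration."""
    return round(base + per_mile * miles, 2)


def test_estimate_fare_three_mile_trip():
    # Regression check: a 3-mile trip at $1.50/mile with a $2.00 base
    # should always come out to $6.50. If a code change breaks this,
    # the test fails and the release retro has a concrete data point.
    assert estimate_fare(2.00, 1.50, 3) == 6.50
```

The value of co-ownership is that developers can add a test like this in the same pull request as the feature change, while QA extends the same suite with edge cases.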