Improving design quality in a fast growth company

Marco Suarez
Published in UX Collective
6 min read · Feb 22, 2019


In January of 2019 I started a consultancy to help companies find success in design systems and improve the quality of their customer experience. Before that, I led design systems at InVision. Below are learnings from a project to improve overall design quality in InVision’s web product.

InVision has a robust design system and high standards for the quality of its customer experience. So when visual bug tickets began accumulating in our backlogs, we knew we needed to investigate and find out why. How do companies in periods of fast growth reduce these “paper cuts” and improve design quality without piling on oversight or policies to enforce?

Start with internal interviews

Ill-informed decisions run the risk of adding process and oversight to treat a symptom while never discovering the actual problem. Though I had my own hypotheses, I knew I needed to create an informed understanding of the situation. So I interviewed designers, engineers, product managers, design directors, and engineering managers to gain an understanding of their process and how they worked with their teammates. I tailored the questions to the person I was interviewing to make sure they were relevant to them and their situation.

  • How do you QA your own work?
  • What do you deliver to engineering to ensure design requirements are understood and details aren’t missed?
  • How does your team prioritize visual bugs?
  • Are you able to review work before it’s completed?
  • Is your team aligned on the definition of “done”?
  • What can design be doing better to ensure what’s designed is what’s getting built?

I let the conversation flow naturally. There’s a good chance there’s information you don’t know you need until someone tells you.

After synthesizing the results, a pattern began to emerge. We wanted to improve our output, but our output wasn’t the problem.

Align on done

It was clear that teams did not share the same definition of “done.” An abstract definition like “functional” wasn’t sufficient because it leaves too much room for interpretation and doesn’t consider the customer experience. To know what done means, our teams first needed to agree on what they were building.

Jon Dobrowolski, Senior Director of Product at InVision, became my ally. His insight was invaluable for crafting a solution that worked holistically, and not just from a design perspective. It was also important that when it came time to implement a solution, it didn’t come from just design, but from an aligned EDP (engineering, design, product) leadership.

Write acceptance criteria

Acceptance criteria (AC) are “the conditions that a software product must satisfy to be accepted by a user.” If you’re working in an Agile environment, they’re that ticket’s definition of done. Before the first commit, before the first artboard is saved, the acceptance criteria are stubbed out. Well-written AC ensure everyone involved understands the expectations of that body of work. The goal of all parties is to satisfy the requirements of the AC. Whether a ticket is done isn’t determined by any one person’s subjective judgment, but by whether the requirements in the AC are satisfied. If they’re not, the ticket isn’t finished.

Create user stories

Jon introduced me to the idea of using user stories in our acceptance criteria. A user story is a simple statement that follows the pattern:

As a [role], I want [goal/desire] because [benefit].

And we included scenarios to strengthen the story:

Given [initial context], when [event occurs], then [ensure some outcomes].

Writing user stories with scenarios provided a succinct and standardized way of defining the acceptance criteria for that body of work. Now that everyone understood what they were building, they had a clear vision of what done looked like.
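
A hypothetical example, filled in purely to illustrate the format (the feature described here is invented, not pulled from InVision’s actual backlog):

As a project owner, I want to archive inactive prototypes because I want my dashboard to show only active work.

Given a project with at least one inactive prototype, when the owner selects “Archive,” then the prototype is removed from the dashboard and listed in the archive.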

Product review

Product review is the step in the development process where we assess whether the completed work meets the AC and is, in fact, done. This step is different from quality assurance (QA) or peer review; those meet different needs. Peer review looks at code quality, and QA assesses durability. Product review checks whether the thing we set out to build is actually what we built. If it’s not, the work is sent back to in-progress. Only once the AC is satisfied does the work proceed to the next step.

It’s imperative that product reviews are not conducted by the individual who did the work. The more visibility the work has within the team, the stronger the work will become. We often aren’t aware of our own blind spots or biases, so having multiple perspectives assessing our work will uncover things that may otherwise be overlooked.

It’s also important to uphold the standard of quality during the review process. In a fast growth environment, work is rarely revisited post launch. And though being strict during the product review adds more time to the development process, it removes the need to create additional tickets for fixes. It’s faster to get it done in the moment than to revisit it later.

In Jon’s words, if something makes it through this rigorous process, it can be assumed it has the full blessing of the designers, engineers, and product managers responsible for the work.

Focus on the work

What sets this process apart is that it puts the emphasis on the work itself, not on an arbitrary metric like polish time or number of bugs squashed. If a team is asked to spend x amount of time on polish, the team will focus on meeting that time allotment, not on what they’re achieving with it.

If something appears in production that didn’t meet our standard of quality, we now have a paper trail to follow to see what went awry. Was the AC poorly written? Were the user stories incorrect? Was the ticket accepted when it shouldn’t have been? Was someone left out of the review process? These are all simple things to identify and fix for the future.

The other data point this method creates is the number of times a ticket goes from product review back to in-progress. A low rejection count is a key indicator that the EDP team is aligned and working in sync. If tickets are often rejected, we can assume the team is not aligned and quickly address it.
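
As a minimal sketch of how that count could be pulled from ticket history, assuming a hypothetical export of status transitions (the ticket IDs, field names, and statuses below are invented for illustration):

    from collections import Counter

    # Hypothetical export of status-change events from an issue tracker.
    # Field names and statuses are illustrative only.
    transitions = [
        {"ticket": "PROJ-101", "from": "Product Review", "to": "In Progress"},
        {"ticket": "PROJ-101", "from": "In Progress", "to": "Product Review"},
        {"ticket": "PROJ-102", "from": "Product Review", "to": "Done"},
    ]

    # Count how many times each ticket was sent back from product review.
    rejections = Counter(
        t["ticket"]
        for t in transitions
        if t["from"] == "Product Review" and t["to"] == "In Progress"
    )

    tickets = {t["ticket"] for t in transitions}
    print(f"Rejections per ticket: {dict(rejections)}")
    print(f"Tickets rejected at least once: {len(rejections)} of {len(tickets)}")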

At InVision, for the teams that have adopted this process, visual bug tickets are nearly non-existent and teams operate more efficiently. Being disciplined about writing good AC gives them a shared understanding to coalesce around, brings clarity to the why behind the what, reduces rework, and improves the quality of their output.

In my experience, missing the standard of quality is nearly expected in a rapid growth company. Things change quickly and you’re often racing to herd an ever-growing team in the same direction. Utilizing a process like the above relieves some of that pressure. It scales well, provides data for real-time assessment, and spreads out responsibility and ownership. But we never would’ve arrived at this solution if we didn’t first take the time to investigate and discover that the problem didn’t exist at the end of the process, but at the beginning.


Design Systems consultant. Previously InVisionApp, Etsy, Mailchimp. Owner of Methodical Coffee.