UX Auditing: A Guide to Empowering Design Decisions

Iris Sprague · Published in UX Collective · Aug 26, 2018 · 7 min read

Last summer, I began interviewing with a startup that was looking for its first Product hire. What struck me during the interviews was that they hadn’t yet hired a full-time designer, yet their new platform looked clean, with plenty of white space and tidy navigation.

Every designer’s dream, right?

After I got the job and started using the product, I realized many issues lay below the surface: fundamental UX problems, too many unexplained features, and no apparent design system.

While the executives claimed this was their MVP, they were competing in a saturated market. When there are at least five other companies competing against you, you need to stand out.

It’s not enough to build a product that has basic functionality. You need a product that has a great user experience and keeps users coming back. You cannot survive, B2B or B2C, without engagement from your end-users.

That’s where the MAP comes in. MAP stands for Minimal Awesome Product. Read this great article to learn more about it.

It was clear that my current company needed to move from an MVP to a MAP.

They needed a UX audit.

The UX Audit

A UX audit is a great way to see what’s working and what isn’t.

  • Where are users getting stuck?
  • Where are they dropping off?
  • Why are they dropping off?
  • What do they not understand?

An audit can answer all of these questions and more. But how does one conduct a UX audit? That’s what I’ll be discussing in this post.

Find the right people

It’s crucial to get in front of your users. Your customers, directly and indirectly, will reveal where things are going wrong and why.

With that said, I was in a bit of a bind. The new platform only had a handful of users and no user data. The legacy platform had a completely different user experience, making its data irrelevant. My first step was finding out where to begin, so I decided to go to the source.

With any product, if the people that are building and selling it don’t understand it, there’s no hope that a user will. From my experience working in B2B SaaS startups, talking to the sales and customer service teams is imperative to understand a buyer’s perspective. For more information on cross-department collaboration, read this article I wrote on the topic.

For my audit, these were the key personas I wanted to interview:

Internal Employees

  • Customer Service Manager (CSM):
    To understand how the product is meeting buyers’ expectations.
  • Sales Rep (Account Executive):
    To understand the stories and solutions that get buyers initially interested in the product.
  • Engineers:
    To understand how the people who are building the product feel about it.

Customers

  • Admins (Buyers / Users):
    To understand the pain points or jobs-to-be-done from an administrative/company perspective.
  • Employees (End Users):
    To understand the end user's pain points.

Conducting user interviews

The setup for all of these interviews should be the same. I used Steve Krug’s Usability Test Script. This is important for many reasons. You want users to know this is a safe space. It’s okay for them to be confused and not know where a button is. You don’t want your users to feel stupid.

Getting people talking wasn’t an issue. I used an interview method called contextual inquiry combined with a think-aloud protocol.

All I had to say was “What do you think of this experience?” and my interviewees would take care of the rest. They would tell me everything on their minds. I was their product therapist.

Be deliberate about which questions you ask. When you ask users about a certain feature, be careful not to lead their answers. Questions such as “Did you think that feature/flow was intuitive?” will steer how users respond. Instead, ask “How was that experience for you?” The more open-ended your questions, the richer the data you’ll get.

Other aspects to consider

Sample size: When deciding how many users to interview, more data points make it easier to spot trends and patterns and to confirm that a problem is widespread rather than a one-off. That said, there are diminishing returns: aim for no more than five users per persona at a time.

Roles: In our interviews, everyone had a role based on IBM’s research principles. I acted as the “guide,” or facilitator, and always tried to bring in one or two people to be “explorers,” or observers. Developers make ideal observers: it’s important for them to witness a user’s pain points firsthand. It builds empathy and helps the engineers understand the impact of their work.

Interview Medium: We used Zoom to conduct and record the interviews.

Analyze the data

After 13 interviews, I was sitting on a mountain of data. It was time to sift through it all and bring my findings back to the executives. When analyzing an overwhelming amount of data, it’s crucial to use a system to organize your insights. The system I used was affinity mapping.

Affinity mapping means organizing related facts into distinct clusters. I broke my data into groups, each of which corresponded to a major feature in the application. You can break down your audit by verticals if that’s how your company operates, or divide it into categories such as UX or marketing copy.

With my data sorted into categories, I began marking down how many users or companies had mentioned each insight. The more users you can map to a particular problem, the more evidence you have that it needs to be fixed.
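If it helps to see that tallying step concretely, here is a minimal sketch in Python of how you might cluster insights by feature area and count how many distinct users raised each one. The feature areas, insights, and data structure are made up for illustration; they aren’t from my actual audit.

    from collections import defaultdict

    # Hypothetical interview notes: (user, feature_area, insight)
    notes = [
        ("user_1", "Onboarding", "Couldn't find where to invite teammates"),
        ("user_2", "Onboarding", "Couldn't find where to invite teammates"),
        ("user_3", "Reporting", "Export button label is confusing"),
        ("user_1", "Reporting", "Export button label is confusing"),
        ("user_4", "Navigation", "Settings buried under the profile menu"),
    ]

    # Cluster insights by feature area, tracking which users mentioned each one
    clusters = defaultdict(lambda: defaultdict(set))
    for user, area, insight in notes:
        clusters[area][insight].add(user)

    # Print each cluster, most-mentioned insights first -- the more users behind
    # an insight, the stronger the case for fixing it
    for area, insights in clusters.items():
        print(area)
        for insight, users in sorted(insights.items(), key=lambda kv: -len(kv[1])):
            print(f"  [{len(users)} users] {insight}")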

Using my affinity map, I created thorough reports detailing my findings. Since the executives were the intended audience, I made sure to use direct quotes to back up my findings. Using the users’ own words will have the most impact on those you’re trying to persuade.

Speaking of executives, they’re busy people. It’s important to include an executive summary at the end of a report.

A Google Doc of analysis from the audit. Due to sensitive information, I can only show you so much :P

As the name implies, keep the executive summary short. No more than a page. And always start with the positives. You don’t want to tear down everything they’ve accomplished so far.

Getting Executive Buy In

It took some work to get the C-suite on board, but executive backing is crucial to any successful UX audit. Even the best ideas will end up in the trash without sufficient support. Executives don’t need to be 100% committed, but they do need to be open to your suggestions.

That’s why execution is everything.

You have to sell the story of why a feature or design isn’t working. Emphasize how users are currently struggling, and explain how updating a feature aligns with the business strategy.

Executives understand the market, but you understand the product. Paint the picture of how design impacts the business. They may not realize the impact design has until you frame it in the context of business goals and objectives.

If all else fails, bring your bosses in to observe a user interview. I had one of the executives sit in as an observer to witness how users reacted to the product. Hearing critical feedback straight from customers is eye-opening. If your executives still aren’t convinced after all of the above, you may have bigger issues than a sub-par product.

With the executives on board and my plan in place, I was able to conclude the audit as a success.

Final Note

When conducting your UX audit, take the politics of your company into consideration. Data is your hammer and communication is your nail; combining the two is how you create impactful change.

Thanks for reading! I hope you enjoyed this guide to UX Audits. If this article resonated with you, please leave some 👏 and let me know how you conduct your own user interviews. I’d love to hear about it!
