Notable notes in user research

Make your user research easier through efficient note-taking.

Anna Marie Rosická
UX Collective

--

A single empty post-it note with two different pens above it, a computer keyboard, and a mouse.
Photo by Kelly Sikkema on Unsplash

Let me start with a small confession. If there's one thing I hate about my work as a user researcher, it's note-taking. By note-taking, I don't mean the creation of meeting minutes or some fancy technique for building an external brain. I mean the processing of user interviews into notes that you later analyze as part of your research.

As you may know, there are more exciting ways to spend your afternoon. The insights you get in the end are eye-opening, but note-taking itself tends to be lengthy, boring, or even slightly painful. Sometimes, you spend hours re-watching a video recording minute by minute, trying to decipher what your participant was trying to do. Other times, it can feel like going back to school, trying to take notes on a mathematics lecture given by a teacher who drank too much coffee.

So I confess: Note-taking is my kryptonite. I suspect I'm not the only one to feel this way. Naturally, when I, as a researcher, decided to overcome this challenge, part of the solution was a bit of research. Let me walk you through what I found. I’ve gone through some of the painful experiences for you, so you can just sit back and learn from my mistakes. And maybe… take notes.

Are notes evil?

Many people see note-taking as a necessary evil they just want to fly through. It took me a while to realize that note-taking is actually a core user research skill — just like interview facilitation, data synthesis, or presentation skills.

The thing is: You don’t usually do research with raw video recordings and then — ⭐️MAGIC!⭐️ — all of a sudden pull a list of user insights out of your sleeve. The path from user interviews or observations to research results is usually a bit longer. Note-taking is a part of this path. It is a crucial part of the research you do.

“But what about automated interview transcripts?” I hear you say. “I can just feed the recording into my transcription tool, edit out all the weird errors and work with what I get, right?” — Right. You can. Actually, an interview transcript is just a detailed form of notes. It captures what was said, not what the participant was doing. A whole hour of uhs and you-knows and lots of detail. Sometimes, this is exactly what you need to reach your goals. But definitely not always.

The purpose of notes is to capture observations and insights in a format that helps you reach your research goals.

Think of your desired destination first. Then pick your mode of transport.

Should you get thorough, verbatim transcripts or short bullet points? Which is better, chronological notes or notes arranged by topic? And should you focus on actions, emotions, or the participant's choice of words? All these decisions should be based on where you're trying to go. And it's a good idea to think about your note-taking strategy as early as when you're planning your research.

A hand holding a pen and writing a checklist in a notebook.
Plan your attack. Photo by Glenn Carstens-Peters on Unsplash

Just enough notes

Unless you live in an ivory researcher tower, your time, energy, and resources are limited. You only have a certain amount of all three to spend on note-taking of any form — but at the same time, you often cannot really do without notes. That means you want to make sure your research notes are as good as you need them to be right from the start. What does this kind of good look like?

Good research notes are traceable. You need to ensure that you — or any other authorized person — can always trace your research results back to the raw data. This ensures that your understanding of the users is unbiased and truly anchored in data. After all, human memory is fallible.

Good research notes are also future-proof. Usually, your research focuses on a specific area of the product or a specific topic. During your sessions with participants, however, you come across things that don't make it into your research report but might be useful later on. Your notes might come in handy for future analysis.

Last but not least, good research notes enable you to share what happened in a research session with other researchers or colleagues. They can then quickly scan through your notes without having to re-watch the whole session recording and still find the information that they need.

So that’s the theory. But things can get notably out of hand.

Let's move on to my curated fail compilation.

Fail 1: Relying on recordings

An unfortunate mistake would be to rely on recordings too much. The obvious reason you may think of is that recordings fail. You can call yourself lucky if you're even permitted to record your interaction with the user in the first place. But suppose your tools are reliable and your research participants are, for the most part, friendly. Is the risk still worth taking?

The less obvious problem with relying on recordings is that it can make you waste time. Recordings take just as long to listen to as the original session — sometimes even longer. And even with 5 to 10 interviews for a small research project, this adds up to quite a lot of time.

Re-watching the recordings and taking notes only afterwards used to be a common approach in our UX team. When I was recently tasked with usability testing of our whole product as part of a redesign, I didn't have much time on my hands and I wanted to save some.

So I decided to get at least some notes done on the go.

While doing it alone is certainly an option, and many researchers have no other choice, I knew I wouldn't be able to facilitate the usability testing and take detailed notes at the same time. I decided to get help. Luckily, there's an established practice at my company of having a developer and a UX designer present at each call. I asked my UX colleagues to try taking notes on the go.

An alternative approach would be to involve all observers in some sort of group note-taking. That way, you can engage your stakeholders more and leverage all the eyes present in your user sessions. The Nielsen Norman Group has a great tutorial on the topic.

Let me just highlight one thing: While note-taking on the go can be a great way to speed up your research, that does not mean it is wrong to take notes “in retrospect”, based on recordings. Take my advice as an optional life hack rather than a law. Remember: You've always got to find out what works best for you and your research goals!

Fail 2: Lack of training

So you've found a way to capture what users say and do. Now you've got the know-how – but what about your know-what? Whether you have someone take notes on the go, or whether it's recordings you call your best friend, it can be very hard to know what you should focus on. What kind of information do you really need?

As mentioned, in my usability testing sessions, I tried to have someone help me with the note-taking. The results were… interesting.

First, my note-takers suffered quite a bit. (This may be the place for me to say: Sorry, folks!) Note-taking is cognitively demanding even if you have lots of experience with it. That's especially the case if you are trying to take notes during a usability testing session. Not all participants think aloud as much as we would like them to.

Second, I found that there was quite a lot of variability in the format of the notes. Some people would go into more detail, some into less, and others would focus on the wrong details altogether.

I realized I had made two main mistakes that caused this trouble. The first was that I did not ensure the note-takers were sufficiently familiar with the research questions and hypotheses we were looking at. For instance, a designer who had been involved in preparing the usability testing had a much easier job in this respect.

The second mistake was that I did not ensure that everyone was familiar with the note-taking format I came up with.

As a wise guy once said…

Never spend 6 minutes doing something by hand when you can spend 6 hours failing to automate it.

So, in my attempt to save time, I created a monstrous table for my usability tests, intended to speed up the summarizing of results. There was a pretty checklist to fill in, some space for timestamps so that the reader could easily jump to the respective parts of the recording, and so on. I was heavily inspired by the data-logging approach, described here. The result — in our case — was unhelpful overkill, not very usable for my note-takers.

A screenshot of a spreadsheet with the heading “Hypotheses — The user is able to…”. Below the heading, we can see the hypothesis “Create a new comment” and a piece of a very complicated Excel formula next to it.
The monstrous table at its best

Let me — once again — make a disclaimer. There's nothing wrong with asking for help, nor with monstrous tables, and certainly nothing wrong with data-logging. But if you want those things to make your note-taking more effective, you've got to train your note-takers.

A train coming out of autumn woods.
Train. Photo by Balazs Busznyak on Unsplash

Training did help in my case. When some note-takers joined the usability testing for the second or third time, I managed to get better results. Training definitely works – it just takes some time. That's a trade-off you've got to take into consideration.

Fail 3: Biased observations

A very, very common fail in note-taking is noting down biased observations. When people observe the world around them or listen to others, they are prone to various cognitive biases. We tend to take mental shortcuts that simplify the things we perceive. This tendency makes our brains more efficient — but it also distorts reality.

If you do user testing or user research, you probably have some previous understanding of the product or topic that you are asking about. The way you understand things impacts the way you perceive reality. Whether you like it or not, you will tend to mix up neutral observations or quotes with your subjective interpretations.

That's a huge problem. If you want your notes to be truly reliable, you have to make sure you discern between the three: neutral observations, direct quotes, and your own interpretations. After all, it is your users you are interested in, not just your idea of them.

Let's take an example from the usability testing that I mentioned: A user was given the task of setting up something called “Workflow steps” in our application. They found a button that filters by Workflow steps (that is, it does not set them up). What ended up in our notes is this:

A screenshot of a spreadsheet that states “Time in the recording” and also the note “she noticed the label “Workflows” in filters and tried to filter the items in Draft”
“she noticed the label “Workflows” in filters and tried to filter the items in Draft”

The note says that the user tried to filter something using the button. However, that's not an observation; that's an interpretation! The user did not comment on what they were doing; they just clicked a button somewhere. But because the note-taker knew what the button does, they wrote down the function of the button, as if it were the intention of the user.

A better way to do this would be to only describe the observable things: “The user noticed the label “Workflows” and then clicked on it, which filtered the items.”

“You” written in large letters and enclosed by square brackets
How can you unbias your notes?

Two simple ways to unbias your notes

The secret trick to unbiased note-taking is to put yourself into brackets. You can do that both metaphorically and literally. This may sound quite abstract, but let me explain.

Let's start with the metaphorical brackets. A great technique for making your user research a bit less biased is common in the heuristic approach to qualitative research. It requires you to take a moment before talking to any research participants and write down your previous understanding of the topic you're researching — anything you already know or assume. Once you have your subjective understanding written down, it becomes much easier to discern which ideas are your own interpretations and which reflect the understanding of your user — the thing you're trying to get to, after all.

This technique is especially useful for exploratory research of user needs and context. However, it would probably not serve you very well in regular usability testing. Luckily, there are more literal ways to put yourself into brackets.

If you observe or hear something during the interview and catch yourself interpreting it, you can write down both what happened and your interpretation. The trick is to put your interpretation into real brackets.

A screenshot of notes saying: “[Anna’s comment: maybe he did not hover his cursor there long enough for the tooltip to appear. If he did, I have no idea what kind of further info he might be interested in here, I believe we’d have to follow up on what he meant if we want to address this :-) ]”
In my notes, I put my comments and interpretations into square brackets.

While this technique may sound obvious, you'd be surprised at how easily people mix up their interpretations with what really happened!

FAIL: Relying on recordings, SOLUTION: Get notes on the go.
FAIL: Lack of training, SOLUTION: Train your note-takers.
FAIL: Biased observations, SOLUTION: Put yourself into brackets.
Common note-taking fails and their solutions

Closing notes

Note-taking is an important tool in your research toolkit. Make note-taking considerations a part of your research planning. It will enable you to choose a note-taking approach that truly helps you reach your research goals.

I hope this article helps, too.

Good luck with your own kryptonite!

Note (pun intended): This article was adapted from an online talk held as part of the Kentico Design Hour. For more events and articles like this, subscribe to the Kentico Design newsletter.

