How to read nonverbal cues during UX research

Body language plays a crucial role in understanding user behavior.

Matt Gramcko
UX Collective


I’ve already talked about how important it is for user researchers to control their own nonverbal behavior to create an ideal environment for interacting with users. By also paying attention to users’ gestures, facial expressions, and movements, the researcher can resolve inconsistencies between what participants say and what they show, and paint a more accurate picture of the user.

The challenge for UX researchers and designers is creating the most precise picture of the product’s users in the most efficient way possible. Conversation is a powerful and efficient way to learn about another person. Yet, taken at face value, it is often unreliable and too subjective. Put bluntly, people can’t be trusted to represent their own opinions or behaviors accurately.

The tried and true rule of usability is to “pay attention to what users do, not just what they say.”

Although the Nielsen Norman Group framed this rule around watching users perform tasks, I believe it should also apply to the implicit, nonverbal behaviors that frequently go unnoticed. Whether it’s a user interview or a usability test, empathy comes from a holistic account of a person’s behavior. By learning to pick up on clusters of nonverbal cues from participants, the researcher can get to the heart of an issue or insight faster and more accurately.

Some behaviors to look out for

Pay attention to stances. What stance do users take when they are unwilling to open up about themselves or provide more information during a session? Crossed arms or legs form a defensive barrier between the participant and others, and the palms may stay closed and facing down.

Solution: By recognizing defensive poses, the researcher can prompt the participant into a more open posture with open palms (this is easier when there is something for them to hold). Once their posture opens up, users are more likely to provide helpful feedback.

Averted contact. Face-to-face contact is essential in testing and interviews. Just as eye contact signals interest, a user who gazes away from the screen may feel deceived, ashamed, confused, or simply bored with what is in front of them. When hearing or telling lies, aversion shows up as covering the eyes, ears, or mouth with the hands.

Solution: Consider rephrasing a question to see whether you get the same answer. Seek clarification when body language tells you otherwise. Read back the user’s earlier response and follow up with “Can you tell me more about that?” or “What did you mean when you said…?” If the responses don’t match, there is a good chance their gestures were more honest than their words.

Smiling. A smile may be a sign of satisfaction, or it may reveal something more. In a tight-lipped smile, the lips are stretched across the face in a straight line, concealing the teeth; the user may be reluctant to share the whole truth. A drop-jaw smile, or a smile without crinkles around the eyes, may not be genuine either. These are common fake smiles people use to get others to like them.

Solution: You might want to reassure them that they should be honest or ask: “Is there anything else you would like to share that you haven’t already mentioned?”

Check the brow. Raised brows can be a sign of uncertainty, fear, or surprise. While surprise isn’t always negative, we don’t necessarily want users to be surprised by, or uncertain about, the experience on our platform. A furrowed brow signals stress.

The mouth. Expressions involving the mouth, such as compressed lips and restless mouth movements, along with vocal cues like sighs and an irregular tone, can convey frustration and confusion. I see this often when a user intends to do something, it doesn’t work, and frustration and anxiety follow.

Hand touching the face. A user who touches their face during the interview could be tired, confused, or nervous. It can also indicate intense concentration or frustration with a task.

Leaning. Leaning backward or forward can indicate a negative emotion or a high level of frustration. The user may be ready to give up on a task or experience.

General solution: Depending on the situation, a simple “What do you mean by that?” or “What are you trying to do?” can dig a little beneath the surface and shed light on what is going on. Stay neutral toward the product and the participant, and probe for honest answers.

So what are these really saying?

  1. You can use heightened awareness of body language to ask questions and probe further about what the user means or is trying to accomplish.
  2. It’s the practice of observing behavior over words. Nonverbal behavior can give a clearer indication of how someone actually feels at any point.
  3. Trends among users’ body language may provide additional insights or reinforce other insights. If a negative trend occurs during a particular task or flow, you can prioritize that issue.
  4. If negative body language appears outside of what is being tested, you can use it to identify where the product’s overall user experience can be improved.
  5. Nonverbal acuity gives a clearer indication of whether the participant’s overall experience was positive or negative.

Video recording is a great way to capture these expressions for later review and to add context to user responses or decisions. But I would argue the best way to capitalize on them is to notice them during the interview itself. That way, you can ask follow-up questions and know when to question a verbal response. The flashes of concern, stress, or confusion that cross a participant’s face suggest they may have more to share.

These behaviors range far beyond what can fit into this article. It isn’t important to look out for every single one; rather, be aware of the significance of behaviors like these and what they imply for your findings. Some participants will barely show any expressions, while others will show too many to count. Reading body language won’t answer all of your questions, but it is another tool to add to the UX research tool belt.

And, of course, it is important to be aware of your own nonverbal cues, since people tend to mirror each other when they feel a connection. Understanding nonverbal cues ultimately allows the researcher to build better rapport with users, stakeholders, and teammates, and to reach a deeper level of empathy.
