Contextual Inquiry: 8 typical patterns and issues

8 common patterns, behaviours and issues often seen during a contextual inquiry, with a focus on enterprise software, services and culture.

Rik Williams
UX Collective

--

An enterprise call centre… a crucible of CX and UX!

What is a contextual inquiry?

Contextual inquiry, also known as site visits or contextual interviewing, is a powerful method to ‘narrow the gap with reality’ when building understanding or making design decisions. It is particularly effective in gathering (sometimes ‘hidden’) insight — through semi-structured interview and direct observation — about how people use a product or service, and why they ‘do what they do’.

There is lots of information about how to set up and run a contextual inquiry, but not so much on what to look for. That’s why I made a simple contextual inquiry crib-sheet [Google Doc] to help members of my team run a contextual inquiry at Alzheimer’s Society. Since then I’ve continued mining my contextual inquiry war stories to extend and expand on 8 typical patterns to watch out for…

CONTENTS

  1. Context of use
  2. Pain points and known issues
  3. Repetition and duplication
  4. Physical items
  5. Fixes and hacks
  6. Intersections and junctions
  7. Fluctuations and typicality
  8. Software and applications

Context of use

Comprehending the environment in which people use the service/system is the foundation of a contextual inquiry. Moreover, the effort you make to visit where the product is used will be repaid in:

  • an illuminating understanding of the subtleties of the actual, lived, user experience,
  • comprehension of the factors adjacent to the system, which might affect its use,
  • lasting goodwill from the people who might be suffering from shoddy UX design day-to-day.

There are many areas to look out for, including…

EXAMPLES

  • how busy, noisy or quiet is the workspace? How does this affect the work?
  • what kind of hardware/software do people have? Will this impair usability?
  • how is the space designed? Is it easy to access everything needed in a task?
  • is the work only conducted in that space, or are there others? If so, which and where?
  • what effect does culture have, if any, on the service/process/system?

PROTIPS

  • consider making a sketch map of the workspace, including any resources needed/used,
  • physically follow a record/job on its journey around the office (where appropriate),
  • observe how the team works together (or not), especially if there is a problem.

Pain points and known issues

Find out what known issues people currently experience. These will include issues within the system itself, but shouldn’t be limited to it. For example, consider the effect of the design of the workspace and parts of the process that happen elsewhere, beyond the system.

These non-digital issues can be useful to help understand the total user experience. They may also affect future versions of the system if the issues can be resolved (or mitigated) with software.

EXAMPLES

  • where do people feel they are wasting too much time?
  • what parts of a process are repetitious, frustrating or farcical?
  • how easy is it to access effective help and support? How often and for which issues?

PROTIPS

  • are there support logs and previous reports you can access and analyse?
  • what can you learn from business analysts, developers, trainers and product owners?
  • is it possible to access the system, for expert review, or to use in anger?
  • what’s the wider context, like the design of the workspace and activities beyond the desk?

Repetition and duplication

Call notes (obfuscated) for 3 patients. These need to be re-captured, post hoc, into a customer relationship management (CRM) system.

In an ideal world, a person would only need to complete a task once to achieve the desired outcome. However, this supposes that systems are designed with people’s needs, capabilities and context in mind! Often, people are working across systems and processes which are not optimised for the task and their workflow. This may lead to…

EXAMPLES

  • administering data, or steps in a process, more than once,
  • managing duplicate systems to achieve a single outcome,
  • failing to maintain more than one system or dataset.

Physical items

Post-It Note crib-sheet detailing a keypad combination to transfer calls between employees.

Do people create crib-sheets, documentation or other items to help them understand and use the system or sub-processes? Try to track these down and consider whether it’s possible to re-design the system so that these are no longer needed (or if they should be added as new functionality).

EXAMPLES

  • use of crib-sheets, like Post-its stuck to monitor bezels,
  • additional documentation produced locally by the team to support their work,
  • any anecdotal evidence about system/process perceptions, adoption and use,
  • (once I worked with a team who had made an org-chart which also detailed who was grumpiest with the system under inspection),
  • aspects of hardware, like small monitors or missing, broken or inferior equipment.

PROTIP

  • come prepared to take photos, or make copies, of any artefacts uncovered.

Fixes and hacks

A university call centre worker using Google calculator in the browser. This is because the system couldn’t automatically calculate a result.

Look out for people compensating for shoddy service or workflow design by devising and using fixes, hacks and workarounds. When uncovered these can inspire design changes, either through new functionality or by eliminating problem areas.

EXAMPLES

  • using additional software/hardware to meet a common need during a task,
  • creating parallel documentation to detail how parts of the system work,
  • following superstition about how the system needs to be used so that it ‘works’,
  • creating convoluted pathways through a system to avoid a specific area/function.

PROTIPS

  • be prepared to make/export screen recordings and screenshots,
  • look out for things which you could add/remove in a future version of the system.

Intersections and junctions

Locally made print dementia diagnosis referral forms. These were in addition to the centrally produced ‘definitive’ digital form.

Identifying parts of a process where work meets another system, role, or culture is a great way to build understanding and uncover issues. Real-world intersection points can illuminate differences/opportunities in working practices, technology and system feasibility.

EXAMPLES

  • a digital process that meets print media (and vice-versa),
  • the transfer of information between organisations (or teams),
  • the confluence of two (or more) systems in a process,
  • diversity, rather than conformity, in individual ways of contributing to case/piecework,
  • where a new way of working meets an old way of working.

PROTIPS

  • explore the space, be nosey, eavesdrop, ask questions,
  • look for print media, especially forms and tabular information.

Fluctuations and typicality

Make sure to check whether the activity you are observing is typical and, if not, why. Similarly, probe what happens when things are especially hectic or quiet, and whether there are any changes throughout the year.

EXAMPLES

  • what happens at peak times (and vice-versa)?
  • is there any cyclicality, perhaps seasonal or industry specific? What happens?
  • are you witnessing a standard day, workflow, rates and patterns? If not, why?

PROTIP

  • access usage logs, like Analytics, and look for peaks, troughs and cycles in activity.
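As a rough illustration of that protip, a few lines of Python can surface the daily cycle in an exported activity log. This is a minimal sketch: the timestamps, their format and the hour-of-day grouping are all assumptions for illustration, standing in for whatever your analytics tool actually exports.

```python
from collections import Counter
from datetime import datetime

# Hypothetical export of event timestamps from an analytics tool
# (in practice you'd parse these from a CSV or API export).
timestamps = [
    "2023-03-01 09:14", "2023-03-01 09:48", "2023-03-01 11:02",
    "2023-03-02 09:31", "2023-03-02 14:20", "2023-03-03 09:05",
    "2023-03-03 09:59", "2023-03-03 16:45",
]

# Count activity per hour of day to expose daily peaks and troughs.
by_hour = Counter(
    datetime.strptime(ts, "%Y-%m-%d %H:%M").hour for ts in timestamps
)

peak_hour, peak_count = by_hour.most_common(1)[0]
print(f"Peak hour: {peak_hour}:00 with {peak_count} events")
# → Peak hour: 9:00 with 5 events
```

Grouping by weekday or month instead of hour would reveal weekly or seasonal cycles in the same way.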

Software and applications

Using Google Chrome because the approved browser (Internet Explorer) kept crashing.

Understanding what software is used, including websites and enterprise applications, is essential to understanding the system/process. It’ll also likely become a foundation for discovering pain-points, fixes/hacks and intersections.

Where possible become familiar with the system/process, perhaps by attending a training session. Also be prepared to take screenshots of anything of note uncovered during a contextual inquiry.

EXAMPLES

  • what sites and apps are actually used (vs. prescribed)? If different, why?
  • which specific interfaces are being used, and which order? Are they working well?
  • are the whole team using the same tools for a task? If diverse, why?

Summary

I hope that the contextual inquiry crib-sheet and these 8 typical patterns/issues are useful for you. Have you been using contextual inquiry in your user research? Have something to add? Let the world know in the comments!

--

Content Strategist at Government Digital Service (GDS). Inclusive UX Research, Information Architecture, Content Engineering. rikwilliams.net