Thoughts on consent & ethics in tracking user behaviour for UX Research
As a UX designer and researcher, I’ve been paying close attention as user and customer privacy on the web becomes a bigger part of the public consciousness.

This article is an exploration of recent events shaping the context of user privacy on the web, how this relates to the tools we use, and my thoughts on consent and user privacy in relation to UX research.
TL;DR: I believe that some of the tools we use in UX research need informed consent from our users, and it’s up to us to determine and apply ethical standards to our work.
Recent Events that are Shaping the Public Consciousness Around User Privacy
Cambridge Analytica + Facebook
It’s been all over the news in connection with the 2016 U.S.A. presidential election and Russian interference.
Cambridge Analytica is a political consulting firm that exploited a loophole in how Facebook’s API handled app permissions and data sharing to build a vast, detailed database of up to 87 million Facebook users. This database was used as a basis for targeted advertising and influence during the Trump presidential campaign.
To build this database, researcher Aleksandr Kogan, a Russian American who worked for the University of Cambridge, created a Facebook app that let users complete a quiz.
The 270,000 users who took the quiz granted access to their personal information, which was captured and stored. Crucially, when a user granted these permissions, the app also gained access to each of their friends’ personal information. Those friends never used the app, nor consented to the harvesting of their personal information.

This information, as we now know, was likely used to influence the outcome of the U.S.A. election.
Facebook’s “shadow profiles”
This isn’t the first time Facebook’s activities have raised questions around privacy.
While we don’t have a lot of hard facts about this yet, it’s pretty clear that Facebook creates “shadow” profiles of people who aren’t Facebook users.
While testifying before Congress, Facebook CEO Mark Zuckerberg tactfully avoided explaining exactly what goes on behind the scenes. The gist of it: Facebook is able to track individual unique users across websites, whether or not they are signed in to Facebook or have an account at all, and they use this tracking to build a representation of who you are.
This is tied to the “Facebook Pixel” that product and website owners install to get more advertising and tracking capabilities. When you visit a website that uses this Pixel, Facebook will associate that visit with your Facebook profile — or with your “shadow” profile.
If you then sign in to Facebook, it’ll associate that data with your Facebook profile. And if you sign up for Facebook, that shadow profile, along with all the data associated with it, becomes your active account.
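To make the mechanism concrete, here’s a heavily simplified sketch of how pixel-style cross-site tracking can work. This is not Facebook’s actual implementation: the endpoint and cookie name are invented, and a real pixel typically sets its identifier from the tracker’s own domain rather than via document.cookie.

```ts
// Hypothetical sketch of pixel-style tracking. Not Facebook's actual
// code; tracker.example.com and the cookie name are invented.

function getOrCreateVisitorId(): string {
  // A persistent identifier lets the tracker link every page view it
  // sees, whether or not the visitor is signed in or has an account.
  const match = document.cookie.match(/(?:^|; )visitor_id=([^;]+)/);
  if (match) return match[1];
  const id = crypto.randomUUID();
  document.cookie = `visitor_id=${id}; max-age=31536000; path=/`;
  return id;
}

// Embedding sites load a 1x1 image; the query string reports the visit.
const beacon = new Image();
beacon.src =
  "https://tracker.example.com/pixel.gif" +
  `?visitor=${encodeURIComponent(getOrCreateVisitorId())}` +
  `&page=${encodeURIComponent(location.href)}` +
  `&referrer=${encodeURIComponent(document.referrer)}`;
```

Server-side, each hit either updates an existing profile or starts a new “shadow” one keyed on that identifier.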
Equifax and other major user data leaks
Of course, it’s not just Facebook that’s found itself in the news for privacy concerns.
In the past several years, many big-name companies have experienced data breaches, including, as just a few examples, Yahoo (which exposed the personal information of 1.5 billion users), AdultFriendFinder (412 million users), and eBay (145 million users). Most recently, Equifax, one of the major credit reporting agencies in the U.S.A., reported that the personal information of 147.9 million Americans (44% of the population), including names, addresses, and Social Security numbers, was exposed. Canadians didn’t escape unscathed: at least 100,000 Canadians were exposed as part of the breach.
These breaches are no longer anomalies. Again and again, we see that companies are collecting massive amounts of data on their users and customers — and that we can’t trust this data will be secure.
User Tracking in UX Research
These examples aren’t intended to draw a false equivalency: rather, they provide context around why data privacy on the web is becoming a bigger, more public, and more important issue that affects everyone.
As a UX professional, I don’t use Facebook to perform UX research. I’m not responsible for data and infrastructure security. But, as part of my work, I use many tools and platforms designed to track, store, and analyze user behaviour and activities.
These tools and platforms include: Google Analytics, Mixpanel, Hotjar, Fullstory, Crazyegg, Usertesting.com, and more.
Each of these tools has a range of capabilities when it comes to user tracking (a rough code sketch follows the list), which may include:
- Capturing user demographics and identifying information, such as the user’s country, city, and web browser.
- Tracking clicks and events.
- Tracking user activities over time to visualize user flows through a product.
- Generating heat maps using aggregated user cursor movement and click activity.
- Recording users’ activity live.
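To give a sense of how little code these capabilities require, here’s a hypothetical sketch of click and cursor tracking. The endpoint is invented, and real tools batch, compress, and enrich far more aggressively, but the mechanics really are this simple.

```ts
// Hypothetical sketch of click and cursor tracking; endpoint invented.

type TrackedEvent =
  | { kind: "click"; x: number; y: number; target: string; at: number }
  | { kind: "move"; x: number; y: number; at: number };

const buffer: TrackedEvent[] = [];

// Deliberate interactions: clicks on any element in the page.
document.addEventListener("click", (e) => {
  buffer.push({
    kind: "click",
    x: e.pageX,
    y: e.pageY,
    target: (e.target as Element).tagName,
    at: Date.now(),
  });
});

// Sampled cursor positions: enough to build heat maps and user flows.
let lastMove = 0;
document.addEventListener("mousemove", (e) => {
  if (Date.now() - lastMove < 100) return; // crude throttle
  lastMove = Date.now();
  buffer.push({ kind: "move", x: e.pageX, y: e.pageY, at: Date.now() });
});

// Periodically flush the buffer to the analytics service.
setInterval(() => {
  if (buffer.length === 0) return;
  navigator.sendBeacon(
    "https://analytics.example.com/events",
    JSON.stringify(buffer.splice(0))
  );
}, 5000);
```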

When does tracking for UX research cross the line?
I’m okay with some kinds of tracking while I’m using a product, and much less okay with other kinds of tracking. But where do I draw the line?
I’m comfortable with a product having access to my general demographic information that can be determined through my web browser, or that I’ve chosen to share with a product as part of my user profile. (Sharing data across platforms, as Facebook does, is another story).
When it comes to my activities and interactions, I believe there are two broad categories: “interactions” and “interstitials.”
Interactions are activities that I deliberately perform, such as clicking (or tapping) a button, hovering over a dropdown menu, or submitting a form.
Interactions are inherent to using a product. When I interact with a product, I expect the product to receive and act on that interaction in some way.
Interstitials are the moments between my interactions. This is when I’m moving my cursor, filling out a form field, or anything that bridges my intent with action.
I’m not comfortable with the idea of tracking interstitials. To me, this feels like an invasion of my own “mental space.” It takes advantage of the fact that users are not always consciously aware of which activities can be recorded.
When was the last time you used an online help chat?
How would you feel about knowing that the helpdesk technician you’re chatting with can most likely see everything you enter into the message field while you’re typing, not just after you’ve sent it?

Tracking interstitials feels — to me — like the equivalent of these chats.
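A minimal, hypothetical sketch shows why this is technically trivial: any script on the page can watch a field’s contents on every keystroke, long before you press Send. The endpoint below is invented.

```ts
// Hypothetical sketch: observing a chat field before the user sends.
const input = document.querySelector<HTMLInputElement>("#chat-message");

input?.addEventListener("input", () => {
  // Fires on every keystroke, paste, and deletion, capturing drafts
  // the user may edit away or decide never to send at all.
  fetch("https://chat.example.com/typing-preview", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ draft: input.value, at: Date.now() }),
  });
});
```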
Researchers from Princeton, who studied the session-replay scripts that tools like these rely on, agree with me:
… the extent of data collected by these services far exceeds user expectations; text typed into forms is collected before the user submits the form, and precise mouse movements are saved, all without any visual indication to the user.
Looking at the tracking techniques I have access to while doing UX research, I can see where the ethical lines begin to blur.
- I’m comfortable with tracking demographic information related to my browser or user account, my clicks, or my activity flows.
- I start to feel uncomfortable with heatmaps that visualize aggregate user activity and cursor positions, particularly if unique individuals can be identified.
- I’m not comfortable with live recording of user activity within the browser, particularly in a digital context that operates on implied consent.
Ethics and Consent in Scientific Fields
In the domains of medicine, psychology, and social science, informed consent is a cornerstone of ethics.
As summarized by Renard Sexton:
Informed consent, in short, is a process by which a researcher provides the necessary information to a subject about the nature of study such that the subject can competently decide whether to participate or not.
In medical studies, this is pretty straightforward: researchers provide patients with the best information they have around risks and potential benefits, and individuals are able to make a decision about whether or not to participate.
Psychology studies are very similar. The CPA Code of Conduct for Psychologists requires that researchers:
Provide, in obtaining informed consent, as much information as reasonable or prudent individuals … would want to know before making a decision or consenting to the activity. Typically … this would include: purpose and nature of the activity; mutual responsibilities; whether a team or other collaborators are involved; privacy and confidentiality limitations, risks and protections; likely risks and benefits of the activity, including any particular risks or benefits of the methods or communication modalities used; alternatives available; likely consequences of non-action; the option to refuse or withdraw at any time, without prejudice; over what period of time the consent applies; and how to rescind consent if desired.
I try to follow these ethical standards and make sure I obtain informed consent when I carry out remote or in-person usability studies. Participants are aware that they’re taking part in a study, they know what kind of data will be captured and how it will be used and stored, and they’re able to opt out at any time.
Unfortunately, the web operates on implied consent.
The web operates on implied consent
Companies draft detailed privacy or data use policies that mention what data may be collected and how it will be used. If you, as a user, don’t want a company collecting this data, your only option is to opt-out of using the product — if you’re even aware that your data is being collected in the first place. Otherwise, your consent is implied.
In 2014, Facebook (again) carried out a study wherein they manipulated 689,003 users’ emotions by adjusting what appeared on individual timelines, based on positive and negative emotional expression, and demonstrated that this had an emotional contagion effect that influenced the users’ own emotions.
In this study, users never provided informed consent. Researchers “took advantage of the fine print in Facebook’s data use policy to conduct a scientific experiment without informed consent,” which was, essentially, a loophole that relied on users not reading or understanding Facebook’s data use policy.
In 2011, in a push to advocate for user privacy, the U.K. rolled out an update to the Privacy and Electronic Communications Regulations (PECR), legislation specifying that websites serving users from the U.K. and E.U. must notify those users when they store cookies or perform “non-essential tracking.” This was originally drafted with the goal of requiring informed consent from users.
The rollout of this legislation was thoroughly bungled: most U.K. government websites weren’t ready when it came into force, meaning that they themselves would have fallen afoul of the law.
A week before the legislation kicked in, it was updated to allow for implied consent: all a company needed to do to comply was show the user a message about the use of cookies, and all a user had to do to consent was continue to use the website or product.
Privacy legislation is getting better
In contrast to PECR, the upcoming E.U. General Data Protection Regulation (GDPR) advances much stronger legal requirements around collecting data and user privacy. This legislation comes into effect on May 25, 2018.
Of note:
Transparency. Data processors and controllers may only process an individual’s data if they have first informed the individual of the extent of the data processing and the uses to which the individual’s data will be put. (Recital 39) Specifically, data processors and controllers must inform data subjects of:
• The identity of the data controller.
• The specific purposes of the data processing, which must be “explicit and legitimate and determined at the time of the collection of the personal data.”
• The period for which the personal data will be stored or, if that is not possible, the criteria used to determine the retention period.
• The right to withdraw consent at any time.
• Data subjects’ rights to obtain confirmation regarding their personal data that will be processed, including the right to access, correct, or erase personal data.
• The risks, rules, safeguards, and rights in relation to the processing of personal data.
• How they may exercise their rights regarding the processing of their personal data.
Plain language. Any information and communications regarding data processing must be “easily accessible and easy to understand,” and “clear and plain language must be used.” (Recital 39) Similarly, consent provisions cannot be “buried” in another, longer document.
Affirmative act. Consent must be “freely given, specific, informed and unambiguous.” (Recital 32) The data subject must signify his or her consent through an affirmative act, such as by signing a written document, checking a box on a website, or other action that clearly demonstrates the subject’s intent to agree to the data processing. It is not appropriate to set up an “opt-out” system whereby the data subject has consented to the data processing unless he or she takes an affirmative action to show a lack of consent, as “silence, pre-ticked boxes, or inactivity” does not constitute consent.
(Source.)
Unfortunately, this legislation doesn’t necessarily affect Canada or the U.S.A., and it’s not all that clear how it’ll affect UX research.
That’s why we have to consider and apply our own ethics around consent in UX research.
Implied consent isn’t good enough for UX research
I don’t believe that implied consent is good enough when it comes to UX research.
When working in UX, one learns that users don’t read. They’ll skim content, jumping from headline to headline. They’ll dismiss modals and messages without reading them, and ignore the blindingly obvious in favour of the task they’re trying to achieve.
This implies that few users will read notices about the use of cookies on a website, let alone dive into lengthy legalese privacy and data use policies — if they even understand the implications of what they’re reading.
PECR’s ineffectiveness is somewhat unavoidable: cookies are essentially required for most products to work at all. If the legislation hadn’t been adjusted at the last moment to allow for implied consent, it would have had massive financial, economic, and social implications. And we can’t legislate that users inform themselves about how cookies affect them.
But while cookies may be required for many apps to work, there’s absolutely nothing that requires real-time recording of user activity. It’s a business decision. Products will still work without it.
Informed consent should be a requirement for UX research
If we choose to track user activity, users should be able to choose whether or not to participate. Informed consent should be a requirement for certain types of tracking for UX research.
I think the tipping point is when we’re capturing interstitials.
Any time a user opts in to tracking, it’s our responsibility to make sure that users, as the CPA guidelines state:
… were provided as much information as reasonable or prudent individuals would want to know before making a decision or consenting to the activity.
In the context of UX research, that means making sure that users understand things like:
- Their cursor and scrolling movements can and will be captured while using a product;
- When and where data entered into forms and fields is captured;
- How data is stored, used, and shared within and outside an organization.
What would informed consent look like for UX research?
I imagine informed consent for UX research will end up similar to how email subscriptions are handled today: granular control over opting in or out of individual types of emails.
Users need to know too much for it all to fit in a modal, and it’s unrealistic to ask anonymous users to provide informed consent. When our products have user accounts, I believe there’s an opportunity to ask users to be collaborators in our research, and through this, obtain informed consent.
This is a quick exploration of how this might be handled as part of user settings. Note that it’s very incomplete, but it gets the idea across.


(Quick note: based on research performed by Johnson, Bellman, and Lohse, we know that offering defaults is a powerful influence on choice. We can likely be more confident that users are making active, informed decisions to opt in to UX research activities if we avoid providing defaults.)
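In code, granular consent might be modeled something like this sketch. The category names and shape are hypothetical, and null represents “no decision yet,” following the note above about avoiding defaults.

```ts
// Hypothetical model of granular UX-research consent settings.

interface ResearchConsent {
  demographics: boolean | null;  // country, city, browser
  interactions: boolean | null;  // clicks, taps, submitted forms
  interstitials: boolean | null; // cursor movement, in-progress typing
  sessionReplay: boolean | null; // live recording of activity
}

// No defaults: every category starts undecided, so an opt-in is always
// an active choice, never "silence, pre-ticked boxes, or inactivity."
const initialConsent: ResearchConsent = {
  demographics: null,
  interactions: null,
  interstitials: null,
  sessionReplay: null,
};

// Tracking for a category only initializes after an explicit opt-in.
function initializeTracking(consent: ResearchConsent): void {
  if (consent.interactions === true) {
    // attach click/event listeners here
  }
  if (consent.sessionReplay === true) {
    // only now load the session-replay script
  }
}
```

The useful property is that an unanswered question and a “no” both result in no tracking; only an explicit “yes” turns anything on.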
I suspect that if companies were required to obtain informed consent for UX research, most companies would become much more cautious about the types of tracking they perform and the data they collect.
Many companies would choose — whether due to public perception, lack of resources, or differing internal priorities — to cease tracking user behaviour altogether. Indeed, in response to the Princeton study, Walgreens decided to stop sharing data with Fullstory.
We Have to Define Our Own Ethics
As UX professionals, we still don’t have a clearly identified, universal code of ethics. While the GDPR is an excellent step forward from a legal perspective, it applies only to products serving the E.U., and it isn’t clear to me how it will affect UX research.
That means that today, it’s up to me to define my own ethics (and you to define yours, and your company to define the company’s). I have to draw my own ethical lines in the sand.
I will not use tools that cross that line. For me, that includes any form of real-time recording of user activity in the browser when users haven’t explicitly provided informed consent.
This comes with tradeoffs. It means I have less data to act on without jumping through hoops to obtain consent. It makes it more expensive and time-consuming to collect new data, because it obliges me to design and recruit for usability studies on an individual basis.
But, there’s an upside.
Defining your own ethics and transparently communicating what you will and won’t do to your users can be very good for your brand. It can be a competitive advantage to be known as the company that advocates for user privacy — just ask Apple.
I also believe there are other people reading this who agree with what I’m saying.
Those are the people I want to work with — and together build great products. Those are the companies whose products I want to use.
Thanks for hitting the 👏 if you enjoyed this article! It’ll tell me to write more things like this. Real-time user feedback with informed consent.
Quinn Keast is a UX Designer + Partner at Caribou, a user experience strategy and design consultancy in Winnipeg.
—
If I have made any errors or omissions in this article, please let me know.