User manipulation: thinking beyond dark patterns
Time to re-evaluate the intentions behind your designs.

As UX designers, we are responsible for the impact our actions have on the world. We can no longer hide behind Product Owners’ decisions, marketing briefings, or technical specs to offload our obligations towards the user’s well-being and the ethical implications of our actions. Even if somebody else gave the order to shoot, if you pulled the trigger, you are still responsible for the bullet.
For those of us in a leadership position: we have come very far to be at the table. We have fought every inch of the way to gain more authority, influence, and visibility in the decision-making that leads to the development of products and business value. With the benefits of this new position, we also gained the responsibility of addressing unethical practices and facing the repercussions of following, or not following, our best judgment and ideals.
My goal is not to dissect dark patterns and their categories but to look beyond the user interface and dig deeper into the intentions that drive the overall experience. This could be the first time you have heard about dark patterns, but I am sure you have already been tricked by them. The term, coined by Harry Brignull, describes interfaces crafted so that users are cheated, misdirected or blocked from achieving a reasonable goal.
Ethical Design will take a central place in the discussions about the role technology plays in preserving (or eroding) human rights, radicalizing social dynamics, and redefining each person’s locus of control over their own privacy. As a community, we need to be ready and actively promote those difficult conversations.
Design’s social influence
If you are part of a society, your mere presence is an act of influence on others. Design, by definition, is an act of positive influence. We introduce affordances to suggest how something should be used; we optimize flows to reduce user friction and favour practices that guide the user across various actions. There is nothing inherently wrong with that.
You could argue that if a design does not intentionally influence the user, it has lost its reason to exist. When we talk about design principles like guidance, error prevention, and user empowerment, we are persuading the user towards goals they have already established. Stewardship of the users’ objectives, mindfulness about our business goals, and awareness of our actions need to be part of any design review and feedback session.
Manipulation is the dark side of social influence. You can be a good or a bad influence, but ‘good’ manipulation sounds off, right? Influence follows a well-understood code of conduct between two parties; manipulation blurs those lines.
There are boundaries designers need to put in place to ensure that the power we exercise over the user has checks and balances embedded in our systems, practices and behaviours.
We need practical ways to detect manipulative practices and assess unethical behaviours. It is easy to get lost in the semantics of what is moral, ethical or legal. Are human values absolute or relative? Is free will a thing? In practice, this kind of dialogue can be used to sow confusion and win nefarious arguments on technicalities.
Keep it simple: we are designers, not lawyers. Always act in the best interest of your users and the common good.
Manipulation is a zero-sum game.
I am 99.9% sure you, my dear reader, are a decent human being who considers manipulation a distortion of the design principles and professional foundations that are a crucial part of working in UX. We can agree that our goal as designers is to empower users by giving them a sense of control over their own decisions so they can achieve their objectives in the most efficient way possible.
It is effortless to do the opposite of what this article suggests and treat it as a recipe for manipulating your users; bad intentions can turn any warning about bad patterns into a tutorial. So before going any further, and to head off temptation, I have to say that manipulative experiences might offer some quick wins in the short term, but they are not a long-lasting strategy.
Let’s say you have two competing companies in a very saturated market. “Company A” finds out that “Company B” has developed a new upselling strategy in their checkout that drives 20% more conversions. Company A imitates that pattern and also adds a dark pattern that leads users to believe that the item they are viewing is the last one in stock and that ten other people are interested in buying it. Sales motivated by fake scarcity increase conversions another 20%; Company B notices, copies it, and adds yet another shady UX scheme to manipulate users. The cycle goes on until bad practices become the standard, other companies copy them without even wondering whether that is the right thing to do, and soon every checkout experience is a living hell.
Manipulation works… until it doesn’t. In every market plagued with abusive user experiences, an up-and-coming startup is gaining traction, faster growth and more enthusiastic user engagement with a simple yet revolutionary idea: treat the user nicely.
Think about all those political institutions, banks, airlines, retail stores, and insurance companies that have done rebranding after rebranding, trying to wash away their bad reputation, without noticing that their practices, behaviours and values are the ones that need to be “redesigned,” not their logo.
Recognizing manipulation
We need to actively look for manipulative practices since their presence could be insidious, flagrant or merely accidental.
If there is a will, there is a way. Even if we somehow managed to make every dark pattern illegal and banish them from the internet, bad actors would find creative ways to achieve similarly sinister results. The intent to manipulate is the root of those evils.
I want to suggest some simple questions we can use to assess whether our design could harm the user.
1. Does the user feel entrapped or disempowered during your flow?
Manipulation resides in seeing power as a finite currency. A manipulator cannot empower the user because the business goal precedes the user’s needs; to succeed, the user’s sense of self and control needs to be removed. Never put the user in a position of power imbalance, information asymmetry or perceived lack of options.
Adobe’s licensing scheme is a perfect bad example. Many graphic designers only need to combine Illustrator and Photoshop to produce their designs; they could not care less about any other bells and whistles. But that package is never offered: they are forced to buy two “Single App” packages or simply go for the “All Apps” option for $10 more. Designers can always move to other tools, but there is no guarantee that legacy files will be 100% compatible with past, present, and future versions of Adobe’s products.

2. Do you use social dynamics against the individual?
This may sound like a description of the current use and abuse of social media, especially its effect on younger generations, but there are many examples outside the usual suspects. In essence, manipulation happens when a product finds ways to use your own community to coerce you into remaining subscribed or involved.
Meetup.com manipulates community leaders by withholding community ownership from them. The only way to communicate with the members of the community you created is through the messaging options Meetup offers; it does not allow you to reach them directly by email or to export that information. So if you ever need to switch to a similar service, you cannot move your community; you have to start from scratch.
3. Are you urged to hide relevant information, display half-truths or plainly lie?
For users, one of the most dangerous things about manipulation is that it is tough to detect while it happens. We only know we have been victims of it when we feel confused, resentful, frustrated or angry at the end of an interaction.
As designers, on the other hand, if your briefing is actively trying to distract, hide information or manipulate data so that users are intentionally tricked into perceiving a distorted reality, you should raise a red flag right away. User journeys are great tools for detecting the inflection points where the correct information needs to be displayed and for tracing back the origins of any deceit.
4. Do you actively try to erase the user’s identity?
When the user does not comply with the product’s demands, the result makes them feel invisible, unrecognized or diminished. Repeat this enough times, and the user’s sense of self will start to deteriorate until they conclude that their input does not matter.
Those are the basic mechanisms of voter suppression, and sadly we get to see similar practices in design patterns, feedback tools and crowdsourced data visualization. I have seen instances where some countries are filtered out of NPS surveys because stakeholders believe the users in those countries leave bad reviews, are hard to please or culturally never give top marks, which defeats the purpose of having a Net Promoter Score and gives up on providing better services to entire populations.
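To make that last point concrete, here is a minimal sketch, in Python with entirely made-up numbers and a hypothetical country split, of how excluding a “hard to please” market inflates the score without improving anything for the people behind it.

```python
# Sketch only: illustrative, hypothetical data. Shows how filtering out
# "inconvenient" countries inflates NPS while hiding real dissatisfaction.

def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# Hypothetical survey responses grouped by country.
responses = {
    "country_a": [10, 9, 9, 8, 10, 9],  # enthusiastic market
    "country_b": [7, 8, 9, 6, 8, 7],    # lukewarm market
    "country_c": [5, 6, 4, 7, 6, 5],    # struggling market, often filtered out
}

all_scores = [s for scores in responses.values() for s in scores]
filtered = [s for c, scores in responses.items() if c != "country_c" for s in scores]

print("NPS, all countries:     ", nps(all_scores))  # the honest number (0)
print("NPS, country_c excluded:", nps(filtered))    # flattering but useless (42)
```

The filtered number looks better in a quarterly report, but it no longer measures anything the product team can act on.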
5. Do you spend your user research budget on finding vulnerabilities?
User research should not be about finding “the right buttons” in the users’ mental models to exploit them for private gain. Is the targeted demographic already physically exhausted, mentally distressed or overly excited? That is a red flag.
With the rise of neuromarketing, the increased availability of biometrics in our devices and the normalization of mining users’ behavioural data, the tools for manipulation have become more advanced and potentially more dangerous.
It is hard to estimate the extent of the damage that Cambridge Analytica’s negative influence has caused modern democracies. This British consultancy firm misappropriated the information of 87 million Facebook users and applied it to strategic communication consulting during electoral processes in the UK and the USA. Even the Facebook like button can be weaponized and used as a way to identify your political leanings.
The long way ahead for Design Ethics
Battling manipulation in UX requires rendering those bad practices ineffective: teaching users about dark patterns, raising awareness of their prevalence, refusing to comply with internal or external pressure to use them, and bringing real innovation driven by design thinking.
As design leaders, team members and individual contributors, we cannot turn our attention away from the potential damage of our designs. This is your opportunity to elevate the conversation and to start early, before bad practices become normalized or the reputation of UX, the profession you love and respect, becomes tarnished by bad actors.
If you happen to work in an environment where, despite your best efforts to expose and change those behaviours, user manipulation is still rampant, you need to assess whether these practices are ego-congruent: the company produces manipulative experiences because they fit its worldview, so there is no inner conflict or shame to leverage for change. If that is the case, the best alternative for you is to move on and find a company with a healthier culture.
It will most likely be easy to walk away from those kinds of companies, since they tend to manipulate their employees too. Use the same questions, look for traces of bad behaviour, and develop an escape plan right away if you happen to be a victim. You cannot out-manipulate a cynical manipulator; do not waste your energy.
Live your values every day, set healthy boundaries and plant the seeds of good karma in your designs.