Dark patterns in UX: how designers should be responsible for their actions

Arushi Jaiswal · Published in UX Collective · Apr 16, 2018

Illustration Credit: Neha Ann Balthazar

Abstract

Dark Patterns are deceptive UX/UI interactions designed to mislead or trick users into doing something they don’t want to do. The term was coined in 2010, after the boom of e-commerce on the web. In order to generate more sales, sign up more subscribers, and hit transaction targets, designers and business stakeholders began creating deceptive user interfaces to manipulate users.

This article is divided into two sections. The first gives a detailed description of Dark Patterns, their history, and the different types, explained with relatable examples.

The second section briefly covers the big-picture impact of Dark Patterns and possible solutions to them.

Overall, this article highlights why creating Dark Patterns is not an ethical practice, and why designers should take responsibility for their actions and focus on being transparent with users instead of manipulating them.

Introduction

Good user experience design is about providing users with seamless, enjoyable interactions with products. It has the user’s best interest in mind, and is not deceptive or sneaky in nature. These phenomenal experiences are created through various design practices, after studying and learning more about the user. This approach can make a product stand out and deliver good results for the business. Or, it can be used to manipulate users and trick them into making decisions that aren’t necessarily good for them but benefit the company. The latter practice is called creating ‘Dark Patterns’.

What are Dark Patterns?

According to Brownlee (Fast Co. Design), a dark pattern is a misleading or otherwise deceptive UI/UX decision that tries to exploit human psychology to get users to do things they don’t really want to do.

The term ‘Dark Patterns’ was coined by the London-based UX designer Harry Brignull (PhD, Cognitive Science) in August 2010. He defined it as,

“a user interface that has been carefully crafted to trick users into doing things, such as buying insurance with their purchase or signing up for recurring bills.”

Brignull (2010) further explains that when we think of “bad design”, we think of the creator being sloppy or lazy but without ill intent. Dark patterns, on the other hand, are not mistakes. They’re carefully crafted with a solid understanding of human psychology, and they do not have the user’s interest in mind.

Dark Patterns: Then and Now

Dark patterns have been around for as long as we can remember; they aren’t limited to the web. For example, some credit card statements boast a 0% balance transfer rate but don’t make it clear that the rate will later shoot up to a much higher number, a detail disclosed only in a lengthy agreement set in tiny print.


On the early web, we all remember the classic pop-up ad announcing we’d won some random sweepstakes, designed purely to spam us.

On today’s web, Dark Patterns are much more complex and sneaky in nature. Brownlee (Fast Co. Design, 2015) illustrates this with LinkedIn and its automated follow-up email reminders, sent on a new user’s behalf to contacts harvested from his or her webmail accounts and presented in such a way that they appear to come directly from the user.

In the late 2000s and early 2010s, there was a huge wave of LinkedIn spamming our inboxes with dozens of follow-up emails through our contacts to “expand our professional network,” and the worst part was that they were virtually impossible to get out of. Thankfully, this pattern was recognized and brought before the US District Court in San Jose (Perkins v. LinkedIn, 2014), with the key issue being spam.

The resulting class-action lawsuit cost LinkedIn a $13 million settlement. It also served as a warning to other companies that misdirect users with such tactics and dark UX patterns to artificially grow their products.

Types of Dark Patterns

After coining the term, Harry Brignull registered a website called darkpatterns.org,

“a pattern library with the specific goal of naming and shaming deceptive user interfaces.”

He also explained how these patterns work,

“When you use the web, you don’t read every word on every page — you skim read and make assumptions. If a company wants to trick you into doing something, they can take advantage of it by making a page look like it is saying one thing when in fact it is saying another. You can defend yourself against dark patterns on this site.”

Brignull further listed 11 types of dark patterns on his website:

  1. Bait and Switch
  2. Disguised Ads
  3. Forced Continuity
  4. Friend Spam
  5. Hidden Costs
  6. Misdirection
  7. Price Comparison Prevention
  8. Privacy Zuckering
  9. Roach Motel
  10. Sneak into Basket
  11. Trick Questions

In this article, I will be describing each one in detail and will provide relatable examples for the more common ones.

Bait and Switch

As explained by Brignull, this pattern occurs when a user sets out to take an action expecting one outcome, but it instead results in something completely different and unwanted.

Rutherford (2016) explained this further with a Windows 10 upgrade example. Usually when we press the X at the top right corner of an upgrade pop-up, it closes the window with no further fuss. In the Windows 10 dialog, this action resulted in the upgrade being initialized.
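To make the mechanic concrete, here is a minimal, hypothetical sketch in TypeScript of a close control being rewired to mean consent. The dialog and handler names are my own illustration, not Microsoft’s actual code.

```typescript
// Hypothetical sketch of the bait-and-switch mechanic: the same close
// control produces very different outcomes depending on how it is wired.

type DialogAction = "dismiss" | "startUpgrade";

interface UpgradeDialog {
  // What actually happens when the user clicks the X in the corner.
  onCloseButtonClick: () => DialogAction;
}

// Honest behaviour: the close control does what its affordance promises.
const honestDialog: UpgradeDialog = {
  onCloseButtonClick: () => "dismiss",
};

// Bait and switch: the familiar X is silently rewired to mean consent, so
// the action the user takes to avoid the upgrade is the one that starts it.
const baitAndSwitchDialog: UpgradeDialog = {
  onCloseButtonClick: () => "startUpgrade",
};

console.log(honestDialog.onCloseButtonClick());        // "dismiss"
console.log(baitAndSwitchDialog.onCloseButtonClick()); // "startUpgrade"
```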

He also suggested alternatives to bait and switch, stating that the ideal strategy is honesty: give users content that genuinely provides value, and then ask for a favor in return. For example, UXPin offers free yet useful ebooks in exchange for an email address.

Disguised Ads

As the name suggests, this pattern disguises ads within the page so that they look like part of the regular content or navigation, which is supposed to make users click them more often (Brignull, 2010).

One example of disguised ads that I found during my research is Dafont.com (a free font download site), which disguises ads and misleads users into clicking them. As shown in the screenshot below, the genuine download button is much smaller and less visible than the download option for ZipMac, which has nothing to do with the font the user actually wants to download.

Screenshot of dafont.com showing two different disguised ads and how they look like part of the main content of the site.

Forced Continuity

Found on a large number of subscription-based websites that offer free trials, Forced Continuity is a dark pattern in which the user signs up for a free trial but has to enter their credit card details. When the trial ends, they start getting charged. There’s no opportunity to opt out, no reminder, and no easy way to cancel the automatic charging of their credit card (Brignull, 2010).

In my observation, companies like Hello Fresh, Blue Apron, and Ipsy are guilty of this. However, Rutherford (2016) does make an important point: it is understandable that some businesses require card details up front to deter spammers and abuse of free trials, but they risk alienating their users by doing so.
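As a sketch of what a more respectful alternative could look like, here is a hypothetical TypeScript comparison of silent auto-billing versus a trial flow that reminds the user before any charge. All type and function names are illustrative; no real company’s billing API is implied.

```typescript
// Hypothetical trial-subscription model; the stubs below stand in for a
// real billing system and email scheduler.

interface TrialSubscription {
  userEmail: string;
  trialEndsAt: Date;
  cardOnFile: boolean;
}

function chargeCard(sub: TrialSubscription): void {
  console.log(`Charging card on file for ${sub.userEmail}`);
}

function sendEmail(to: string, message: string): void {
  console.log(`Email to ${to}: ${message}`);
}

// Forced continuity: when the trial date passes, just bill the card.
function endTrialSilently(sub: TrialSubscription): void {
  if (sub.cardOnFile) {
    chargeCard(sub); // no reminder, no confirmation, no obvious cancel path
  }
}

// A more honest alternative: warn the user a few days before the trial ends,
// so the eventual charge is never a surprise and cancelling stays easy.
function remindBeforeTrialEnds(sub: TrialSubscription, daysBefore = 3): void {
  const msPerDay = 24 * 60 * 60 * 1000;
  const reminderDate = new Date(sub.trialEndsAt.getTime() - daysBefore * msPerDay);
  sendEmail(
    sub.userEmail,
    `Reminder (${reminderDate.toDateString()}): your free trial ends on ` +
      `${sub.trialEndsAt.toDateString()}. Cancel anytime in one click.`
  );
}

const sub: TrialSubscription = {
  userEmail: "user@example.com",
  trialEndsAt: new Date("2018-05-01"),
  cardOnFile: true,
};
remindBeforeTrialEnds(sub); // transparent flow
endTrialSilently(sub);      // forced-continuity flow
```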

I noticed this dark pattern on Coursera.org. Coursera, an online learning platform that promises universal access to the world’s best education, has a UI designed to push learners towards its paid offerings, and it confuses new learners about what is free on the platform and how to sign up for it.

For example, the free (no certificate) option is buried deep in the enrollment flow, under an ‘Audit’ link that is almost impossible for a new user to notice, making it extremely difficult and time-consuming to find. Users generally end up picking the 7-day free trial instead, which requires their credit card information.

Friend Spam

This kind of Dark Pattern occurs when a product asks for the user’s email or social media permissions under the pretense that they will be used for a desirable outcome, e.g. finding friends, but then spams all their contacts with a message that claims to be from the user (Brignull, 2010).

As mentioned earlier, the most famous example of this dark pattern came from LinkedIn, and it resulted in a $13 million class-action settlement in 2015.

Hidden Costs

Brignull (2010) describes ‘hidden costs’ as the pattern in which a user goes through multiple checkout steps and, only at the last step, discovers that some unexpected charges have appeared, e.g. delivery charges, tax, etc.

While a lot of e-retailers are trying to be transparent about this, I recently encountered a modern-day example. Curology, an acne-treatment subscription, advertises a monthly prescription-based bottle at $19.95/month. The real cost, however, is $19.95 + $4.95 (shipping), which is revealed much later in the registration process.

To correct this, they introduced a two-month plan with a bigger bottle to cut shipping costs. It ended up being even more confusing, because that plan costs $39.95 for two months rather than the $19.95/month that is primarily advertised on their website.
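To show why the pricing is hard to compare at a glance, here is the arithmetic worked through in a short TypeScript snippet. The figures come from the example above; the assumption that the two-month price includes shipping is mine, based on how the plan is described.

```typescript
// Worked arithmetic for the Curology example above (illustrative only).

// Monthly plan: advertised at $19.95/month, with shipping revealed later.
const advertisedMonthly = 19.95;
const shippingPerMonth = 4.95;
const actualMonthly = advertisedMonthly + shippingPerMonth; // 24.90

// Two-month plan: a bigger bottle at $39.95 per shipment, assumed to
// include shipping.
const twoMonthTotal = 39.95;
const twoMonthEffectiveMonthly = twoMonthTotal / 2; // ~19.98

console.log(`Monthly plan really costs $${actualMonthly.toFixed(2)}/month`);
console.log(`Two-month plan works out to ~$${twoMonthEffectiveMonthly.toFixed(2)}/month`);
// Neither checkout total matches the advertised $19.95/month headline,
// which is exactly what makes the real cost hard to see up front.
```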

Misdirection

Misdirection is created when the user’s attention is guided to a specific place so they won’t notice something else that is happening. (Brignull, 2010)

Estevao (Medium, 2017) cited the 2016 Skype software update as an example of misdirection, explaining how the installer took advantage of the update flow to pre-select Bing.com as the user’s default search engine and MSN as their homepage.

She further explains,

“In many cases, the installation occurs in a pop-up window, where you have to read and go through multiple steps, configuring options when required. But every UX designer knows that people don’t read, they scan. What happens, in the end, is that people only look at the window for enough time to find the buttons that will make the installation go through and they click ‘ok’ or ‘next’ without giving it much thought. Throughout these windows, it’s common to see some other software or plug-in being “pushed” to the user, as they leave the option that accepts these programs checked by default. When speeding through the interface, the user doesn’t realize that he accepted something unexpected.”
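A minimal sketch, assuming a hypothetical installer, of how misdirection through defaults works in practice. The option labels mirror the Skype/Bing example above, but the data structure and code are purely illustrative.

```typescript
// Hypothetical installer step: unrelated changes are pre-selected and buried
// among the options a skimming user clicks past on the way to "Next".

interface InstallOption {
  label: string;
  checkedByDefault: boolean;
}

const misdirectedStep: InstallOption[] = [
  { label: "Install the Skype update", checkedByDefault: true },
  { label: "Make Bing my default search engine", checkedByDefault: true },
  { label: "Make MSN my homepage", checkedByDefault: true },
];

// A more honest version leaves everything unrelated to the update opt-in.
const transparentStep: InstallOption[] = misdirectedStep.map((opt, i) =>
  i === 0 ? opt : { ...opt, checkedByDefault: false }
);

console.log(transparentStep);
```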

Price Comparison Prevention

In this pattern, Brignull explains, the retailer makes it hard for the user to compare the price of an item with another item, so they cannot make an informed decision. For example, LinkedIn constantly advertises its Premium plans and offers users a free trial, but never reveals the price up front.

Privacy Zuckering

Brignull named this dark pattern after Facebook’s CEO (Mark Zuckerberg) because it was first identified on Facebook. It’s about tricking users into publicly sharing more information about themselves than they really intended to.

Back in 2010, almost all companies were full of tricks designed to confuse the user, but things are a lot more transparent nowadays. For instance, Rutherford (2016) points to Zapier.com, which posts two different versions of its Terms of Service: one written in plain English so that anyone can understand it, and another filled with the legal jargon that most people would otherwise sign without reading.

Roach Motel

This type of dark pattern is pretty common and relatable to almost everyone. The design makes it very easy for the user to get into a certain situation, such as a subscription, but then makes it very hard for them to get out (Brignull, 2010).

In my personal experience, the Times Jobs India website is a perfect example. While looking for jobs, I made an account with them in 2013, and even now, in 2017, I haven’t figured out how to delete my account or unsubscribe from the daily emails.

Trick Questions

Brignull (2010) described this as,

“You respond to a question, which, when glanced upon quickly appears to ask one thing, but if read carefully, asks another thing entirely.”

He explains it with an example of his own, but keeping a more recent one in mind, I’ll mention Estevao’s (Medium, 2017) example from Sky in 2015. Sky’s checkout page has an opt-in/opt-out checkbox that isn’t checked by default, but the accompanying sentence says, “Sky may contact you about products and services you may like unless you click to opt-out.” She states that the construction of the sentence is purposefully confusing and tries to trick users into subscribing to newsletters.
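Here is a minimal, hypothetical TypeScript sketch of the inverted checkbox logic she describes; the function names are mine and do not reflect Sky’s actual implementation.

```typescript
// Conventional opt-in: an unchecked box means "do not contact me".
function wantsMarketingOptIn(boxChecked: boolean): boolean {
  return boxChecked;
}

// Trick-question opt-out: the box must be CHECKED to refuse contact, so
// skimming users who leave it blank are silently opted in.
function wantsMarketingOptOut(boxChecked: boolean): boolean {
  return !boxChecked;
}

const leftBlank = false; // what most users do when rushing through checkout
console.log(wantsMarketingOptIn(leftBlank));  // false -> no marketing email
console.log(wantsMarketingOptOut(leftBlank)); // true  -> marketing email anyway
```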

Big Picture Effects of Dark Patterns

Dark Patterns have been growing slowly and steadily. Initially they were obvious and merely frustrated the user, delivering short-term gains. With time, they have been refined and are now designed to trick even the more aware user.

Campbell-Dollaghan (Fast Co. Design, 2017) explains how Dark Patterns are now being wielded as a weapon against democracy. She describes fake news during the 2016 US elections as a major dark pattern. To explain this further, she gave the example of companies based in Macedonia fabricating pro-Trump news stories designed to boost clicks, engagement, and profits from valuable American users on Google and Facebook. BuzzFeed News documented this and interviewed one person behind a fake-news operation. He said,

“Yes, the info in the blogs is bad, false, and misleading but the rationale is that ‘if it gets the people to click on it and engage, then use it.’”

Campbell-Dollaghan (Fast Co. Design, 2017) also talks about Facebook’s News Feed algorithm. The algorithm is designed to serve news stories, both true and false, to users based on their observed opinions. That level of personalization reinforces confirmation bias and narrows an already self-curated perspective, restricting readers’ ability to organically broaden their knowledge.

She also criticized both Facebook and Google for lending legitimacy to lies through design, as both presented ABCNews.com and its fake counterpart, ABCNews.com.co, with the same interface elements.

Campbell-Dollaghan (Fast Co. Design, 2017) believes that we are in a new era now. The main goal of the old era was to design ways to engage users and make their lives easier; for the new era, she encourages us to “give users the agency to understand and challenge the products they’re being sold.”

Possible Solutions to Dark Patterns

There are no easy solutions or alternatives to dark patterns. Industry insiders like Bunker (2013) have suggested that designers should follow an ethical code of conduct in which privacy, honesty, and respect are the core elements.

Nir Eyal, author of the 2014 book Hooked, came up with a more concrete approach. In the book, he speaks about the power of persuasive design and explains how a good understanding of cognitive science can add value to the user’s experience. He describes the ‘Hook Model’, which gives designers the power to build habit-forming products. Recognizing that his model can be misused, he also addresses its morality: he defines manipulation as an experience crafted to change behavior, and offers designers the ‘Manipulation Matrix’. The matrix does not try to answer which businesses are moral or which will succeed; it simply seeks to help the innovator or designer answer, “Should I attempt to hook my user?”

Despite these possible solutions, there is still a gap: the designer’s own moral code. The implications of the products they create need to be considered.

Conclusion

As Campbell-Dollaghan (2017) stated, we are entering a new era where we shouldn’t be using dark patterns as weapons to influence our world.

I believe that as user experience designers we need to think beyond providing aesthetics and usability to our users. We help shape the lives of people everywhere, and our decisions can have significant impacts on the way we as a society behave. Giving in to dark patterns to meet short-term goals is not the solution. As Steve Fisher said at the Generate NY 2017 conference,

“Find a way to help the vulnerable around you. If you have privilege, use it for good.”


