Have you ever heard of the term ‘dark pattern’? If you haven’t, you’ve most definitely experienced many in your time online.
A dark pattern is a UX design that deceives the user, exploiting the way people habitually use websites and apps to ‘force’ a specific outcome.
The term was coined by Harry Brignull, a UX specialist, in 2010, after he noticed that a large number of brands were using deceitful tactics to engineer the results they wanted.
“When you use websites and apps, you don’t read every word on every page—you skim read and make assumptions. If a company wants to trick you into doing something, they can take advantage of this by making a page look like it is saying one thing when it is in fact saying another. You can defend yourself by learning about dark patterns…”
A UX designer should be championing the user, striving to create an interface that makes navigation simple. The issue arises when brands (understandably) want to achieve the best results, whether the goal is more newsletter sign-ups or monthly subscriptions. Dark patterns let brands hit these goals more quickly, with minimal time spent strategising a genuine solution that would appeal to their users.
However, dark patterns aren’t the quick win that they may seem…
Types of dark patterns
There are many types of dark patterns that Brignull identified and documented, but there are always new tactics popping up across the internet. Here are a few examples of common dark patterns.
Comparison prevention
When a brand offers three subscription tiers, each priced according to the features it includes, the customer should be able to fairly compare the prices and offerings to determine which best suits their wants and needs. If the brand displays this information in a convoluted way, comparison becomes much harder.
In instances of comparison prevention, the brand has likely displayed the information in such a way that the user is influenced towards a higher-priced package, even if they don’t require the features.
An example of this can be seen here:
A user trying to purchase a Google Workspace paid plan found that, although there are cheaper options than the Business Standard plan shown in the image, they were unable to select a lower tier. It turned out that the only way around this was to sign up for the free trial of Business Standard and then manually downgrade to a cheaper plan.
This is incredibly inconvenient for the user, but brands are banking on the fact that many people will either a) not figure out how to downgrade and settle for the plan that’s been forced upon them, or b) forget to downgrade and be charged the higher price when the trial ends.
Confirmshaming
This is a very prevalent dark pattern, often seen when a user is trying to cancel a subscription or abandon a purchase midway through. Confirmshaming refers to the user being emotionally manipulated into stopping an action.
In the below example, the website has detected that the user is running an ad blocker. Of course, the site wants its ad revenue, so would prefer the ad blocker be turned off. When prompting the user to disable it, the option to decline is labelled ‘I am a bad person’, shaming the user for going against the site’s wishes.
Disguised ads
This is another very common dark pattern. A disguised ad appears as though it is another interface element of the site you’re on, but clicking it will actually redirect you to the advertiser’s landing page.
Disguised ads can be particularly crafty when a user is trying to download something from a site. In the above example, the red box in the top right corner indicates the actual download link for the font, but since the ad displays a download button in much bigger and clearer text directly beside the font itself, it’s easy to think that it’s the right thing to click.
Fake scarcity
Have you ever been browsing an online retailer and seen a pop-up on an item you’re viewing letting you know that it’s the last one? Or perhaps that X people currently have it in their basket? Creating a false notion of limited supply or popularity can prompt a user to quickly make a purchase that they might otherwise have left for another day, or passed up entirely.
Of course, the product may genuinely be low in stock, or incredibly popular, but the majority of these messages aren’t actually linked to any data; they’re simply written into the site’s code.
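To show quite how little machinery this takes, here’s a minimal sketch in TypeScript (the function name and figures are invented for illustration) of a scarcity banner that never consults a stock system:

```ts
// A hypothetical fake-scarcity banner: nothing here queries real stock data.
// The "remaining" and "viewing" figures are invented fresh on each page load.
function showScarcityBanner(container: HTMLElement): void {
  const fakeRemaining = Math.floor(Math.random() * 3) + 1; // always 1-3 "left"
  const fakeViewers = Math.floor(Math.random() * 20) + 5;  // always 5-24 "viewing"

  const banner = document.createElement("p");
  banner.textContent =
    `Only ${fakeRemaining} left in stock! ${fakeViewers} people are viewing this right now.`;
  container.prepend(banner);
}
```

Every visitor sees an item that is perpetually almost sold out, no matter how much stock actually exists.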
Fake urgency
Similar to fake scarcity, fake urgency places pressure on the user to complete an action because they appear to be under a time constraint. This can be seen when a site places a countdown, forcing users to quickly make a decision or purchase before it reaches zero. Again, the countdown could be legitimate, such as for the end of a discount sale, but it’s impossible to tell.
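One common implementation is the ‘evergreen’ countdown, where the deadline is simply set relative to whenever the page loads. The sketch below is hypothetical TypeScript, not code from any particular site:

```ts
// A hypothetical "evergreen" countdown: the deadline isn't tied to any real
// sale end date; it's set to 15 minutes from whenever the page loads.
function startFakeCountdown(el: HTMLElement): void {
  const deadline = Date.now() + 15 * 60 * 1000; // resets on every visit

  const tick = () => {
    const remaining = Math.max(0, deadline - Date.now());
    const mins = Math.floor(remaining / 60_000);
    const secs = Math.floor((remaining % 60_000) / 1000);
    el.textContent = `Offer ends in ${mins}:${String(secs).padStart(2, "0")}`;
    if (remaining > 0) setTimeout(tick, 1000);
  };
  tick();
}
```

Because the deadline is computed client-side, refreshing the page quietly restarts the clock.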
Forced action
This is when a user is trying to complete one action but is required to do something less desirable to achieve it. The user may not even notice the second action happening (known as ‘sneaking’), or they may have been tricked into believing it’s more desirable than it really is through confusing and misleading language (known as ‘trick wording’).
LinkedIn was found to be using forced action back in 2015. When a new user registered for the site, they were met with the below screen, prompting them to input their email address. An email address is required when signing up to most sites, so not many users flagged this as anything strange. Little did they know that LinkedIn was actually gaining access to their email inbox and extracting all of the email addresses it could find.
If you’re a bit more eagle-eyed than most, you may have spotted the grey text below the ‘continue’ button. This text does inform the user of what LinkedIn plans to do with their information, but light grey text on a light blue background is hardly easy to read, so it’s very easy to miss. Users were also given an option to skip this step, seen in the lower right, but as the ‘continue’ button is far more prominent on the page, it’s easy to miss this too.
Hard to cancel
In the UK, we have regulations in place that mean any brand offering a subscription, or another form of sign-up like a newsletter, must give users an easy way to unsubscribe whenever they wish. Unfortunately, this isn’t always the case. Brands want you to sign up for their mailing list or subscription service, but they don’t want you to leave!
The Pudding conducted some research into this particular dark pattern, which is a very interesting read and is also displayed in an appealing way, so definitely check it out (but bear in mind that this research was conducted in the USA where they have different regulations to us over here).
They tested 16 online subscriptions to see how easy it was to opt out. None of the 16 sent an email reminder that the free trial was due to end, even though some of them claimed they would, and multiple outlets made it incredibly difficult to unsubscribe. Some required a phone call, and it took a fair amount of time to get connected to their team. Others had buried their ‘unsubscribe’ button deep in the site, taking some real navigation to find. Definitely not user-friendly.
Hidden costs
Hidden costs are exactly what they sound like. A product or service is advertised at a lower price, but once the user reaches the checkout, unexpected and unavoidable fees and charges are added on. Sometimes this will be listed as a ‘service fee’, or a ‘handling fee’.
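In code terms this is trivial: the fees are known all along, they just aren’t surfaced until the final step. A hypothetical sketch with made-up prices:

```ts
// Hypothetical checkout maths: only the advertised price appears on the
// product page; the fees surface at the very last step.
const advertisedPrice = 25.0;   // what the product page shows
const serviceFee = 4.5;         // unmentioned until checkout
const handlingFee = 2.99;       // likewise

const total = advertisedPrice + serviceFee + handlingFee;
console.log(`Total due: $${total.toFixed(2)}`); // "Total due: $32.49"
```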
Hidden subscription
This is when a user unknowingly signs up for a recurring subscription or payment plan without clear disclosure or explicit consent beforehand. See if you can spot the disclosure of a rolling payment plan in the below image:
Looks like a one-time payment of $99.99, right?
Wrong.
Hidden in white text on an almost-white background is the disclosure that the plan is $99.99 a month. What’s the bet that unsubscribing and getting a refund for this subscription is just as difficult?
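For illustration, here’s a hypothetical sketch of the technique: the disclosure genuinely is in the page, just styled in a colour that all but vanishes into the background (the colour values and function name are invented):

```ts
// A hypothetical checkout snippet: the headline price is bold and obvious,
// while the recurring-billing disclosure nearly matches the background colour,
// making it technically present but practically unreadable.
function renderPrice(container: HTMLElement): void {
  container.style.background = "#fdfdfd"; // almost-white page background

  const price = document.createElement("strong");
  price.textContent = "$99.99";

  const disclosure = document.createElement("small");
  disclosure.textContent = " Billed monthly until cancelled.";
  disclosure.style.color = "#fbfbfb"; // nearly invisible against #fdfdfd

  container.append(price, disclosure);
}
```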
Nagging
Nagging refers to when a user is trying to complete one action but is constantly interrupted by other requests. This is seen all too commonly within mobile apps. Of course, the app wants you to leave a rating and boost its reviews on the app store, but these pop-ups appear at random points while the user is simply trying to enjoy the app.
Preselection
Similar to the Google Workspace example at the start of this blog post, preselection refers to when a user is given a default option that has already been selected for them, designed to influence their decision-making.
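The classic implementation is the pre-ticked consent checkbox, which puts the burden on the user to notice the default and reverse it. A hypothetical sketch:

```ts
// A hypothetical pre-ticked marketing opt-in: consent is the default, and
// the user must spot the box and untick it to avoid the mailing list.
function renderMarketingOptIn(form: HTMLFormElement): void {
  const checkbox = document.createElement("input");
  checkbox.type = "checkbox";
  checkbox.name = "marketing-consent";
  checkbox.checked = true; // the preselection itself

  const label = document.createElement("label");
  label.append(checkbox, " Send me offers and partner promotions");
  form.append(label);
}
```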
Visual interference
Relying on human nature, visual interference is a method of hiding, obscuring or disguising information to guide users towards the brand’s most favourable outcome.
There are many cues that our brains automatically relate to certain things. Red is often ‘stop’ or ‘no’ and green is ‘go’ or ‘yes’. Left is often ‘back’ and right is ‘forward’. Crafty UX designers can play on these assumptions to guide the user’s decision, hoping that the majority of users aren’t paying full attention to the content on the screen, working off instincts instead.
London Zoo was found to be using this technique to encourage donations when users purchase tickets online. In the below image, the green arrow pointing right is what the majority of users would assume is the next step, while the white arrow pointing left is assumed to take you back a step in the process. However, the green right arrow tacks on a donation, whereas the white left arrow proceeds without one.
Sticking with the idea of direction symbolising a specific action, do you think this cookie toggle is on or off?
For most people, the toggle sitting left would indicate ‘off’. However, it’s actually ‘on’ here, designed to lead users into thinking they’ve disabled analytics and performance cookies when they actually haven’t.
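Under the hood, a toggle like this simply decouples what it shows from what it stores. A hypothetical sketch (the class names are invented):

```ts
// A hypothetical cookie toggle whose visual state is decoupled from the real
// setting: the knob is styled to sit on the left (which most users read as
// "off") while the underlying analytics flag remains on.
function renderCookieToggle(container: HTMLElement): void {
  const toggle = document.createElement("button");
  toggle.setAttribute("role", "switch");
  toggle.setAttribute("aria-checked", "true");   // the real state: tracking on
  toggle.className = "toggle toggle--knob-left"; // styled to *look* off
  toggle.textContent = "Analytics & performance cookies";
  container.append(toggle);
}
```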
Here’s another example of visual interference with the design of buttons and text:
If you’re trying to progress through a cancellation process, surely the big green button is the way forward? It’s not: the real route is the thin, light text just below the button.
The legality of dark patterns
The UK already has some regulations in place to protect customers and their information:
- Brands must check whether customers are happy to be contacted by fax, phone, post or email, and give them a chance to object. Brands must also be able to prove that consent was sought and that the option to object was given.
- When collecting customer details, brands must get their permission to send other offers or promotions.
- Brands must ask the customers’ permission if they want to share their information with another organisation.
- Customers have the right to stop their information from being used for direct marketing and brands must make it easy to opt out.
- Brands must tell visitors how their website uses cookies and ask if they would like to accept them. This information must be easy to understand.
Although these are all important regulations to have in place, there are plenty of loopholes through which brands can still get away with dark patterns. However, the Digital Markets, Competition and Consumers Bill proposes a suite of changes to unfair commercial practices and could be used to take action against dark patterns.
The UK Competition and Markets Authority and Information Commissioner’s Office have joined forces to provide clarity to the ways that certain online design practices, referred to as ‘online choice architecture’ (OCA), can influence consumers’ decisions.
Key examples of potentially harmful OCA practices that they have highlighted include:
- Pop-ups that make it harder to refuse cookies than accept them
- ‘Biased framing’ of the benefits of sharing personal data – using ‘leading language’ to emphasise the benefits while minimising or ignoring potential negative impacts and risks
- Intrusive default settings
- Bundled consent
- Use of language which pressures or shames a user into sharing personal data
These changes are still in their early stages, but it’ll be interesting to see whether a clampdown on the most deceitful dark patterns does come into play in the coming years.
Why can dark patterns be detrimental to brands?
As a business, it can be easy to view the benefits of dark patterns without really considering the consumer. Anything that can boost visibility and revenue is often good in a brand’s books, but sneaky techniques like dark patterns can do much more harm than good.
Loss of trust
Using deceptive and manipulative techniques to influence your customers’ behaviour can leave them feeling misled or tricked, and rightly so! This breaks down the trust they have in your brand, which can be very challenging to rebuild.
Reduced loyalty
Brands must provide positive and honest experiences to build customer loyalty. Dark patterns are manipulative, which runs directly counter to the core principles of transparency and fairness, and customers are much less likely to remain loyal to a brand that engages in deceptive behaviour.
Reputation and brand image
People talk, and word of mouth is a powerful force. If customers regularly experience dark patterns on your site and decide to share their negative experiences with others, this publicity can harm your reputation. Kiss goodbye to the financial investments (and time!) you’ve put into building a positive and consistent brand image, as using dark patterns can portray your brand as an untrustworthy outlet that is only focused on profits at the expense of your customers.
Legalities
Some dark patterns may already violate consumer protection laws and regulations, and with potential changes on the horizon, even more deceptive methods are likely to follow suit. This could result in legal consequences, fines and other penalties.
Dark patterns may seem like a quick win when it comes to hitting targets and quotas, but in the long run, brands that prioritise ethical and user-friendly practices are more likely to build lasting relationships with their customers.
In the world of marketing, putting customers and their experience first will always come out on top.