If you’ve ever accidentally subscribed to an email list, installed software you didn’t want, or been tricked into needlessly sharing personal data, you’ve already experienced a dark pattern: a maliciously designed user interface.
One of the most notable examples from 2018 was the financially struggling MoviePass app, which, in August, sent a message to users trying to trick them into un-cancelling their plans. That’s a dark pattern. Confusing checkboxes with double negatives? Dark pattern. Hidden conditions in terms of service that no one reads? Dark pattern. Even emotional pleas to keep you subscribed to email lists qualify as dark patterns, and because they are often quite profitable, new UI (User Interface) tricks are popping up all the time.
Why “dark patterns”?
The ominous-sounding name was coined by UX designer Harry Brignull as a way of getting people to notice and remember the broader category of deceptive interfaces. The general idea is that product design can either help users navigate through choices in a clear, organic way, or it can take advantage of the fact that users tend to behave in predictable patterns to nudge them into doing things they don’t want to do.
We skim web pages instead of reading them, we click the green button for yes and the red button for no, we’d rather pay a hidden fee than go through checkout again, and in general we expect things to follow certain norms and standards. When we apply these expectations to a well-designed interface, it helps us navigate through things quickly, but getting too comfortable can leave us vulnerable to dark patterns.
Types of dark patterns
The “official” site (maintained by Brignull) lists twelve different dark pattern types. Here are some of the more common ones.
Bait and Switch
When you are led to believe that something you do will have a different result than it actually does. For example, for part of 2016, if you clicked the “X” button on a Microsoft Windows 10 upgrade reminder, it didn’t exit but actually started installing the upgrade.
Hidden Costs
This is what happens when you’re not shown the full cost of a product until you’ve put in enough work that paying a “convenience fee” or an extra shipping charge at the end seems easier than starting over.
Confirmshaming
This happens when a company tries to “shame” you into something. For example, unsubscribing from an email list might take you to a page that says, “Aw, don’t you like us?” Another version might be an ad for a workout program that offers you the option to either “Enroll” or “Stay out of shape.” There’s a whole blog dedicated to these.
Disguised Ads
Ever clicked a download button that wasn’t a download button? A play button that wasn’t a play button? Disguised ads and download links are everywhere, especially on sites where things are being offered for free.
Forced Continuity
Free trials might convince some people to stay for the quality of the product, but more often, once they have your credit card, they’re just hoping you’ll forget to cancel.
Roach Motel
This situation is easy to get into but hard to get out of. You can sign up for our service online, but you’ll have to hand-deliver your notarized cancellation letter to our headquarters in Nebraska. (Hotel California, anyone? “You can check out any time you like, but you can never leave.”)
Friend Spam
Hey, you just joined our site! How about you give us access to your email contacts so you can see if you have any friends here, and maybe we’ll email them with other stuff, too. LinkedIn is the most well-known real-world example of this, but the thirteen-million-dollar lawsuit that resulted may have cut into their profits a bit.
Misdirection
This one almost isn’t its own dark pattern, since it’s a part of so many others. It plays on your expectations about the way things should work, like using a red button to continue and a green button to take you back, or combining with another dark pattern, like “Trick Questions”: “Do you want to not opt out of our email program?”
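Double negatives like that one are effective precisely because the logic inverts twice. Here’s a minimal sketch in Python (the function name is made up for illustration) of what checking that box actually means:

```python
# Hypothetical sketch of the "trick question" checkbox:
# "Do you want to not opt out of our email program?"
# Checking the box answers "yes, I want to not opt out,"
# which means you KEEP getting the emails.

def wants_email(box_checked: bool) -> bool:
    opted_out = not box_checked  # checked means "not opting out"
    return not opted_out         # not opted out means still subscribed

# A skimming user checks the box hoping to stop the emails,
# but has actually confirmed the subscription:
print(wants_email(True))   # True: still subscribed
print(wants_email(False))  # False: unsubscribed
```

A clearly labeled checkbox (“Email me product updates”) would need no such decoding, which is exactly why the trick version omits it.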
Privacy Zuckering
Yes, it’s named after Mark Zuckerberg, and yes, it’s just as hard to avoid as Facebook. While some attempts to grab your data are obvious (take the “Which car manufacturer should advertise to you” quiz!), a lot of it happens because you didn’t read the terms and conditions (which are kind of their own form of dark pattern) and gave permission for your data to be distributed. And you really can’t get it back.
How can I avoid them?
Currently, the human brain contains the only software that can reliably detect and block dark patterns, and the best way to use it is to look at plenty of examples so you know what to watch for. The “#darkpatterns” hashtag on Twitter is the most popular way to report them, but if you’re more of a Reddit person, there is a subreddit that showcases dark patterns and other design sins. New varieties are always emerging, but being familiar with the basics will help you understand how they work and how to avoid them.
The dark side has cookies
Dark patterns are nothing new. Mail-in rebates, deceptive font sizes on posters, poorly labeled prices on store shelves, add-ons when you’re buying a new car – the art of trying to trick users into making poor choices has a pretty impressive history. With tracking tools like browser fingerprints, cookies, big data analysis, and live A/B testing, it has only gotten easier for designers to figure out how to target consumers effectively. Truly deceptive practices can actually be against the law (companies do get sued), but if you don’t want to not opt out of whatever dark patterns are trying to make you reverse undo, you can do your part in improving user interfaces by a) not falling for dark patterns, and b) sharing them with the world.