By now, hopefully the drama is over: tax time, that annual exercise in understanding where the money goes. Every year, all Americans (and Canadians!) who bring in an income, and even some who don’t, file their taxes. To ease the burden and understand what they owe or are owed, many use software to file. One of the most popular programs is TurboTax, available on a number of platforms. It’s straightforward, does the math, and lets users file directly once the calculations are complete. Unfortunately, while the software may be clean, publisher Intuit’s marketing is not.
In April 2019, TurboTax became the object of scandal and outrage. The concern? TurboTax has been using “dark patterns” to hide the free version of its software, which the IRS requires so that low-income earners can file online at no cost. An investigation by ProPublica reveals all the ways TurboTax advertises its “free” service… and then does everything it can to make that service impossible to obtain.
Intuit’s tactic is not a new one; dark patterns have been in play for decades. Their use on the web, however, is alarming, particularly when personal data and privacy enter the mix.
What is a ‘dark pattern’?
First, it helps to understand what is meant by a ‘dark pattern’. The object of discussion is not a new fabric trend or a style of contemporary art. A dark pattern is a layout or interaction that tricks and takes advantage of the user. Think about marketing and product design: both are intended to communicate messages to the user and use their tools in particular ways. Ideally, we want the interfaces we create to encourage certain interactions, such as making a quick purchase or signing up for a business service. A dark pattern happens when the designer exploits human psychology or blind spots to manipulate user actions towards the desired outcome. Deception by design, dark patterns encourage choices and interactions that may not be in our best interest.
Some of the techniques that fall under “dark patterns” have existed for years, long before the connected world. A grocery store, for example, will usually stock ‘impulse buys’ right near the cashier: cheap, quick items that the customer will pick up without much thought, and that mean a little more money for the store with each sale. The popcorn seller at the movies might price a ‘small’ popcorn almost as high as the regular to push customers towards larger sizes. A hefty ‘sale’ sign might be placed next to a high-quality item, but read the fine print and you’ll see it applies to the less popular product beside it.
Dark patterns online
Online shopping, where every purchase runs through a designed web interaction, only increases the possibilities.
Consider the following: you’re preparing for a flight and book through Delta Air Lines. As you make your flight selection, the website draws you towards a number of seat upgrades, including an upgrade to first class. When you’re ready to move on, a red button urges you to go forward with the upgrade and pay your fare. Above the red “Pay for Selection Now” button is a light grey button with another option: you can skip the upgrade and move on.
Example courtesy of Ben Davis and Ed Campodonico (2017)
In this example, unless the user is actively scanning all the options, it’s far too easy to click forward and end up with an upgraded seat. Delta knows this: users are more likely to accept the upgrade and continue than to skip it. Some may not even be aware of the upgrade until it’s time to pay. Some users will see the price at checkout, empty their cart and start over, but others won’t. A number of users will get angry but ultimately do nothing rather than start over again. Meanwhile, Delta keeps collecting the upgrade fees.
These practices aren’t illegal; after all, you can still navigate your way to the ticket at the original price. Delta’s design, however, is intended to get the more expensive item into your cart if you’re not checking carefully.
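To see how little it takes to build this kind of misdirection, here is a minimal sketch in TypeScript. The function name, colours, and labels are invented for illustration; this is not Delta’s actual interface code, only an assumption about how such a screen could be wired up.

```ts
// Hypothetical sketch of the "misdirection" pattern: the paid path gets a
// bold, high-contrast button, while the free path is styled to fade into
// the page. Names and styles are invented for illustration.
function renderUpgradePrompt(container: HTMLElement): void {
  const payButton = document.createElement("button");
  payButton.textContent = "Pay for Selection Now";
  // Prominent: bright red, large, bold; it reads as "the" next step.
  Object.assign(payButton.style, {
    background: "#c8102e",
    color: "#ffffff",
    fontWeight: "bold",
    padding: "14px 36px",
  });

  const skipButton = document.createElement("button");
  skipButton.textContent = "Skip upgrade and continue";
  // De-emphasized: light grey, borderless, small; easy to miss on a quick scan.
  Object.assign(skipButton.style, {
    background: "none",
    border: "none",
    color: "#bbbbbb",
    fontSize: "0.8em",
  });

  container.append(payButton, skipButton);
}
```

Nothing in a sketch like this is hidden in the strict sense; the visual hierarchy alone steers the eye, and the wallet, towards the paid option.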
Examples of dark patterns online
Delta’s upgrade flow is but one of hundreds of dark patterns in use online. There are so many that @darkpatterns reports new ones daily via Twitter.
Examples of different types of dark patterns include:
- Sneaking items into the shopping basket
- ‘Privacy Zuckering’ (see below)
- Misdirection
- Hidden costs
- Sign-ups and situations that are easy to get into, impossible to leave
- Disguised advertising
- Bait and switch: you assume you’re signing up for or buying one item, when really you’re buying another
- Free trials that require payment information and automatically charge the moment the trial is over
- Guilt trips and shaming the user
Dark patterns that get us to give up more data
In the privacy world, there is growing awareness of companies that use dark patterns to get users to give up personal data, even data the service doesn’t actually need. DarkPatterns.org refers to this practice as ‘Privacy Zuckering’, aptly named after Facebook CEO Mark Zuckerberg.
Facebook certainly provides a number of examples, but the company is not alone in the practice. The Norwegian Consumer Council includes Google and Microsoft in its report, Deceived by Design, which looks at how companies addressed privacy just as the General Data Protection Regulation started to roll out.
Notable dark patterns the report observes include privacy options that are obscured or hidden away, while the choices offered are “take it or leave it”. Pop-ups are used to mislead users about their choices and, critically, about the actual effect privacy settings have on their experience. As the report observes:
Facebook and Google threaten users with loss of functionality or deletion
of the user account if the user does not choose the privacy intrusive option.
The report also criticizes Microsoft’s Windows 10, although to a lesser extent. While the Windows 10 privacy settings do not require excessive clicks to set up, the design of the system still encourages users to skip past the options.
When dark patterns use our personal information to manipulate us
We know dark patterns are used to collect personal information (PI), but what about dark patterns that use that information? Although test cases have yet to hit the market, the possibility is far too real. Imagine an interface that uses dark patterns, combined with personal data, to manipulate the outcome. What would happen if a design combined shaming or guilt-tripping with a database of PI? Instead of making general statements intended to make the user feel guilty, the design could tap into personal data to press its point. Picture a design that swept past general statements, such as the health food supplement that argues “Don’t disappoint your gym trainer”, and into direct, personal attacks: “Don’t disappoint Robert, your gym trainer.” Could we be forced into choices by what amounts to digital blackmail?
There’s an even more sinister possibility: using dark patterns to manipulate more than sales. In The Dark Pattern of Fake News, Kendyl Brooke Mounce highlights how personalization algorithms, such as those on Facebook, can be used to spread political propaganda. We already know that social media news feeds tailor what we see to our personal interests. Dark patterns that combine a personalized feed with misleading, if not outright deceptive, headlines amplify the worst of the process. If designers and engineers aren’t careful, sinister parties can game their systems into becoming dark patterns through both function and content: entire networks encouraging users to engage with propaganda and letting it infiltrate the subconscious. Opponents argue that privacy puts a block on system personalization, but that same block can also protect users from having their interests turned against them.
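To make the mechanism Mounce describes concrete, here is a deliberately simplified sketch of engagement-driven feed ranking. It is an assumption for illustration only, not Facebook’s actual algorithm; the types, field names, and scoring rule are invented.

```ts
// Hypothetical sketch: rank feed items purely by interest match and predicted
// engagement. Nothing checks whether a headline is accurate, so sensational
// or deceptive content that matches a user's interests is rewarded.
interface FeedItem {
  headline: string;
  topics: string[];
  predictedEngagement: number; // predicted clicks/shares, assumed to exist
}

function scoreItem(item: FeedItem, userInterests: Set<string>): number {
  // Boost items that match the user's interest profile...
  const interestMatch = item.topics.filter((t) => userInterests.has(t)).length;
  // ...then weight by predicted engagement. Accuracy never enters the formula.
  return (1 + interestMatch) * item.predictedEngagement;
}

function rankFeed(items: FeedItem[], userInterests: Set<string>): FeedItem[] {
  return [...items].sort(
    (a, b) => scoreItem(b, userInterests) - scoreItem(a, userInterests)
  );
}
```

Under a rule like this, a false but inflammatory headline aimed at a user’s known interests will reliably outrank a sober, accurate one, which is exactly the loophole propaganda exploits.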
Dark patterns and the law
As of May 2019, dark patterns remain legal in Canada, the United States, and the European Union. However, this may change: as dark patterns become more prevalent online, lawmakers are stepping up to stop the practice. For countries within the European Union, the GDPR makes it explicit that privacy must be the default setting and that all notices must be written in clear, plain language. The Consumer Rights Directive prevents retailers from sneaking items into shopping baskets or adding hidden costs to the final sale.
In the United States, further efforts are underway to outlaw deceptive dark-pattern sales. Senators Mark Warner (D-Virginia) and Deb Fischer (R-Nebraska) have introduced the Deceptive Experiences To Online Users Reduction (DETOUR) Act. Currently a draft bill, the legislation would make dark patterns illegal. It includes language against interfaces that have the "substantial effect of obscuring, subverting, or impairing user autonomy", along with outlawing other deceptive practices carried out without informed consent. How effective it will be depends on the final text and its enforcement, assuming the bill becomes law.
Designers, don't give in to the dark side
In the meantime, marketers and designers beware: while dark patterns work for immediate sales, they come at a price. They may look good upfront when they boost sales, but they lead to returns and destroy brand loyalty. As Paul Boag reports, a sale encouraged by a dark pattern is more prone to buyer's remorse, which means more calls to the service desk and more returns.
Worse, users who feel tricked or coerced into purchase decisions tend to resent the process, and disgruntled customers, as anyone on social media knows, tell people about it. The last thing any business needs is to appear as a 'bad example' on numerous design blogs. Using a dark pattern trades an immediate gain against keeping a long-time customer. As any business owner will tell you, that's never good design in the long run.