

Dark patterns, the tricks websites use to make you say yes, explained

How design can manipulate and coerce you into doing what websites want.

Sara Morrison is a senior Vox reporter who has covered data privacy, antitrust, and Big Tech’s power over us all for the site since 2019.

If you’re an Instagram user, you may have recently seen a pop-up asking if you want the service to “use your app and website activity” to “provide a better ads experience.” At the bottom there are two boxes: In a slightly darker shade of black than the pop-up background, you can choose to “Make ads less personalized.” A bright blue box urges users to “Make ads more personalized.”

This is an example of a dark pattern: design that manipulates or heavily influences users to make certain choices. Instagram uses terms like “activity” and “personalized” instead of “tracking” and “targeting,” so the user may not realize what they’re actually giving the app permission to do. Most people don’t want Instagram and its parent company, Facebook, to know everything they do and everywhere they go. But a “better experience” sounds like a good thing, so Instagram makes the option it wants users to select more prominent and attractive than the one it hopes they’ll avoid.

A “better ads experience” is subjective.
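
To see how little it takes, here is a minimal sketch, in TypeScript against the browser DOM, of a dialog built this way. Every name, color, and label below is a hypothetical stand-in, not Instagram’s actual code; the point is simply that the company’s preferred choice gets all the visual emphasis:

```typescript
// Hypothetical sketch -- not Instagram's real code. The only trick here is
// the asymmetry: the option the company prefers gets the visual emphasis.
function makeConsentDialog(): HTMLElement {
  const dialog = document.createElement("div");
  dialog.style.background = "#111"; // dark pop-up background

  // The choice the company wants: bright, high-contrast, inviting.
  const moreAds = document.createElement("button");
  moreAds.textContent = "Make ads more personalized";
  moreAds.style.background = "#0095f6";
  moreAds.style.color = "#fff";

  // The choice the company hopes you'll overlook: a box only slightly
  // darker than the background it sits on.
  const fewerAds = document.createElement("button");
  fewerAds.textContent = "Make ads less personalized";
  fewerAds.style.background = "#0d0d0d";
  fewerAds.style.color = "#888";

  dialog.append(fewerAds, moreAds);
  return dialog;
}

document.body.append(makeConsentDialog());
```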

There’s now a growing movement to ban dark patterns, and that may well lead to consumer protection laws and action as the Biden administration’s technology policies and initiatives take shape. California is currently tackling dark patterns in its evolving privacy laws, and Washington state’s latest privacy bill includes a provision about dark patterns.

“When you look at the way dark patterns are employed across digital engagement, generally, [the internet allows them to be] substantially exacerbated and made less visible to consumers,” Rebecca Kelly Slaughter, acting chair of the Federal Trade Commission (FTC), told Recode. “Understanding the effect of that is really important to us as we craft our strategy for the digital economy.”

Dark patterns have for years been tricking internet users into giving up their data, money, and time. But if some advocates and regulators get their way, they may not be able to do that for much longer.

Dark patterns, briefly explained

Although you may not have heard the term dark patterns before, you’ve surely seen countless examples of them — and experienced their effects:

  • The trial streaming service you signed up for, only to be automatically charged when the trial expired
  • The app interstitial ad you can’t figure out how to get out of because the “X” in the top right-hand corner is too small and faint to see ...
  • ... or the “X” is so small that you accidentally click on the ad itself and are redirected to the ad’s website
  • The drugstore account you have to create to get a vaccine appointment but can’t easily cancel
  • The marketing email that commands you to respond within the next five minutes or else, complete with a fake countdown timer (sketched in code after this list)
  • The big pop-up window urging you to sign up for a website’s newsletter with a big red “Sign Me Up” button, while the opt-out button is much smaller and passive-aggressively implies that anyone who clicks is a bad person who doesn’t care about saving money or staying informed
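
The countdown trick flagged above takes only a few lines to fake. Here’s a minimal TypeScript sketch, with a hypothetical element ID, of a timer that looks urgent but quietly resets instead of ever expiring:

```typescript
// Hypothetical sketch of a fake countdown timer. The "deadline" never
// actually arrives: when the clock hits zero, it silently starts over.
function startFakeCountdown(el: HTMLElement, minutes = 5): void {
  let remaining = minutes * 60; // seconds left on the (fake) clock
  setInterval(() => {
    remaining = remaining > 0 ? remaining - 1 : minutes * 60; // reset, not expire
    const m = Math.floor(remaining / 60);
    const s = String(remaining % 60).padStart(2, "0");
    el.textContent = `Hurry -- offer ends in ${m}:${s}!`;
  }, 1000);
}

// "offer-banner" is an invented ID for illustration.
startFakeCountdown(document.getElementById("offer-banner")!);
```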

But there are also effects that may not be as obvious. Websites use dark patterns to trick users into consenting to be tracked, or into having their data used in ways they didn’t expect and don’t want. Or sites will claim to offer ways to opt out of being tracked (usually because they’re legally required to), but use misleading language or make the opt-out process especially difficult to complete.

Ultimate Guitar’s Pro Access sale has been a few hours away from ending for the past several months, if not years. (Screenshot: Ultimate Guitar)

For example: cookie consent pop-ups. Websites will tell you that their sites use cookies and then ask you to “accept” them, usually by clicking on a big, prominent, brightly colored icon. But if you want to refuse the cookies, you’ll have to search for and click through to a menu of settings and disable them manually. Most people don’t have the time or desire to do this for every single website they visit, if they even understand what’s being requested in the first place. Companies whose revenue relies heavily on user data don’t want to make it easy for those users to refuse to provide it.

If you don’t want Forever 21 to put cookies on your browser, you’ll have to hit “opt-out” and turn off each category manually. (Screenshot: Forever 21)
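
As a rough illustration of that asymmetry, here’s a TypeScript sketch with hypothetical consent categories. Accepting is a single call; refusing means a separate toggle for every category, mirroring the one-click-versus-many-clicks imbalance in the UI:

```typescript
// Hypothetical category names; real banners vary. The asymmetry is the
// point: "accept" is one click, refusal is a separate switch per category.
interface ConsentState {
  analytics: boolean;
  advertising: boolean;
  personalization: boolean;
  socialMedia: boolean;
}

// The big, brightly colored button: one click, everything enabled.
function acceptAll(): ConsentState {
  return {
    analytics: true,
    advertising: true,
    personalization: true,
    socialMedia: true,
  };
}

// The buried settings menu: the user has to find it, open it, and flip
// each switch individually -- one interaction per category.
function optOutOfEverything(state: ConsentState): ConsentState {
  const updated = { ...state };
  updated.analytics = false;       // separate toggle in the UI
  updated.advertising = false;     // separate toggle in the UI
  updated.personalization = false; // separate toggle in the UI
  updated.socialMedia = false;     // separate toggle in the UI
  return updated;
}
```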

Harry Brignull coined the term “dark patterns” in 2010 and has been keeping tabs on them ever since on his website (he also wrote about them for The Verge back in 2013). Dark patterns existed in the physical world long before the internet came along: ’90s kids will remember the mail-order music club Columbia House’s amazing deal to buy 12 CDs for just one penny (plus shipping and handling), which then automatically opted them in to a CD-a-month club that was almost impossible to cancel. But the internet has made dark patterns so much more ubiquitous and powerful. Websites can refine their methods using the very specific feedback their visitors provide, optimizing their manipulation at a scale that the physical world could never in its wildest dreams achieve.
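
That refinement loop is ordinary A/B testing pointed at a manipulative goal. A toy TypeScript sketch (invented names, not any real service’s experiment code) shows the mechanics: assign each visitor a design variant, count conversions, and keep whichever design converts more, regardless of how it got there:

```typescript
// Toy illustration of A/B-testing a dark pattern. Invented names only.
type Variant = "subtle-decline" | "obvious-decline";

const conversions: Record<Variant, number> = {
  "subtle-decline": 0,
  "obvious-decline": 0,
};

// Split visitors evenly between the two designs.
function assignVariant(userId: number): Variant {
  return userId % 2 === 0 ? "subtle-decline" : "obvious-decline";
}

// Count each sign-up against the design that produced it.
function recordConversion(variant: Variant): void {
  conversions[variant] += 1;
}

// Ship whichever design extracted more sign-ups -- whether it did so
// by informing users or by hiding the decline option from them.
function winningVariant(): Variant {
  return conversions["subtle-decline"] >= conversions["obvious-decline"]
    ? "subtle-decline"
    : "obvious-decline";
}
```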

“I think the internet has made it easier to industrialize the way in which we persuade and, in turn, manipulate and deceive each other,” Brignull told Recode.

Some of the more obvious and scammy dark patterns — sneaking extra items into shopping baskets or tacking on hidden fees — have been made illegal in some places, and the FTC has gone after some of the most egregious offenders. But the law isn’t so cut and dried when it comes to privacy, data, and consent.

Some websites will use guilt or shaming tactics to convince you to hand over your personal information. (Screenshot: Florsheim)

It’s hard to know what’s an actionable deceptive act or practice when there’s no privacy law in the first place. And it’s hard for consumers to know what they’re giving away unintentionally or how it might be used against them when it all happens behind the scenes.

“It’s a bit like invisible health effects from breathing in fumes or getting a radiation dose: At the time, you might not realize it, but it has a hidden impact on you,” Brignull said. “With privacy, it’s quite difficult to think through and understand what the long-term implications are for you. You’re constantly leaking information about yourself to data brokers, and you don’t really know how they’re using it to market to you.”

Because of this, Brignull and a growing number of advocates, regulators, and lawmakers feel that legislation is necessary to stop these dark patterns so consumers can use the internet without constantly being manipulated into spending money, signing up for services they don’t need, or giving up their data.

“Regulation works,” Brignull said. “It can really turn the internet into somewhere that’s nice to be instead of like a complete Wild West environment. And we need it.”

How laws and regulations can stop the worst dark patterns

If you live in California, you already have it. One of state Attorney General Xavier Becerra’s last acts before leaving office to run the Department of Health and Human Services was to add regulations around dark patterns to the state’s Consumer Privacy Act (CCPA). These ban dark patterns designed to make it difficult for consumers to exercise some of the rights the law provides, like opting out of the sale of their data. Banned dark patterns include forcing users to click through multiple screens or scroll through lengthy privacy policies, urging them not to opt out, and using confusing language.

Washington state’s third attempt to pass a privacy law, currently making its way through the legislature, says that dark patterns may not be used to obtain user consent to sell or share their data — a provision that was echoed in California’s recently passed Privacy Rights Act (CPRA), an expansion of its CCPA.

Federal lawmakers are also paying attention to dark patterns. At a recent House Energy and Commerce Committee hearing about social media and disinformation, Rep. Lisa Blunt Rochester (D-DE) asked Big Tech CEOs Mark Zuckerberg, Sundar Pichai, and Jack Dorsey if they would oppose legislation that banned dark patterns that tricked users into giving away their data. This data, she said, was often used in algorithms that targeted people who are especially susceptible to misinformation.

“Our children ... seniors, veterans, people of color, even our very democracy is at stake here,” Blunt Rochester said. “We must act. And we will assure you, we will act.”

Late last year, the Congress member introduced the DETOUR (Deceptive Experiences To Online Users Reduction) Act, the House version of the bill of the same name that Sens. Deb Fischer (R-NE) and Mark Warner (D-VA) introduced in 2019.

“I introduced the DETOUR Act to address common tactics tech companies use to get as much personal data as possible,” Blunt Rochester told Recode. “They are intentionally deceptive user interfaces that trick people into handing over their data.”

The bills target online services with more than 100 million monthly active users — Twitter, Facebook, and YouTube, for example — and forbid them from designing user interfaces that manipulate users into consenting to give their personal data. The platforms also wouldn’t be able to run design change experiments on users without their consent.

Blunt Rochester and Warner told Recode that they plan to reintroduce the DETOUR Act this session.

“I’m committed to working with my colleagues in Congress to ban the use of these intentionally manipulative practices, designed to extract personal information from users,” Blunt Rochester said.

Sen. Fischer did not respond to a request for comment, but she rolled the DETOUR Act into the SAFE DATA Act, the Senate Commerce Committee Republicans’ version of a federal privacy law that they may reintroduce this session.

Finally, the FTC, which would likely be in charge of regulating any legislation about dark patterns, is also taking a hard look at the practice.

“This is a behavior we take seriously,” Slaughter, of the FTC, said.

The FTC plans to hold a workshop on the subject at the end of April, where it will discuss how dark patterns manipulate consumers, which groups may be especially vulnerable to or harmed by this manipulation, what rules are in place to stop them, and whether additional rules are needed and, if so, what they should be.

“I think about this issue much more as one of data abuses than just data privacy,” Slaughter said. “The first step of collecting your data may not be the immediate harm. But how is that data then aggregated, used, transferred to manipulate your purchases, target advertising, create this surveillance economy that has a lot of downstream harms for users in a way that is less visible to the user or the public?”

The FTC’s authority here comes from its mandate to enforce laws against deceptive or unfair trade practices. The agency has gone after violators who use dark patterns where it can. Tricking people into signing up for and paying for subscriptions or services and intentionally making it difficult to cancel them is an obvious and actionable example. Making people think they’re buying something for a set price without making any additional charges clear is another one.

One of the few federal privacy laws we do have — the Children’s Online Privacy Protection Act — gives the FTC authority over many privacy violations against children under 13, and many dark patterns fall within its scope. But no such law exists for adults, so confusingly worded privacy policies and opt-outs that lead to data abuses may need legislation explicitly forbidding them before the FTC is empowered to act.

That legislation won’t be easy to write, either. The line between deliberate deception and legally urging a user to make a choice that materially benefits a company can be blurry.

“Part of the challenge with regulating dark patterns are the gray areas: the instances where users of a technology are being constrained in such a way that they can’t exercise complete autonomy, but that they may not be experiencing full manipulation, or perhaps they are being coerced but with a light touch,” Jennifer King, privacy and data policy fellow at the Stanford University Institute for Human-Centered Artificial Intelligence, told Recode.

In the absence of a federal privacy law, Slaughter says she hopes to use Section 18 of the FTC Act to exercise the commission’s rulemaking authority.

“The FTC should have clearer, more direct Administrative Procedure Act rulemaking authority to address these kinds of things,” Slaughter said. “But in the meantime, I’m very excited to use all the tools that we have, including our Section 18 authority, to tackle it. Is it easy? No. Is it fast? No. Is it worth the effort? Yes. Because if we’re just waiting around for Congress to act, we could be waiting for a long time.”

Open Sourced is made possible by Omidyar Network. All Open Sourced content is editorially independent and produced by our journalists.

