How websites use “dark patterns” to manipulate you
There’s no denying it – the internet is annoying. From endless website popups and ubiquitous cookies that track your every step online, to nagging “notifications” shaming us into signing up for useless mailing lists, using the web today means being pushed, pulled, and generally prodded into paying attention to something.
Why do companies so relentlessly harass us, at the risk of alienating the very consumers they are trying to reach? Simple: it works really, really well. As a new study shows, such tactics are highly effective at getting people to sign up for services they don’t actually want. And despite the conventional wisdom that websites can only annoy users so much before they scurry off in the other direction, many of these tactics don’t even make users noticeably angry, the study found.
In the article, published in the Journal of Legal Analysis, Jamie Luguri and Lior Strahilevitz of the University of Chicago conducted several experiments involving thousands of participants who broadly resembled the US population. Strahilevitz, a law professor, and Luguri, a lawyer and University of Chicago graduate, “bait-and-switched” participants into accepting a “privacy plan,” then pressed them to keep it by making them click through multiple manipulative interfaces.
In one experiment, the control group was simply allowed to click “Accept” or “Decline,” while the other group was presented with a screen offering “Accept and Continue (Recommended)” or “Other Options.” Clicking “Other Options” led to the following choices: “I don’t want to protect my information or my credit history” or “After reviewing my options, I want to protect my privacy and get privacy and credit history monitoring.”
“This is a classic ‘obstruction’ dark pattern – we make it a little more painful, we eat up some of their time when they want to say no,” Strahilevitz said in a recent presentation to the Federal Trade Commission. The second screen, which requires users to click on a statement they disagree with in order to decline a service, is a technique called “confirmshaming,” widely used by shopping websites and popular magazines.
The two screens more than doubled the number of people who signed up for the researchers’ fictional privacy plan. Only 11% of the control group agreed to the plan, while more than a quarter of the experimental group presented with the two screens signed up.
Notably, the people who were effectively tricked into signing up for the service didn’t seem any angrier than those given the straightforward options. A third group, who were run through even more aggressive pop-ups, signed up at an even higher rate – 37% to 42%. Participants in this group did express more anger, but only those who ultimately rejected the plan. Those who signed up weren’t noticeably upset, despite being forced to click through multiple manipulative screens.
The upshot: manipulative design is highly effective at getting people to buy products they don’t really want. Many of these buyers, far from being put off by being nudged and prodded in this way, don’t even realize they’ve made a purchase.
“We’re seeing dark patterns multiply because they’re extremely effective,” Strahilevitz said. “Using dark patterns seems to be beneficial for businesses … they can just use a few dark patterns and get away with it without alienating their consumers.”
Strahilevitz noted that he doesn’t believe his paper is the first to test manipulative design in this way – just the first to be publicly available. “I suspect that many social scientists who work in-house for e-commerce companies have been conducting studies exactly like what Jamie and I have done for years. We’re just the first to publish the results and share this data with the world,” Strahilevitz said.
Dark patterns everywhere
Over the past decade, manipulative website design has spread from a handful of businesses to become nearly ubiquitous across websites and consumer apps.
“Imagine running a business and pushing a button to get your customers to spend 21% more. It’s a breeze,” Harry Brignull, who coined the term “dark patterns” in 2010 and has since cataloged many of these marketing tricks online, said at the FTC workshop.
A decade ago, Brignull said, he thought that exposing these techniques would be enough to shame companies into abandoning them. But that didn’t happen, he said – because for many companies, the rewards of these designs are too good to give up.
“Consumers haven’t pushed back, companies haven’t regulated themselves, and it hasn’t been made easy enough for people to speak up, either,” he said.
Other research has shown that consumers are less cautious online than they are when shopping in a store or interacting with a salesperson, according to Katharina Kopp, deputy director of the Center for Digital Democracy, which advocates stricter regulation of e-commerce.
“Consumers are a little more suspicious of offline retailers and traditional stores,” she said. “They trust more online.”
Although many consumers are generally familiar with the type of sales tactics used in traditional advertising such as television commercials or billboards, digital spaces have changed the rules of the game, Kopp said.
“The entire interface between companies – companies trying to sell products and services – and the consumer is used by companies to optimize the interaction in their own interests. Traditionally, we have thought of advertising as the way to do this. But clearly they also use the entire design surface,” she said.
The FTC is now soliciting input from the public on how to regulate websites that manipulate users.
Costs “had no effect”
Elsewhere in the experiment, the researchers looked at which types of “dark patterns” were most effective and which didn’t seem to matter. In one version, participants were told that the “privacy plan” cost $38.99 – far more than the most expensive credit monitoring options available commercially.
Neoclassical economics predicts that the higher price should have made consumers more attentive and “less prone to making mistakes or making impulsive decisions,” the paper says. But that didn’t happen – consumers were just as likely to accept the more expensive plan when exposed to manipulative design.
“Tripling the cost of a service did not affect acceptance in this setting,” the authors concluded. Instead, the interface determined whether or not a user accepted.
Other tactics that led users to accept the “privacy plan” included hidden information, where the popup did not reveal the price of the service, saying only “Terms and Conditions apply,” with the price displayed only when the user clicked on an asterisk. About 30% of people accepted the plan under these conditions – though most of them likely didn’t realize they had signed up for something.
Another manipulation tactic, in which users were told that a number of other people had already signed up for the plan, also worked. People exposed to this pattern were 50% more likely to accept the service than people in the control group. The tactic is known as “social proof,” since it attempts to manipulate consumers through the social pressure of thousands of other unspecified consumers. As researchers led by Arunesh Mathur showed in 2019, these other consumers are sometimes fabricated.
At least one type of dark pattern – a countdown timer indicating that the offer was available for only 60 seconds – wasn’t effective, according to the study.