FTC Workshop: “Bringing Dark Patterns to Light” | Proskauer – Advertising Law

[co-author: Nicole Sockett]

The FTC recently hosted a workshop entitled “Bringing Dark Patterns to Light”, the recording of which can be found at the link below. The workshop focused on examining the impact of digital “dark patterns” on consumers and the market.

The term “dark patterns” refers to a range of potentially misleading website design tactics that can manipulate consumers’ behavior or limit their autonomy. Dark patterns can trick consumers into buying, sharing, or agreeing to things they did not intend to buy, share, or agree to. Dark patterns are also used to make canceling agreements or subscriptions confusing or difficult. Increasingly, companies also use dark patterns to manipulate consumers into divulging their personal information, which is then sold and used for targeted advertising and manipulation of future behavior.

Some examples of “dark patterns” are:

  • a website that automatically adds items to a user’s online shopping cart;
  • a “bait and switch,” in which a user attempting one action is redirected to an entirely unexpected outcome; for example, clicking the “X” in the upper right corner of a pop-up, which usually closes the window, instead initiates a download;
  • “disguised ads” designed to blend in with other content or navigation on the page and entice consumers to click on them.

Panel 1: “What are dark patterns and why are they used?”

An FTC workshop panel focused on defining dark patterns and identifying the drivers of dark patterns.

According to the panelists, dark patterns exhibit one or more of six attributes: they 1) are deceptive, 2) hide information, 3) are asymmetric, 4) are hidden, 5) result in disparate treatment, or 6) are restrictive.

  • “Deceptive” dark patterns induce false beliefs; for example, a countdown timer that bears no relation to what is being advertised and is intended to mislead consumers into believing that items or discounts are available only for a limited time.
  • “Information hiding” dark patterns delay or conceal important information from users; for example, hidden charges that are revealed only after the user has spent time selecting items.
  • “Asymmetric” dark patterns make options that disadvantage the advertiser harder to access than those that benefit it. A common example is a website that hides the option to decline cookies while making the “Accept” option easily accessible.
  • “Hidden” dark patterns manipulate users without their knowledge; for example, a pop-up that asks for a consumer’s email address and phone number in exchange for a discount when (unbeknownst to most consumers) only one of the two is actually required to receive the discount.
  • “Disparate treatment” dark patterns discriminate by treating one group of users differently from another; for example, when the only way to advance in a video game is by purchasing features rather than through skill, thereby treating users who have the means and willingness to pay differently from those who do not.
  • “Restrictive” dark patterns eliminate certain options from the user interface altogether; for example, requiring a consumer signing up for a service to agree both to the terms and conditions and to receiving marketing emails in order to proceed.

While each panelist framed the issue somewhat differently, they agreed that dark patterns ultimately work in two ways: (1) by manipulating the flow of information to users, or (2) by modifying the decision-making space available to users, in both cases ultimately influencing users’ choices.

A study by one of the panelists showed that how, and how often, a consumer encounters dark patterns depends on the consumer’s chosen user interface. According to the study, both the prevalence and the variety of dark patterns are higher in apps than on mobile or desktop websites. While this may be due to many factors, the panelist felt it underscores that reviewing a website on a single device or platform may not give a complete and accurate picture of an advertiser’s use of dark patterns.

Panel 5: “How can we best address dark patterns going forward?: Possible strategies for dealing with dark patterns”

In another panel, participants discussed current regulatory and enforcement issues relating to dark patterns, how to prioritize efforts to combat them, and various solutions to mitigate their harmful effects on consumers. Notable members of this panel included Laura Brett, director of the NAD, and Jennifer Rimm, assistant attorney general in the Office of Consumer Protection of the Attorney General’s Office for the District of Columbia.

The panel noted that there has recently been increased interest in regulating dark patterns. The FTC is actively working to combat these unfair practices by bringing enforcement actions under the FTC Act and statutes such as the Restore Online Shoppers’ Confidence Act, which requires sellers of subscription plans to disclose all material terms and conditions and to provide an easy way to cancel. Policymakers have also expressed willingness to ban dark patterns in upcoming and recently enacted privacy laws. The California Consumer Privacy Act, which went into effect in 2020, was the first privacy law to specifically prohibit dark patterns in opt-out processes. The law gives consumers the right to stop the sale of, access, and delete their information online, and prevents advertisers from employing processes designed to subvert a consumer’s decision to opt out.

Federal appeals courts have also sided with the FTC in cases alleging that dark patterns were used to mislead consumers. For example, in FTC v. LeadClick Media, the Second Circuit found that the defendant’s website, which sold colon cleansers and weight-loss products, violated the FTC Act by featuring fake customer testimonials and promotional content designed to look like independent journalism.

State attorneys general have also recently brought consumer protection cases involving dark patterns. One such case was filed against Instacart in the District of Columbia for adding a default 10% fee that consumers were led to believe was a tip for the driver, when in fact it was retained by Instacart. A similar lawsuit, currently pending in the District of Columbia, has been brought against Marriott Hotels for failing to include mandatory amenities and resort fees in the advertised price of its hotel rooms and for using a number of other allegedly misleading design strategies that obscured the fees.

The panel also noted the potential for independent self-regulation, including through the NAD, to deter dark patterns. One panelist observed that, as with other marketing practices, robust FTC enforcement promotes better self-regulation: strong FTC enforcement motivates advertisers to comply with NAD recommendations and to bring NAD challenges against competitors who use unfair practices to tilt the competitive landscape in their favor.

Advertisers should be aware that using dark patterns exposes them to risk from FTC enforcement, NAD challenges, and other legal liability. Beyond these legal risks, dark patterns, once uncovered, can quickly erode consumer trust and goodwill and lead to long-term losses. Advertisers should take steps to avoid practices that might be viewed as dark patterns, including:

  • making sure that options to cancel subscriptions are not hidden;
  • avoiding practices that would surprise a consumer, such as:
    • automatically adding items to a user’s shopping cart without the customer’s consent;
    • concealing unexpected charges;
    • disguising advertising as part of the regular content.

To the extent that online platforms use AI systems as part of the design process or to generate marketing collateral, advertisers should also proactively review the resulting material to ensure it contains no dark patterns.

