Dark Patterns in Data Privacy

The term “dark patterns” was coined by Harry Brignull in 2010. Brignull, who has a PhD in Cognitive Science, defines dark patterns as “tricks used in websites and apps that make you do things that you didn’t mean to, like buying or signing up for something” (darkpatterns.org).

While originally focused on digital advertising practices, “Dark patterns gained notoriety in privacy policy circles in June 2018 when the term was featured prominently in a European report called ‘Deceived by Design’” (Savage, 2020). The report was issued by the Consumer Council of Norway (Forbrukerrådet) after it analyzed the privacy settings in Facebook, Windows 10, and Google services to demonstrate how dark patterns are used to manipulate users.

The findings include privacy-intrusive default settings, misleading wording, giving users an illusion of control, hiding away privacy-friendly choices, take-it-or-leave-it choices, and choice architectures in which choosing the privacy-friendly option requires more effort from users.

Deceived by Design (page 3)

Dark patterns became further highlighted by the varied implementations of consumer consent and preference management platforms – from the privacy-centric to the “deceptive” – in response to the GDPR and then the CCPA. So much so that the CPRA amended the CCPA definition of consent to include “…agreement obtained through use of dark patterns does not constitute consent” (CPRA Section 1798.140 (h)).[1]

While the regulatory intent seems clear, distinguishing between “dark patterns” – with the intentional deception and consumer-harming obfuscation the term denotes – and merely “bad” UX, or long-standing fair and reasonable marketing and sales practices, may not be as clear. How exactly does one define “dark” or “pattern” such that a regulatory authority can opine reasonably and consistently? More importantly, how can businesses gain clarity to better manage their consumer interactions in this regard?

Deception or Persuasion

Brignull offers 12 categories of dark patterns, ranging from the egregious to the relatively benign. The “Roach Motel” – a situation that is very easy to get into (like a premium subscription) but all but impossible to get out of – is an egregious example. As newly appointed FTC Commissioner Rohit Chopra wrote, it is “the online successor to decades of dirty dealing in direct mail marketing.”[2] Dark patterns like the “Roach Motel” indeed predate the digital age. Just ask anyone who ever had a Columbia House record subscription.

“Bait and Switch” is another example of egregious dark patterning that predates the digital era. Although “like all things digital, dark patterns have no geographic or physical limitations, and consequently, can deceive people on a far greater scale” (Reicin, 2021).

But “Confirmshaming” – defined as using wording that seeks to shame the user into compliance – seems definitionally problematic. The example provided on Brignull’s darkpatterns.org is a “Join Amazon Prime” page where the opt-out is worded “No thanks, I don’t want Unlimited One-Day Delivery.” Should the fundamental sales technique of reiterating the value of an offer now be considered nefarious? Is this a “trick” (read: deception or obfuscation), or rather a simple and direct attempt to persuade the prospective customer to buy?

Creating a sense of urgency or scarcity is also claimed to be a dark pattern technique. Clocks counting down the seconds left to claim a discounted price, or the number of people looking at the same Airbnb property or looking to book the flight you are researching, are offered up as manipulative dark patterns. Similarly, stock counts – only three left! But absent deception (or its first cousin, obfuscation with the intent to mislead), should these be considered dark patterns? If these claims are true, they are effective sales techniques and provide valuable information. If they are deceptive (e.g., there are actually more than three items in stock), then they are perforce illegal.

Contrast this with Uber’s ghost cars. As Commissioner Chopra notes, this gives “customers the false impression that they would not need to wait long for an available ride – harming both their riders and their competitors.”

Uber’s response to the driver that noted these maps were not accurate depictions of availability: “…I know this seems misleading to you but it is meant as more of a visual effect more than an accurate location of drivers in the area. It would be better of you to think of this as a screen saver on a computer” (Knibbs, 2015). This is an archetypal use of visual design to mislead/deceive/obfuscate (choose your term).

Is Existing Regulatory Guidance Sufficient to the Task?

The FTC, as evidenced by Commissioner Chopra’s statement, is taking a deep interest in dark patterns and their potential to harm both consumers and competitors. In fact:

The Federal Trade Commission (FTC) last week [4/29/21] held a workshop where researchers, academics, consumer advocates, and lawmakers discussed the deceptive design practices known as dark patterns and explored regulatory strategies for protecting consumers from them. Speakers warned some of these practices may violate consumer protection laws, and Sen. Mark Warner, who has cosponsored dark pattern legislation, said he believed the FTC possessed authority to prohibit certain types of dark patterns.

DeGeurin, 2021 (emphasis added)

However, are we, yet again, at another juncture where advancement in technology has outpaced the law and regulatory guidance? Perhaps, perhaps not.

The 1980 FTC Policy Statement on Unfairness notes that “the task of identifying unfair trade practices was…subject to judicial review, in the expectation that the underlying criteria would evolve and develop over time.” As the Supreme Court observed as early as 1931, the ban on unfairness “belongs to that class of phrases which do not admit of precise definition, but the meaning and application of which must be arrived at by what this court elsewhere has called ‘the gradual process of judicial inclusion and exclusion.’”

“By 1964 enough cases had been decided to enable the Commission to identify three factors that it considered when applying the prohibition against consumer unfairness.” Namely: 1) the practice injures consumers (which gave consumers the same standing as competitors), 2) the practice violates established public policy, and 3) the practice is unethical or unscrupulous.[3]

Critically, the policy statement goes on to note that not every consumer injury under these criteria is necessarily “legally unfair” and consequently applies three tests: the harm must be substantial (i.e., monetary harm), must not be outweighed by any countervailing benefits to consumers or competition, and “it must be an injury that consumers themselves could not reasonably have avoided.” Importantly, the injury does not have to be intentional.

Given the illegality of deceptive advertising practices, does the existing guidance provide the necessary framework? “Policymakers should resist the urge to legislate around the concept of dark patterns – it’s not necessary, given existing authorities” (i.e., Section 5 of the FTC Act). “In addition, new laws could undermine legitimate companies’ best intentions to educate their customers” (Leduc, 2021).

Privacy by Design or Deception?

Where is the line between ethical, persuasive design and dark patterns? Businesses should engage in conversations with IT, compliance, risk, and legal teams to review their privacy policy, and include in the discussion the customer/user experience designers and coders responsible for the company’s user interface, as well as the marketers and advertisers responsible for sign-ups, checkout baskets, pricing, and promotions. Any or all these teams can play a role in creating or avoiding “digital deception.”

Eric Reicin, President & CEO BBB National Programs, 2021

The line between ethical, persuasive design and deceptive dark patterns may be difficult to precisely define. There is grey area, and the grey area is always problematic from a regulatory perspective. But for marketers and CPOs, perhaps precision should not be the ultimate goal. It is a line that deserves a wide berth. Getting close to it is more than just creepy, and crossing it is, under FTC guidance, potentially misleading. Staying clear of it – as any consumer-centric organization will want to do – as Reicin opines, will require participation across the organization from legal to marketing to UX designers.

Chopra expects the FTC “to methodically use all [their] tools to shine a light on unlawful digital dark patterns.” This could be re-imagined as a call to action for CPOs and marketing teams. To take a consumer-friendly approach in this regard is not unwise.

The value of this is presaged by the positive outcomes (i.e., increased customer loyalty and increased consumer spend) for those firms that proactively adopted a privacy-by-design approach and re-engineered any non-privacy-friendly interactions. Consumers value honesty and transparency. Why wait for Chopra?


[1] The CPRA Section 1798.140 (h) states: “Consent” means any freely given, specific, informed, and unambiguous indication of the consumer’s wishes by which the consumer, or the consumer’s legal guardian, a person who has power of attorney, or a person acting as a conservator for the consumer, including by a statement or by a clear affirmative action, signifies agreement to the processing of personal information relating to the consumer for a narrowly defined particular purpose. Acceptance of a general or broad terms of use, or similar document, that contains descriptions of personal information processing along with other, unrelated information, does not constitute consent. Hovering over, muting, pausing, or closing a given piece of content does not constitute consent. Likewise, agreement obtained through use of dark patterns does not constitute consent.

[2] Statement of Commissioner Rohit Chopra Regarding Dark Patterns in the Matter of Age of Learning, Inc. Commission File Number 1723186 September 2, 2020

[3] Appended to International Harvester Co., 104 F.T.C. 949, 1070 (1984). See 15 U.S.C. § 45(n).