
Dark Patterns and Privacy by Design: A Delicate Balancing Act


I think we’re watching builders and developers understand that part of the product success is that it has a privacy by design component to it, and privacy was thought about upfront and built into the greater feature-functionality or production system that’s being built.

—Pedro Pavón, Facebook

Of all the debates that “privacy” elicits, debates that continue to elude agreement, the value of privacy by design as a winning go-to-market strategy has achieved broad consensus remarkably fast. As Pavón puts it, “I think we’ve moved from we need to incorporate privacy compliance to if we don’t build our products with privacy in mind nobody’s going to be interested in them. It’s a feature.”

However, the conversation does not end there. How organizations, through their “builders and developers,” choose to implement privacy is a new and complex source of debate, one that draws on far-ranging (and far from settled) concepts from UI/UX design, behavioral science, and psychology, as well as decades-old regulatory perceptions of false advertising and unfair and deceptive practices (15 U.S. Code § 52) and their modern-day equivalent: dark patterns.

Taking up the debate at the recently held semi-annual Spokes Privacy Conference, Andy Dale, General Counsel and CPO of gifting platform Alyce, was joined by Pedro Pavón and Sarah Barrows. Pedro is Facebook’s Director of Ads and Monetization Privacy and Fairness Policy. Sarah is Senior Director, Product, Privacy & Policy Counsel for martech platform NextRoll.

Privacy, or Deception, by Design

The 2018 report Deceived by Design, issued by the Consumer Council of Norway (Forbrukerrådet), spotlighted the use of dark patterns, detailing the use of “privacy intrusive default settings, misleading wording, giving users an illusion of control, hiding away privacy-friendly choices…and choice architectures where choosing the privacy friendly option requires more effort for the users.”

The watchword here is choice: the choices made by product developers; by web, graphic, and UI/UX designers; and by others involved in the product development cycle. These choices will manifest, in the best case, as privacy-centric design and, in the worst case, as dark patterns. This is not to imply that sub-optimal design in the context of consumer privacy is perforce a dark pattern. And, as our panelists discuss, the line between ethical, persuasive design and dark patterns can be difficult to locate (and is the subject of much debate).
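To make the report’s language concrete, consider a minimal sketch (in Swift, with hypothetical setting names) of how the same consent model can express either a privacy-by-design choice architecture or the “privacy intrusive default settings” Forbrukerrådet flags:

```swift
import Foundation

// A hypothetical consent model: the settings are identical in both cases;
// only the defaults (the "choice architecture") differ.
struct ConsentSettings {
    var personalizedAds: Bool
    var thirdPartySharing: Bool
    var analytics: Bool
}

// Privacy by design: collection is off until the user opts in.
let privacyByDesignDefaults = ConsentSettings(
    personalizedAds: false, thirdPartySharing: false, analytics: false)

// "Privacy intrusive default settings": everything is on, and the
// privacy-friendly configuration requires extra effort from the user.
let intrusiveDefaults = ConsentSettings(
    personalizedAds: true, thirdPartySharing: true, analytics: true)
```

The struct is the same either way; the dark pattern, if there is one, lives entirely in the defaults and in how much effort it takes the user to change them.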

Privacy-centric organizations are keenly aware of this and integrate their privacy teams into product development early in the process to ensure that design decisions are considered from a consumer privacy perspective:

Is this what a consumer would expect? Would they understand this language? If we’re making choices between making the button blue or grey, are we going to test that? Do we assess that we’re being transparent in the choices that we present to people? That we’re actually carrying out the intention of the language [presented to] our customers or their end users?

—Sarah Barrows, NextRoll

This has quickly become standard best practice for leading brands and publishers. As Pavón says, “Five or six years ago executives…talked about privacy in the context of risk to the business (i.e., data breach risk)…it’s almost turned completely around, and senior executives are talking really loudly about the investments being made in their privacy programs to differentiate themselves. Give themselves a competitive advantage.”

Delineating Dark Patterns: Who Says?

We define nagging as a minor redirection of expected functionality that may persist over one or more interactions. Nagging often manifests as a repeated intrusion during normal interaction, where the user’s desired task is interrupted one or more times by other tasks not directly related to the one the user is focusing on. Nagging behaviors may include pop-ups that obscure the interface, audio notices that distract the user, or other actions that obstruct or otherwise redirect the user’s focus.

—UXP2.com

Pedro raises a fundamental concern: “How are we defining dark patterns? How do we decide what is and what isn’t a dark pattern?”

Consider “nagging”: a scenario in which a consumer makes a decision that is in their privacy interest but in some way contrary to the company’s best interest, and is then repeatedly “nagged” to change their mind.

“How much nagging is a dark pattern?” asks Pavón. “Is it zero nagging? Is it showing a prompt once a year? Once a week? Once a day? Once an hour? Am I nagging you by just refreshing your preferences once a year? I don’t know the answer, and I don’t think privacy experts are the ones to make those decisions.”
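Pavón’s frequency question is exactly the parameter an engineering team has to pick when it implements re-prompt logic. A minimal sketch, with a hypothetical RepromptPolicy type, of what choosing “once a year” versus “once an hour” looks like in code:

```swift
import Foundation

// Hypothetical re-prompt throttle. Whether ANY value of minimumInterval
// crosses the line into nagging is precisely the open question.
struct RepromptPolicy {
    /// Minimum time between consent prompts, in seconds.
    let minimumInterval: TimeInterval
    private let lastPromptKey = "lastConsentPromptDate"

    /// True if enough time has passed to show the prompt again.
    func shouldPromptAgain(now: Date = Date()) -> Bool {
        guard let last = UserDefaults.standard.object(forKey: lastPromptKey) as? Date else {
            return true // never prompted before
        }
        return now.timeIntervalSince(last) >= minimumInterval
    }

    /// Record that the prompt was shown.
    func recordPrompt(at date: Date = Date()) {
        UserDefaults.standard.set(date, forKey: lastPromptKey)
    }
}

// Once a year? Once an hour? The code makes the choice explicit, not ethical.
let annualRefresh = RepromptPolicy(minimumInterval: 365 * 24 * 60 * 60)
let hourlyNag = RepromptPolicy(minimumInterval: 60 * 60)
```

Nothing in the type system distinguishes the annual refresh from the hourly nag; the dark-pattern judgment sits entirely in the value chosen.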

Pavón posits that “we have to leave those up to people who understand human behavior [and] the ways that dark patterns manipulate humans.”

But the idea that there are experts who can intuit what is best for the individual consumer (or who are not subject to the same cognitive biases themselves) is fraught with challenges and harkens back to arguments made in “Libertarian Paternalism Is Not an Oxymoron” (Thaler and Sunstein, 2003). It is Pavón himself who elucidates the issue:

That’s going to be really hard. And different companies can come up with their own construct, but those constructs are going to be in light of their businesses. NGOs can come up with their own too, but those are going to be in the context of their geography, or their constituency. Governments can do it too, but that’s also going to have a sociopolitical component to it. So…who do we defer to to make these decisions? Is it scientists, social scientists, mental health professionals? I don’t know.

—Pedro Pavón

Wherever one lands in this debate, Pavón is likely correct that the first thing we need is a set of well-established definitions of dark patterns. Perhaps, as with most things in a marketplace, it is the consumers themselves who will arbitrate.

Where is that Line?

All products are trying to encourage you to buy them, keep them, renew them, all those things. And so, at what point are you engaging in an unfair practice that manipulates or takes advantage of consumers unfairly or deceptively?

—Sarah Barrows, NextRoll


“Let’s take Apple’s new example from iOS 14.5,” opines Barrows. “If it says ‘Allow this App to track,’ do you need to have a Facebook message that first tells you this is how we monetize our content and gives you more context? Are both of those companies trying to tip the scale and manipulate your emotional response…[with] what kind of information they’re putting in front of you? I think we’re entering a phase of dark patterns where it’s murkier and more complicated.”
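For context, the iOS 14.5 prompt Barrows refers to is Apple’s App Tracking Transparency (ATT) framework, and the “message that first tells you” is the kind of explanatory pre-prompt screen an app may show before the system dialog. A minimal sketch of that two-layer flow (the pre-prompt function is hypothetical; the system prompt and ATTrackingManager API are Apple’s):

```swift
import AppTrackingTransparency

// Hypothetical pre-prompt: the app's own screen explaining, e.g., "this is
// how we monetize our content," shown before the system ATT dialog.
func showMonetizationContextScreen(then completion: @escaping () -> Void) {
    // Present custom UI here; call completion when the user continues.
    completion()
}

func requestTrackingPermission() {
    showMonetizationContextScreen {
        // The system-owned prompt ("Allow [App] to track your activity
        // across other companies' apps and websites?"). Its explanatory
        // subtitle comes from NSUserTrackingUsageDescription in Info.plist.
        ATTrackingManager.requestTrackingAuthorization { status in
            switch status {
            case .authorized:
                break // user allowed tracking
            case .denied, .restricted, .notDetermined:
                break // tracking is not permitted
            @unknown default:
                break
            }
        }
    }
}
```

The dark-pattern question Barrows raises lives in the pre-prompt: Apple controls the system dialog’s wording, but the app controls everything shown before it.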


“There are some dark patterns that on their face are objectively bad,” says Pedro. He notes that lying to influence a choice does not suffer from subjectivity. “Another one is hiding information. Not telling the full truth about something. You don’t have to be an ethicist to figure out that’s problematic. For example, we don’t use your data for ‘X,’ and then you use it for ‘X.1.’” But even here there is room for doubt.

Barrows points out that in the context of “full truth,” when it comes to execution, things can become subjective and potentially problematic:

Consumers who go to the website to read the article or buy the shoes are not there to learn about the data collection practices. There could be a nefarious reason why they’re [disclosing in one way and not another], or it could be ‘gosh, would the average person understand the differences between usage or these types of disclosures?’

Do you provide a wall of text, or the general idea behind it, to help them make more informed decisions? I think that’s exactly the balance that a lot of us are looking at right now.

—Sarah Barrows, NextRoll


How Can CPOs Advise the Business?

“We’ve raised a lot of interesting points here,” notes Dale. “We don’t have regulation. We don’t have a clear definition of what a dark pattern is. And we’re not the right people to analyze it. So, my question is: how, then, do we do product counseling? How do we advise the business?”

There is as much debate about how to implement data privacy, from consent and preference management and privacy impact assessments (PIAs) to UI/UX design, as there is about dark patterns, the ethical use of consumer data, and the concept of privacy itself. The choices brands and publishers make, from basic compliance to implementing privacy at scale, will differ. If there is agreement, it is this: privacy is not a bolt-on. It must be designed in, and demonstrable, to be effective.

And to enable this, the CPO must be in the room. Every room.

The similarity across all companies of varying size is that you have to set the culture of privacy. And that starts with the privacy team itself…to make sure that it’s a norm and that privacy is thought of, and thought through, and something that’s of value.

—Andy Dale, Alyce


At the end of the day, privacy must be woven into a company’s fabric. It is now a core part of the product and, indeed, a core part of the brand. Implemented in a consumer-focused way, it becomes a competitive advantage. The alternative courts diminished returns.

Future-proof your privacy program with WireWheel’s Trust Access and Consent Center and WireWheel’s Privacy Operations Manager.


Watch the entire SPOKES session here.