
Privacy by Design


In his seminal 1965 book, Unsafe at Any Speed: The Designed-In Dangers of the American Automobile, Ralph Nader accused American car manufacturers not only of being reluctant to introduce safety features, but of purposefully designing in features that created unsafe conditions. The Chevrolet Corvair, which Nader called “a one-car accident,” became the poster child of this “designed-in” lack of safety.

Nader’s advocacy led to the introduction of automobile safety regulations – most famously seatbelt requirements – that were roundly unwelcome to the auto industry.

Until safety was recognized as a competitive advantage.

Volvo, for example, made safety its core brand and a competitive differentiator. (Having invented the three-point seat belt, Volvo gave the patent away in 1959.) The same journey from regulation to competitive advantage played out with other mandates, such as MPG ratings and crash-safety ratings, which are now key selling points. No longer merely compulsory functions of regulatory compliance, they are features and value-adds. There is every indication that data privacy is best viewed in this light.

Data Privacy is the New Competitive Advantage

It is already the case that an organization’s data protection (i.e., cybersecurity) and data privacy posture (increasingly viewed through a social advocacy lens by consumers) can greatly enhance or devalue its brand and have a real dollar impact. This will surely only increase over time.

While the passage of Proposition 24 ― the California Privacy Rights Act (CPRA) ― sets out monetary penalties for companies whose data use runs afoul of the new regulations, even the perceived unethical or exploitative use of consumer data is likely to become increasingly damaging to brand and revenue.

Given the sophistication and relentlessness of attacks by state-sponsored hackers, consumers may forgive a data breach, particularly if the response is handled well. And companies, cognizant of their own vulnerability, do not comment on another’s misfortune in this regard. But exploitation of consumer data and crossing that “creepy line?” This is a self-inflicted wound. One that consumers are increasingly less likely to countenance or forgive. And it is one that competitors will be happy to exploit:

“When Google and Facebook — two of Apple’s largest competitors — were under fire for exploiting customer data, CEO Tim Cook saw an opportunity to turn privacy into a competitive advantage.

The tech giant rolled out a suite of new privacy-maximizing features…[and] updated its privacy page to better showcase how its flagship apps are designed with privacy in mind. This doubling down on privacy took center stage in the company’s marketing campaigns, too, with “Privacy Matters” becoming the central message of its prime-time air spots and its 10,000+ billboards around the world.”

―Andrade-Walz

Andrade-Walz’s article is aptly titled “Privacy is the New Competitive Battleground.”

We are at an inflection point in the U.S. concerning the collection, retention, use, and monetization of consumer data. In just two years we have seen:

  • the introduction of the California Consumer Privacy Act (CCPA);
  • the passage of Proposition 24 (CPRA);
  • significant (and market-disruptive) privacy features introduced in Apple’s iOS 14;
  • the Facebook-Cambridge Analytica scandal;
  • Facebook’s introduction of Limited Data Use (LDU) and adoption of the option of acting as a “service provider” (pursuant to the CCPA’s definitional constraints);
  • Google’s announcement that it will no longer accommodate third-party cookies effective 2022, disrupting what is widely considered the most lucrative of all digital marketing schemes; and
  • myriad high-profile data breaches.

What was once the sole province of digital marketers, policy wonks, and legislators has now become highly relevant to consumer opinion. And it is impacting engagement and spend.

The Dollar Value of Data Ethics

The concept of “data ethics” and its value was a topic of discussion across several panels held during WireWheel’s 2020 SPOKES Conference. It is evident that data privacy, which also entails consumer control of data, goes well beyond baseline regulatory compliance and includes more abstract concepts like doing the right thing: aka, ethics.

“Data ethics [is] an exciting emerging area. And there are certainly more and more laws and regulatory proposals in this area…But even in the absence of laws, it’s really critical for companies to be thinking about the ethical implications of their data practices as part of their comprehensive data governance strategy.”

―Lindsey Finch, Salesforce Executive Vice President, Global Privacy and Product Legal

Fortunately, consumers do not have to rely on corporate altruism to advance ethical practices, but rather the more compelling ― and perhaps less open to interpretation ― forces of market demand.

As Finch reminds us, “We know that no matter how good a technology is, people won’t use it unless they trust it. And so, developing technology, with not only privacy, but…around human rights and civil liberties, are really, really important…[it] is the right thing to do.”

Mastercard Chief Data Officer JoAnn Stonier addressed the economic impact of privacy during a recent International Association of Privacy Professionals (IAPP) web conference: “We are talking about the economic value of privacy. We are talking about the intrinsic value of privacy. We are talking about what does it mean to your employees, your customers…” (Privacy Made Positive: Evidence that Better Privacy Delivers Enterprise Value, 2020)

And it is becoming ever more important to consumers. Particularly among Millennials, who are set to be on the receiving end of the largest wealth transfer in modern history, inheriting an estimated $68 trillion from their parents by 2030. But Millennials are not the only cohort that cares about data privacy. Mastercard’s Stonier notes that it is not a generational issue, but rather a function of evolving buyer personas regarding data privacy.

In The New Imperative for Corporate Data Responsibility (KPMG, 2020), a survey of 1,000 consumers found the following:

  • 97% say data privacy is important.
  • 87% say data privacy is a human right.
  • 68% don’t trust companies to ethically sell personal data.
  • 56% think companies should prioritize giving consumers more control.
  • 84% are open to legislation giving consumers more control over their data.
  • 91% indicate that corporations should take the lead in establishing corporate data responsibility.

These final two data points seem to offer businesses the opportunity to take the lead. And if they don’t? Consumers, through spend and regulatory support, will.

The KPMG report goes on to note that “Consumers are least trusting of how companies will protect their website browsing behaviors…” and according to KPMG Principal, Cyber Security Services, Orson Lucas, “The easier companies make it for consumers to keep tabs on how their data is being used and protected, the easier companies will find it to build consumers’ trust. This is increasingly resonating with our corporate clients…”

Trust, or lack thereof, has financial consequences.

Economist John Llewellyn of Llewellyn Consulting details that 32% of consumers are what he calls “privacy actives,” meaning they will “change providers, or stop using products or services as a direct result of data policies.” He further contends that “consumers see privacy as part of the brand which can translate into increasing your customer base by 50%” (IAPP Web Conference, 2020).

A Consumer Reports study surveying more than 5,000 adults in the U.S. reports findings remarkably similar to KPMG’s and Llewellyn’s across the board, and also demonstrates a willingness among consumers to pay more for higher levels of data privacy.

Clearly, marketing departments need to view data privacy as more than just Ad Tech constraints to be “managed.” Evolving consumer requirements for data privacy need to be written into every B2C interaction. Forward-looking companies are already operationalizing data privacy beyond the baseline of regulatory compliance and purposefully designing it into every communication, product, and service. It is, for business and consumer alike, “the right thing to do.”

It is privacy by design.

Data Privacy as UX

“[Data privacy] has become a strategic portion of many businesses. It is how they are thinking about engineering their products, it is how they are thinking about creating opportunities to disadvantage competitors, but also to build something that consumers want to use. These are all very transformational shifts…”

―Susan Rohol, Senior Vice President and Chief Privacy Officer, WarnerMedia

Yet, “There isn’t enough attention being paid to user experience,” says Andy Dale, General Counsel and Head of Strategic Partnerships at Alyce. And data privacy is an important consumer touchpoint that can enhance, or add unwanted friction to, the user experience.

As an example, Consumer Reports Policy Analyst Maureen Mahoney points out that “It’s been really frustrating to go to a site, [like] Amazon or Spotify and try to opt out. They’ll say, ‘well, we don’t sell your data, but we actually share it with advertisers to continue to show [you] ads.’ That’s causing some cognitive dissonance. That feels frustrating.”

Designing and delivering winning user experiences around privacy in products and services is a significant challenge. It is one that will require collaboration across much of the enterprise (think software development, marketing, product development, and UX design; operations, InfoSec, and IT; and of course, legal and compliance). “[It’s] a complex technology ecosystem…[and] requires a cross-functional group all the way up through the leadership team to have a serious strategy discussion,” says Dale.

The necessary collaborations will likely require partnership with outside solutions providers to operationalize this new approach. As Marc Zwillinger, founder of the DC law firm ZwillGen, points out: “Our clients don’t have the ability to pass the [opt-out signals or permissions] on a consumer-by-consumer basis…and I’ll give a shout out to WireWheel for having a technology solution here, [that enables businesses] to adopt management platforms that will allow them to do that…” in a consumer-friendly way.
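To make “honoring opt-out signals on a consumer-by-consumer basis” concrete, here is a minimal sketch, not WireWheel’s or any vendor’s actual implementation, of a web server that detects the Global Privacy Control signal (which browsers send as the `Sec-GPC: 1` request header) and records a do-not-sell/share preference before any downstream ad or analytics calls are made. The consumer-ID header and the in-memory store are hypothetical placeholders for whatever identity and consent-management systems a business actually runs.

```typescript
// Minimal sketch: honoring a per-consumer opt-out signal (Global Privacy Control).
// Assumptions: Express is installed; "x-consumer-id" is a hypothetical identifier header;
// the Set below stands in for a real consent-management store.
import express, { Request, Response, NextFunction } from "express";

const app = express();

// Hypothetical in-memory record of consumers who have opted out of sale/sharing.
const optOuts = new Set<string>();

// Middleware: if the browser sends Sec-GPC: 1, record the opt-out before any
// downstream ad/analytics processing for this consumer.
app.use((req: Request, _res: Response, next: NextFunction) => {
  const consumerId = req.header("x-consumer-id"); // hypothetical identifier
  if (consumerId && req.header("sec-gpc") === "1") {
    optOuts.add(consumerId); // treat the signal as a do-not-sell/share request
  }
  next();
});

app.get("/content", (req: Request, res: Response) => {
  const consumerId = req.header("x-consumer-id");
  // Default to opted out when the consumer is unknown.
  const optedOut = consumerId ? optOuts.has(consumerId) : true;
  // Downstream systems would suppress third-party data sharing when optedOut is true.
  res.json({ personalizedAds: !optedOut });
});

app.listen(3000);
```

In practice, a consent-management platform would persist these preferences, propagate them to downstream processors and advertising partners, and reconcile them with other signals such as cookie banners and account settings, which is precisely the operational gap Zwillinger describes.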

The Enlightened Self-Interest of Privacy

In advice reminiscent of “The New York Times Rule,” which admonishes that one should not say or do anything, in public or private, that one wouldn’t want reported on the front page of a newspaper, Alistair Mactaggart cautions:

“What I keep on saying to businesses is, look, if you’re doing stuff with people’s data that if they found out about, they’d be totally fine with, then you have nothing to worry about. But if your business model is doing stuff with people’s data where if they found out about it, they would recoil and [respond] ‘What?! you’re doing that?!,’ then I think you should adjust your business model…”

Sage advice from the driving force behind California’s highly popular data privacy initiative, which culminated in the passage of Proposition 24. Mactaggart really is to data privacy what Nader was to auto safety. And if the adage that “as California goes, so goes the country” holds true, his advocacy will indeed have as profound an impact.

His advice captures not only the privacy zeitgeist but an important strategic choice. Companies can approach data privacy minimally, as a matter of regulatory compliance, or maximally ― call it enlightened self-interest ― as privacy by design.

There is a lot on the line: goodwill, enhanced user experience, and the dollars that follow. It is a choice between the Corvair and the Volvo, and it likely portends similar outcomes.

Integrate your privacy program directly into your developers’ workflow with WireWheel’s Privacy Operations Manager and integrations.

