
Privacy Law Update: August 15, 2022

Stay up to date with this weekly release covering key developments on data privacy laws, technology, and other hot privacy topics!

Newsworthy Updates

Federal Data Privacy Legislation: Differences With State Laws Raise Preemption Issues

For over two years now, businesses have been dealing with the complexities of compliance with the California Consumer Privacy Act (CCPA), the nation's first comprehensive consumer privacy law. Compliance became more complex with the enactment of comprehensive consumer privacy laws in Virginia, Colorado, Utah, and Connecticut, plus the new California Privacy Rights Act (CPRA), a/k/a CCPA 2.0. As a result, industry has been clamoring for one consistent federal standard. Congress may finally be answering the call with the introduction of the American Data Privacy and Protection Act, H.R. 8152 (ADPPA). The ADPPA in its current form would preempt most, but not all, state privacy and data protection laws.

Republican FTC Commissioner Noah Phillips to Step Down

Noah Phillips, one of two Republican commissioners at the Federal Trade Commission, is set to leave the agency in the fall, he told POLITICO.  Phillips’ departure comes at an extraordinarily high-profile moment at the agency, one marked by a heightened skepticism toward corporate consolidation and tension between the Republicans and Democrats on the commission under Chair Lina Khan, a progressive antitrust hawk who has targeted the tech giants and corporate concentration across the economy.

FTC Deepens ‘Dark Patterns’ Investigation

Business Insider reports the U.S. Federal Trade Commission is moving forward with its investigation into alleged use of “dark patterns” by Amazon in its Prime services promotions. The agency sent subpoena letters to current and former Amazon employees as it seeks details on the potential deceptive and manipulative practices the company used to amass and maintain Prime memberships.

Final Decision on Meta’s EU-US Data Transfers Delayed

Objections to the Irish Data Protection Commission’s order to halt Meta’s EU-U.S. data transfers will delay a final decision, Politico reports. A DPC spokesperson said fellow data protection authorities raised concerns during the mandated four-week consultation under Article 60 of the EU General Data Protection Regulation and it may take months to resolve the discrepancies. If issues go unresolved, the Article 65 dispute resolution mechanism will be triggered.

Privacy Legislation

Federal Privacy

FTC Privacy Rulemaking: On Thursday, the Federal Trade Commission issued an Advance Notice of Proposed Rulemaking (“ANPRM”) on “Commercial Surveillance and Data Security” by a 3-2 vote. The ANPRM triggers a 60-day public comment period that will constitute a public record that the Agency will consider in determining whether to proceed with rulemaking. On September 8th, the Commission will host a virtual public forum on the proposal.

ADPPA: With legislative attention focused on the Inflation Reduction Act and August recess upon us, we have not seen any major public indications of progress on the American Data Privacy and Protection Act (ADPPA) from key policymakers. The most recent update is an August 4th report from Axios in which a spokesperson said that E&C Chair Pallone “is continuing to build broad bipartisan support and incorporate feedback from members, and is committed to seeing comprehensive national privacy protections signed into law.”

State Privacy

Colorado: The Colorado AG’s comment period for Colorado Privacy Act pre-rulemaking considerations closed on August 5. The AG is posting comments that it received here.

Montana: Sen. Kenneth Bogner (D) has submitted a bill draft request (LC0067) to state legislative services with the short title “Generally revise laws related to privacy and facial recognition technology.” The request is for the 2023 legislative session and it is unclear what scope such legislation may take.

New Jersey: S.332, a narrow privacy bill that would require websites to post privacy notices and honor opt-outs of sales (“monetary consideration”), was amended by Senate Majority Leader Ruiz (D) to clarify that the bill does not create a private right of action. The legislation was introduced in January by Senators Singleton (D) and Codey (D) and was reported from committee on a 3-2 vote in June.

Oregon: The Attorney General’s office submitted a draft comprehensive privacy bill (largely informed by the CO and CT privacy laws and a multi-stakeholder workgroup) to state legislative counsel. The AG’s office intends to move the bill in the 2023 session.

Future-proof your privacy program with WireWheel’s Trust Access and Consent Center and WireWheel’s Privacy Operations Manager.


Customer Loyalty, Privacy, and Data Governance

Customer loyalty programs are the backbone of many companies, but they come with a host of data privacy traps, particularly under the new state regulations, which put the collection and use of data to run these programs squarely in their crosshairs.

For a discussion of loyalty program privacy risks and opportunities, Blueprint Data Strategy Director, Mark Milone, moderated a panel at the 2022 Summer Spokes Technology Conference (held June 22-23).

The Customer Loyalty, Privacy, and Data Governance panelists included Cooley Partner and Cyber Data Privacy Group Vice Chair, Dave Navetta; Bob Seiner, founder of KIK Consulting and Educational Services; and Global SVP Revenue of loyalty technology firm, Annex Cloud, Erin Raese.

Customer loyalty is a hot topic

A Forrester stat: 89% of organizations are investing in personalization, and loyalty is just a really great way to collect the data you need, build that relationship with your customer, and deliver personalization.

—Erin Raese, Annex Cloud

Customer loyalty is a much hotter topic now because organizations are looking to deliver more personalized experiences. Consumers want businesses to know who they are, put their needs first, and make their lives easier.

“I had a conversation with a grocer last week,” relates Raese, “who had several requirements: ‘Can you create a discount at point-of-sale? Can you create a discount by product? Can you do discounts by this…and by that…?’

“Sure, but why? What about the customer experience? What kind of customer experience do you want to deliver?”

You probably have email addresses which is great – that allows you to give them the discounts they were looking for. But what if you knew that Mary was a vegetarian and a gourmet cook?

What kind of experience could you deliver to Mary?

—Erin Raese, Annex Cloud

“And what if you knew that Mary had a husband, who was on a keto diet and a daughter, who had peanut allergies,” continued Raese. “What kind of experience could you deliver then?”

“If you think about it, the grocer could serve up recipes that fit everybody’s dietary needs. Mary could come to their website. They could curate all the ingredients for all those different recipes, and Mary could go click, click, click…and put them in her shopping cart.”

Bumping up against privacy-related issues

“Here we start to bump into privacy laws which are very much in flux right now,” cautions Navetta.

The fundamental question here is this: with the regulations as they are today, and cookies becoming a less viable means to gather useful information, many marketers are starting to think of loyalty programs as another rich field for collecting data.

But, perhaps also, some of the goals of these programs are not loyalty at all but harvesting a lot of personal information for bigger picture revenue goals.

—Dave Navetta, Cooley

“We have to make sure we’re addressing that and balancing out the requirements around privacy laws.”

“A primary role of a data governance program is to manage that balancing act,” says Seiner. “Personalization is all about that customer data. Important considerations include ‘do customers know what data you’re collecting about them and how you’re using that data?’ Data governance can play a role in all of those things.”

The role of trust

It starts in trust, and then it’s really the company’s obligation to ensure that they respect that trust and are good stewards of the person’s data… It’s everything. Everybody walks into that experience looking for trust.

—Erin Raese, Annex Cloud

“The basis of loyalty tends to be a two-way dialogue. A two-way value exchange,” proffers Raese.

“There are terms or conditions, and they should be laid out so the customer knows – or should know – what they’re getting themselves into when they join. That they are going to give data and in return getting personalized experiences, recognition, or rewards in exchange.”

But “when you start to use data outside the parameters of those expectations: to collect data…to sell to a third-party…that starts to erode trust,” submits Navetta. “For the privacy conscious, understanding how the data is going to be used and who it is going to be transferred to are important. But it is also reflected in what regulators and legislators are ultimately requiring.”

Is it just for loyalty or some other purpose?

You’re not going to be loyal to somebody unless you trust them. This requires that customers have confidence in knowing how you’re going to handle the data. Collecting only that data you’re going to use is good, but oftentimes other data is collected along the way.

—Bob Seiner, KIK Consulting

“You need to be able to explain what you’re going to collect, how you’re going to use it…and you have to have a strategy for communicating that back to the customer,” avers Seiner.

But, as Milone points out, “We don’t know all of the uses of the data generated at the point of collection. It could easily become a new use that wasn’t contemplated when it was collected from the customer.”

Where companies don’t have proper governance, we see that after the program has been running for some time, someone in marketing realizes how much data they have, how rich it is, and conceives of new ways to use the data.

And that’s when you get into legal trouble.

—Dave Navetta, Cooley

Selling, sharing, aggregating, and de-identifying data

“I think we’re going to start seeing companies gathering first-party data and zero-party data and wanting to supplement it with other data,” opines Navetta.

And that constitutes a sale of data under certain laws even though no money has been exchanged. This goes to transfers and under the CCPA, for example, you have to provide an opt-out.

In addition, the laws are starting to require purpose and use limitations as well, which goes to reasonable expectations of the consumer….and this is where transparency comes into play.

—Dave Navetta, Cooley

That said, “one way to get more flexibility is to normalize the data,” asserts Navetta. To aggregate or de-identify it so it’s no longer ‘personally identifiable,’ and consequently, no longer subject to these privacy laws.

“Do you lose some of the value? This is what you’re always struggling with: the richness in the personalization tends to go away once you strip out the identifying elements.”
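The aggregation and de-identification Navetta describes can be sketched as follows. This is a purely illustrative Python example with hypothetical field names; real de-identification requires far more rigor (e.g., k-anonymity and re-identification risk analysis) than simply dropping columns:

```python
# Illustrative sketch only: de-identifying loyalty records by stripping
# direct identifiers and aggregating behavior. All field names are made up.
from collections import defaultdict

def deidentify(records):
    """Drop direct identifiers, keeping only coarse behavioral fields."""
    direct_identifiers = {"name", "email", "member_id"}
    return [{k: v for k, v in r.items() if k not in direct_identifiers}
            for r in records]

def aggregate_spend_by_segment(records):
    """Roll spend up per segment so individual rows are no longer exposed."""
    totals = defaultdict(float)
    for r in records:
        totals[r["segment"]] += r["spend"]
    return dict(totals)

records = [
    {"name": "Mary", "email": "m@example.com", "member_id": 1,
     "segment": "vegetarian", "spend": 42.0},
    {"name": "Ann", "email": "a@example.com", "member_id": 2,
     "segment": "vegetarian", "spend": 18.0},
]
print(deidentify(records))                  # behavioral fields only
print(aggregate_spend_by_segment(records))  # {'vegetarian': 60.0}
```

Note how the aggregate retains segment-level insight while the individual rows lose exactly the identifying richness that powers personalization, which is the trade-off Navetta flags.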

Can there be too much transparency?

Hopefully you are using transparency to engender trust, but at what point does transparency become too transparent? You can articulate every conceivable use of the data…and inundate your customer with terms and conditions.

How do you advise a loyalty program to balance transparency with the information the customer actually needs to make a decision about joining a loyalty Program?

—Mark Milone, Blueprint

“Loyalty programs, and the use of data around them, are becoming a much bigger issue than they were,” states Navetta, particularly due to cookie deprecation.

“The tendency is to be overly broad because, in the U.S. especially, if you’re notifying customers of data uses, you can use the data as stated without much trouble. Now, this new generation of regulations is starting to put more pressure around use limitations and may require an opt-in if you’re going to use information beyond expected uses.”

I think we’re still going to see broad and maybe overly complex notices to cover all bases, but over time – and as regulators start to clamp down – more precise notices that satisfy legal requirements but also engender trust.

—Dave Navetta, Cooley

What customer data, exactly?

“We look at it in three buckets,” says Raese:

  1. The information that you give when you sign up for a program
  2. Tracking your behavior when you’re making purchases or other types of interactions. A lot of programs today will incent you to interact with the brand (e.g., using hashtags on social media, writing reviews, and redeeming awards), and
  3. Progressive profiling. The attempt to get additional information through, for example, surveys about what customers enjoy.
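The three buckets above can be sketched, purely illustratively, as record types; every class and field name here is hypothetical, not drawn from any particular loyalty platform:

```python
# Hypothetical sketch of the three data "buckets" as record types.
from dataclasses import dataclass, field

@dataclass
class SignupProfile:          # bucket 1: information given at enrollment
    email: str
    name: str

@dataclass
class BehavioralEvent:        # bucket 2: tracked purchases and interactions
    kind: str                 # e.g. "purchase", "social_hashtag", "review"
    detail: str

@dataclass
class ProgressiveProfile:     # bucket 3: survey-style enrichment over time
    preferences: dict = field(default_factory=dict)

@dataclass
class LoyaltyMember:
    profile: SignupProfile
    events: list = field(default_factory=list)
    enrichment: ProgressiveProfile = field(default_factory=ProgressiveProfile)

m = LoyaltyMember(SignupProfile("m@example.com", "Mary"))
m.events.append(BehavioralEvent("review", "5-star review"))
m.enrichment.preferences["diet"] = "vegetarian"
```

Separating the buckets like this makes it easier to answer the governance questions that follow: which bucket a field came from, and whether its use still matches what the customer was told at collection.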

In this context, it’s not necessarily the sensitivity of the data. It’s the big picture of the data. If you collect a lot of data, you start to learn a lot about people from a privacy perspective and that causes issues.

Regulators and legislators look at the aggregation of [PII], and the inferences and insights that companies can get as a result. There are potential privacy violations that arise in those instances.

—Dave Navetta, Cooley

“What we’re starting to see, is the laws around loyalty programs are making it more difficult for companies to be able to achieve what they want to achieve without having to jump through some compliance hoops.”

Balancing the value and risk of a loyalty program

“If you were going to stand up a cross-functional team to help deliver value from a customer loyalty perspective but mitigate the risks, who do you think should be on that team?” asks Milone.

All the stakeholders in that data.

You don’t want to have a cast of thousands, but you want to make certain…the right people are involved at the right time for the right reason with the right data to make the right decision.

—Bob Seiner, KIK Consulting

Who has authority and accountability? The organization, not single individuals, explains Seiner. “A lot of organizations still use the term data owner because it’s been built into the language, but more organizations are starting to refer to these people as stewards of the data.”

“We’re seeing that for most of the organizations that are being successful with this, it is coming from the top and it’s throughout the organization,” seconds Raese.

Data governance is the execution and enforcement of authority over the management of data and data related assets. But how are you going to get to the point where you’re executing and enforcing authority over the data?

Start to involve stewardship, which is definitionally the formalization of accountability.

— Bob Seiner, KIK Consulting

Loyalty program best practices

  • If you don’t really need the information, don’t collect it
  • Respect the data and respect your customers
  • Be aware that loyalty programs are on the radar of regulators right now and they are looking to make examples
  • Be aware of the new privacy laws coming online around this issue
  • Understand the roadblocks you need to overcome
  • Legal and audit departments are your friends! Work with them
  • Partner with data governance to be sure you’re doing all the things you need to
  • Be purposeful and intentional with data

Listen to the session audio


Accountable Executives = Accountable Privacy Programs

A Master Class in Establishing an Effective Privacy Program

DOJ Guidance on the evaluation of corporate compliance programs asks three critical questions: 1) is the program well designed, 2) is it earnestly applied, and 3) does it work in practice? Privacy authorities are increasingly adopting a very similar approach to data privacy program governance, implementation, and results.

As Information Accountability Foundation (IAF) president Barb Lawler noted at the 2022 Summer Spokes Technology Conference (held June 22-23), “We know regulators – from California to Australia, and points in between – are increasingly not just interested in, but requiring, organizations to prove that their comprehensive privacy programs are operationally effective, aligned with governance strategies, and accountable.”

This means corporate controls binding on even the most senior executives, needed investment across the organization, and a requirement for real-time performance data.

To explore the core elements and architecture of a demonstrable and accountable privacy program, Lawler and IAF Chief Strategist Martin Abrams hosted Scott Taylor, Chief Privacy Officer of biopharmaceutical giant Merck & Co., for a presentation, Accountable Executives = Accountable Privacy Programs, which focuses on an in-depth case study of Merck’s program. It is a masterclass in establishing privacy program accountability, effectiveness, and demonstrability.

The Privacy Accountability Timeline

“A lot of folks think accountability came into fashion within the last five or six years,” says Lawler, “and that really is not the case at all.”

Timeline for Accountability as a Privacy Governing Principle

“Looking back, we can see that accountability as a governing principle for privacy and data protection dates to 1980 and the OECD guidelines,” notes Lawler. “And its first representation in national legislation was in Canada under PIPEDA in 2000 as a core principle organizations must follow. Then the APEC Privacy Guidelines of 2003, and so on, through the GDPR, where accountability is interwoven throughout.”

IAF’s Abrams and Merck’s Taylor worked together on “the accountability project” (at the time, Taylor was CPO of Hewlett-Packard), which was followed by Abrams’ work on the Essential Elements of Accountability (2009-2012): “A multi-stakeholder effort that brought together business, regulators, policymakers, academics, and advocates, defining what it meant to be ‘accountable’ and ‘responsible,’ and those elements necessary to actually demonstrate accountability,” says Lawler.

Elements of Demonstrable Privacy Accountability

“At the end of 2008, accountability had very little definition and that was particularly important for cross-border data transfers,” reminds IAF’s Abrams.

Taylor and I were actually meeting with a group of data protection authorities in Europe, led by the Irish Commissioner, and we said, ‘what if we had a global dialogue which really put some definition to what it means to be accountable when one is transferring data?’

The five essential elements came out of this process.

—Martin Abrams, IAF

What, then, are the elements that demonstrate accountability? Lawler delineates:

  1. Organizational commitment (at the highest level) to fair processing, demonstrable accountability, and the adoption of internal policies consistent with external criteria and established fair processing principles.
  2. Mechanisms to put fair processing policies into effect, including risk-based adverse impact assessments, tools, training, and education.
    “Not check-the-box kinds of activities,” avers Lawler, but integrated, and supported by,
  3. Internal review processes that assess higher risk FIPAs and the overall fair processing program.
  4. Individual and organizational demonstrability and the mechanisms for individual participation “that are framed, defined, standardized, and can literally be shown.” “Think about some of the documentation requirements we’ve seen more recently in GDPR,” notes Lawler. And finally,
  5. The means for remediation and external enforcement.

These are the metrics an organization can use to describe for the regulator why they should think of them as a responsible and answerable organization, says Abrams.

Merck case study: the strategic framework

You could argue that the ‘what’ is really the same for all of us, but how we implement is very contextual to different companies and industries.

So, this is just one example. It’s not right or wrong. It’s not better or worse. It’s just one example of how we’ve interpreted [these principles] and tried to build them into an internal program at Merck.

—Scott Taylor, Merck & Co.

“This strategic privacy framework is a reflection of what we were hearing from the regulators at the time, in terms of their high-level expectations of an accountable organization,” relates Taylor.

Any good program that’s accountable is going to have some type of oversight. The expectation is that all parts of the company that impact personally identifiable information will come together in shared decision-making that looks at both risks and opportunities.

You’re always balancing tensions between risks and benefits.

—Scott Taylor, Merck & Co.

Strategic Privacy Framework

Below that oversight layer are the three pillars that make up the traditional privacy program:

  1. Commitment: The policies and programs need to align with external expectations (e.g., regulatory and consumer expectations) and be translated so that management understands them fully and can commit to transparency and accountability. “But I’ve always said that the ‘commitment pillar’ is nothing more than words if there isn’t something to put it into effect.”
  2. Implementation: These are the many mechanisms that ensure the policies and commitments put in place are understandable to employees, and that their effectiveness is measurable from both a compliance and a business standpoint. “But implementation is a bit of a waste if you don’t have a way to validate that it’s actually turned out the way you expected.”
  3. Validation: More than just data indicating the commitment was correctly translated into action, it provides some of the best information in terms of elucidating any gaps you might have so you can continuously improve the program.

These three mechanisms supported by the overarching governance, opines Taylor, form the foundation of demonstrability – to both internal and external stakeholders – of the organization’s privacy program commitment and accountability.

“As simple as it may seem, everything anchors back to it,” says Taylor.

Merck case study: implementation

Accountability starts with accountable people…and for things to be truly accountable, people have to be measured on their success in upholding their piece of accountability.

—Scott Taylor, Merck & Co.

Accountability at Merck begins with the Corporate Compliance Objectives, relates Taylor. The set of objectives senior executives are measured against is very specific in terms of what and how. It is done on an annual basis, and it impacts compensation.

Importantly, cautions Taylor, “If you’re going to have a high-level objective that could impact people’s compensation, then it needs to be structured very well.”

The “binding mechanism” is the Merck Privacy Function Deployment Standard: the standard operating procedures (SOPs) that set out ten (10) elements for which senior executives are accountable. These SOPs detail what, how, and crucially, the tools and resources – including the assignment of Privacy Stewards – to support these efforts.

Supporting, and ensuring privacy is effective across, all the businesses, Merck has 209 Privacy Stewards around the world, each of whom undergoes self-assessments against the 10 elements. The stewards spend anywhere from 25% to 100% of their time supporting privacy “in a very specific way.” Taylor notes that Merck “provides their privacy stewards with very specific implementation standards and processes.”

Privacy Program Flowchart

The privacy implementation network at Merck (at a high level) comprises:

  • The Governance Body (comprising senior representatives appointed by the highest levels at Merck)
  • The Central Privacy Office which oversees all aspects of that framework strategy and end-to-end management supported and held accountable by
  • Critical Partners such as Procurement, Legal, Internal Audit, and Privacy Stewards

All act in concert in a shared accountability, and the Central Privacy Office maintains a continuous bi-directional dialogue with them. This forms the people strategy, says Taylor. And everyone “operates off the same song sheet. No matter who you go to, you’re going to get the same answer.”

Finally, a program such as this cannot be done manually. It requires well-developed standards, policies, and procedures, the control sets, and crucially, the technology to support it.

It is complex, admits Taylor, and to manage that complexity Merck utilizes assistive technology “in everything we do. So pretty much every requirement has a tool and an underlying process to support it. We’re trying to take the manual out of the process and facilitate through workflows and technologies.”

As Taylor emphasizes, accountability is not “words on paper or pretty slides.” Rather, it is a carefully architected program requiring the commitment of people, well-designed processes, robust toolsets, and the binding mechanisms that transform intent into earnest application that works in practice. As Taylor notes, it is “complex,” but the equation is simple: Accountable People (+ the processes, tools, and binding mechanisms to support them) = Accountable Privacy. The results, of which there are many, include:

  • Effective governance and oversight
  • Greater visibility into processes
  • Enhanced analytics
  • Data-based decision making
  • Solid metrics and KPIs
  • Enhanced risk identification and mitigation capabilities
  • Improved turn-around times (e.g., PIA, DSARs, ROPA)
  • Improved third-party due diligence
  • Better role-based privacy training
  • Ability to implement timely and continuous improvement, and
  • Transformation from reactive to proactive data privacy operations

The Merck case study truly is a masterclass in establishing privacy program accountability and demonstrability and what a mature program can be. It should be required viewing by all privacy professionals. Don’t wait. Access it here now.

Listen to the session audio


Evolution of Consent and Preference Management

The U.S. is really moving away from just that little cookie banner at the bottom to trying to think through all of the different choices you have to effectuate consent.

It’s raising very complex user experience questions.

—Justin Antonipillai, WireWheel

The granularity of evolving consent requirements, differences in definition and requirements across state laws, the added complexities of managing consent across multiple channels, and other factors have certainly placed a heavy burden on the adtech industry, publishers, and brands.

Now, increasing attention is being paid regarding the burden consent and preference management is placing on consumers and the deleterious impact to the user experience.

To discuss the evolution of consent and preference management, how we got here, and where it is going, WireWheel Founder and CEO Justin Antonipillai moderated a discussion at the 2022 Summer Spokes Technology Conference (held June 22-23).

Joining Justin to discuss Consent and Preference Management Across the Globe were BBB National Programs Senior VP, Privacy Initiatives, Dona Fraser, and Ruth Boardman, Co-Head of the Privacy Practice at Bird & Bird. Boardman is currently on the board of directors of the IAPP and a member of the UK government’s Export Council, advising it on data transfers.

The evolution of request for consent

The pre-GDPR Cookie Banner was “an overlay and just an invitation to click OK. Very unobtrusive,” begins Boardman:

The Drum's Privacy Policy

“But they are changing. They are getting bigger and giving more choice. This banner [illustrated below] has a choice of accepting all cookies or accepting only essential cookies.”

Cookie Consent Management

“This next example is a good illustration of an approach which came in with GDPR, but which is increasingly being challenged:

“The idea here is that you have a brief overlay on the homepage. Then, if you click through, you bring up the more detailed information where there’s a list of the particular purposes and third parties.

“The choice is to ‘accept’ or ‘manage cookies.’ To say yes to everything or to go into more options (including saying no).” And as Boardman observes, the use of color and the complicated process required to exercise more control nudge the user toward accepting everything (what could be called a dark pattern).

While quite common when GDPR became applicable in 2018, it is increasingly being challenged.

One requirement of the GDPR is that it should be as easy to withhold or withdraw consent [Article 7] as it is to give consent. Pressure from privacy activists and from data protection authorities is that this kind of user interface – requiring multiple steps to exercise choice – is arguably unfair, because you’re playing on the subconscious to nudge into accepting.

—Ruth Boardman, Bird & Bird

How these changes played out can be seen in the before and after illustrated below. You can see that Google has moved off ‘agree or customize’ to ‘accept or reject all.’ Notably, the options “are mutually positioned, the same color, and the same size,” notes Boardman. You also have ‘more options’ to exercise more sophisticated control.

The drivers of consent evolution

The evolution of consent management has been a combination of a number of factors:

  1. Most importantly, it’s the law. “It’s a combination of legislation and the ePrivacy Directive,” which says that using cookies or cookie-equivalent technologies – in fact, whenever information is stored or retrieved – you need consent unless it is for essential purposes. There is also a requirement for consent, actually dating back to 2011, reminds Boardman, if you’re doing (in broad terms) cross-site targeting. However, what consent means was altered with the GDPR and “that’s what’s driving this evolution.”

  2. This legislation has been coupled with regulatory guidance from supervisory authorities including the ICO (UK), CNIL (France), DSK (Germany), AEPD (Spain), and others. All “requiring much more transparency and much more granular user control.”
  3. There have also been a series of cases – some going to the CJEU (e.g., Planet49) – as well as a series of complaints by the “lobby group” noyb, founded by Max Schrems, that have been really influential.
  4. Industry guidelines, in particular the IAB Transparency and Consent Framework (TCF), developed by the adtech industry to allow participants to prove they meet GDPR obligations by demonstrating consent.

Why cookie banners look the way they do

The evolving appearance of cookie overlays is a function of the detailed GDPR consent requirements, says Boardman. Namely:

  • Consent must be specific and informed. “The individual needs to know the particular purposes for which they are giving consent at a detailed level,” such as distinguishing between cookies for analytics or targeted advertising.

Boardman notes that “the TCF goes even further and breaks it down into consent for targeting to display content versus targeting to customize ads versus consent in order to carry out measurements or attribution purposes, for example.”

  • The identity of every party relying on consent must be specified. This is why there are multiple screens referencing your partners and linking to a list that typically includes hundreds of parties.
  • The “consent has to be demonstrable and unambiguous” and requires “clear affirmative action.” This is a key driver of the move away from the simple banner reading “by continuing to use this site…,” which implies consent but does not provide demonstrable proof. And lastly,
  • Consent has to be freely given and revokable without detriment, specific to different processing operations; service cannot be dependent on consent; it must be as easy to withdraw as to give; and it must be separate from other terms.

“Revocable without detriment impacts the ability to have cookie and pay walls,” says Boardman. “There are currently cases pending and on their way to the Court of Justice looking at, if you try to have a paywall, how much you can charge per month per user before this starts to be a detriment.”
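
These consent requirements lend themselves to a simple record structure. The sketch below is a hypothetical Python illustration (the class and field names are our own, not the TCF's actual schema): an append-only ledger makes consent demonstrable per purpose and per party, and makes withdrawal as easy as granting.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    """One demonstrable consent event, per purpose and per party."""
    subject_id: str   # pseudonymous user identifier
    party: str        # the vendor relying on this consent
    purpose: str      # e.g. "analytics", "targeted_advertising"
    granted: bool     # True = opt-in via clear affirmative action
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

class ConsentLedger:
    """Append-only log: retains proof of every grant and withdrawal."""
    def __init__(self):
        self.events: list[ConsentRecord] = []

    def record(self, rec: ConsentRecord) -> None:
        self.events.append(rec)

    def is_granted(self, subject_id: str, party: str, purpose: str) -> bool:
        # Latest event wins: withdrawal is as easy as granting.
        for rec in reversed(self.events):
            if (rec.subject_id, rec.party, rec.purpose) == (subject_id, party, purpose):
                return rec.granted
        return False  # no affirmative action on record = no consent

ledger = ConsentLedger()
ledger.record(ConsentRecord("u1", "vendor-a", "analytics", True))
ledger.record(ConsentRecord("u1", "vendor-a", "targeted_advertising", True))
ledger.record(ConsentRecord("u1", "vendor-a", "targeted_advertising", False))  # withdrawal

print(ledger.is_granted("u1", "vendor-a", "analytics"))             # True
print(ledger.is_granted("u1", "vendor-a", "targeted_advertising"))  # False
```

Because every grant and withdrawal is kept rather than overwritten, the ledger itself is the "demonstrable proof" the regulation asks for.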

Tough on adtech, tough on consumers too

These requirements have implications for user experience that can be equally burdensome. Not having granular choice is a burden, but granular choice is also a burden on the data subject, who must wade through a great deal of information.

Breaking down individual choices and processing in that granular way means that consumers must interact multiple times before they get to what they want to do.

—Justin Antonipillai, WireWheel

“It does impose a burden on the user. But my experience has been that when organizations try to raise that argument… and ask for consent in a lighter touch global way…it doesn’t get a very sympathetic hearing,” opines Boardman. “The response is ‘maybe you shouldn’t do as much intrusive processing’…and to push the challenge back to industry.”

The proposals recently published in the UK pick up on this, says Boardman, and “as a first step, proposes that you don’t need to ask for consent for analytics cookies, but this is coupled with the requirement that consent won’t be taken out unless and until various well-developed technologies allow users to have that degree of control.

“The difficulty with the current approach is that it has clearly been designed to meet the obligations to prove consent in a way which is very granular,” and it is clearly designed for this purpose and not the user. “That’s the challenge.”

“There seem to be a fair number of assumptions that consumers understand all of this,” opines Fraser. So, we’re putting all these choices in front of them presuming they know what any and all of this actually means.

For me, if it’s a choice about having advertising targeted to me, that may actually be a distraction from why I’m on the site in the first place. People process things very differently.

Even for those of us who understand it, it is sometimes overwhelming to the point of ‘did I just opt out, or what did I just opt in to?’ And more importantly, how do I know my choices are even being honored?

—Dona Fraser, BBB National Programs

WireWheel Consents and Preferences Flowchart

The consent and preference infrastructure

Antonipillai proffers that you have to think about technology that allows you to bring in consent and preference signals from multiple channels: not just web or mobile app, but connected TVs, cars, and IoT devices.

This means having a way that you can look at a single universal consent and preference solution.

And not only capture those consent signals, but prove them and keep the records behind them. One benefit from the consumer experience perspective is that by unifying the signal, you move beyond capturing it channel by channel, over and over again, and begin alleviating the burden on consumers.

But it takes more than just a cookie tool; it takes a central platform to actually look at the choices across your channels and brands.
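
One way to picture such a platform is a unified profile that ingests signals from any channel and keeps an audit trail as proof. This is a minimal, hypothetical sketch – the names and structure are illustrative, not an actual product's design:

```python
class UnifiedConsentProfile:
    """One consent profile per person, fed by signals from every channel."""
    def __init__(self, subject_id: str):
        self.subject_id = subject_id
        self.choices: dict[str, bool] = {}            # purpose -> granted
        self.audit: list[tuple[str, str, bool]] = []  # (channel, purpose, granted)

    def ingest(self, channel: str, purpose: str, granted: bool) -> None:
        """Record a signal from any channel; the latest choice wins globally."""
        self.choices[purpose] = granted
        self.audit.append((channel, purpose, granted))  # record keeping = proof

    def allows(self, purpose: str) -> bool:
        return self.choices.get(purpose, False)

profile = UnifiedConsentProfile("u42")
profile.ingest("web", "personalization", True)
profile.ingest("connected_tv", "personalization", False)  # later withdrawal on TV

# The withdrawal made on the TV now applies on the web too:
print(profile.allows("personalization"))  # False
print(len(profile.audit))                 # 2 -- full history retained
```

The point of the sketch: because the profile is shared, the consumer states a preference once rather than re-answering the same banner on every device.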

If the notion is that consent can lead to better customer data information, isn’t that what companies want so they can build that relationship? Build consumer trust?

But, having that first-party user data – and being able to use it to the best of your ability to build that relationship – also means knowing you have a greater responsibility with that data.

—Dona Fraser, BBB National Programs

It’s still about trust

Fraser notes that most of the companies BBB National Programs deals with are international companies that are trying to create a streamlined process, not just for their users, but for their internal backend systems as well.

If they’re trying to create one website, one mobile app, that does it all everywhere, knowing that they have to comply with a myriad of laws, it’s a huge challenge and a burden. But that said, your organization’s commitment to privacy and data ethics is the larger question.

If your organization is not first committed to dealing with this on a day-to-day ethics level with transparency, the consent management process isn’t going to work. It’s not going to have the veracity that users need in order to share their data willingly.

The challenge that we are still going to see is explaining to consumers why they’re opting in.

— Dona Fraser, BBB National Programs

“The fact that you just want to browse a website and are faced with these questions and procedures can be an overwhelming experience,” continues Fraser, and “technology may offer a way for us to streamline this, but state laws are going to force our hand. The problem is the cost of doing business,” she says.

“BBB National Programs tends to work with small to medium sized companies and they don’t necessarily have the resources for dealing with this. They struggle to go beyond just checking the compliance box and look to manage customer relations in another way, but I don’t think companies can separate that anymore.”

Listen to the session audio

  • Privacy Law Update

Privacy Law Update: August 8, 2022

Stay up to date with this weekly release covering key developments on data privacy laws, technology, and other hot privacy topics!

Newsworthy Updates

CJEU Rules on Interpretation of EU GDPR Special Categories of Data

The Court of Justice of the European Union rendered a decision clarifying how indirect disclosure of sexual orientation data is protected under Article 9 of the EU General Data Protection Regulation. The court explained such data disclosure falls under the special categories of personal data in Article 9 after consulting Article 4(15) provisions for “data concerning health.” TechCrunch reports on how the decision could have wider implications across a variety of online platforms.

How CPOs Can Protect Medical Data Privacy in a Post-Dobbs America

When the US Supreme Court overturned the landmark 1973 Roe v. Wade decision last month with its decision in Dobbs v. Jackson Women’s Health Organization, it immediately raised the stakes on medical data privacy for individuals and their employers. It also increased the importance of protecting medical data privacy for a wide range of healthcare-related businesses, including insurers, healthcare providers, and the makers of fitness trackers and wellness apps – especially fertility tracking apps.

Meta Repeats Why It May Be Forced to Pull Facebook From EU

Meta Platforms Inc. reiterated its warning that it may have no choice but to pull its popular Facebook and Instagram services from the European Union if a new transatlantic data transfer pact doesn’t materialize. Meta could face an imminent data flow ban from Ireland’s data protection watchdog, which oversees a number of Silicon Valley tech giants based in the country, in a decision that risks impeding transatlantic data flows. The Irish Data Protection Commission could issue its decision on a possible ban of EU-US data transfers under so-called standard contractual clauses in the next three months, Meta said in a regulatory filing.

India Nixes Privacy Legislation

India’s government on Wednesday withdrew a data protection and privacy bill which was first proposed in 2019 and had alarmed big technology companies such as Facebook and Google, announcing it was working on a new comprehensive law.

Privacy Legislation

Future proof your privacy program with WireWheel’s Trust Access and Consent Center and WireWheel’s Privacy Operations Manager.

Request Demo
  • Privacy

Data Security vs Data Privacy: Is There a Difference?

Could you sum up the difference between data security and data privacy in a single sentence?

It’s not impossible, but it would be quite a long sentence. So instead, we’ve written this article, which will explain the difference in some detail.

We’ll look at what’s involved in data security and data privacy, note a couple of key distinctions between them, and give you a few tips for best practice for the implementation of data policies.

What is data security?

Data security – also known as data protection – concerns the prevention of unauthorized access to data. This includes malicious access by third parties such as cybercriminals and hackers, as well as abuse by internal employees or contractors. But it’s also about reducing the risk of human error leading to data breaches.

With so many of today’s businesses pursuing a digital enterprise transformation, there’s never been a more important time to get it right. The typical processes used when implementing data security policies include:

  • Multi-factor authentication
  • Access control
  • Network security
  • Data encryption
  • Activity monitoring
  • Data masking
  • Data erasure
  • Breach response protocols
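
To make one of the controls listed above concrete, here is a minimal sketch of data masking. The helper functions are hypothetical illustrations, not a production-grade implementation:

```python
import re

def mask_email(email: str) -> str:
    """Keep the first character and the domain; mask the rest of the local part."""
    local, _, domain = email.partition("@")
    if not local or not domain:
        return "***"
    return local[0] + "***@" + domain

def mask_card(number: str) -> str:
    """Show only the last four digits of a card number."""
    digits = re.sub(r"\D", "", number)  # strip spaces, dashes, etc.
    if len(digits) < 4:
        return "****"
    return "**** **** **** " + digits[-4:]

print(mask_email("alice@example.com"))   # a***@example.com
print(mask_card("4111 1111 1111 1234"))  # **** **** **** 1234
```

Masking like this lets support staff or test environments work with realistic-looking records while the sensitive values themselves stay out of view.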

With robust data security and protection strategies in place, your business can ensure that it avoids data corruption, loss or theft. It can help prevent some of the cloud security mistakes we regularly see that can severely damage an organization’s reputation.

What is data privacy?

Data privacy, on the other hand, is about how the information collected by an organization is used. To be more specific, it relates to the collection, processing, storage and handling of data.

Data privacy management is crucial because businesses and other organizations have a legal responsibility to handle information about customers, employees, and other stakeholders in a secure and sensitive way.

The precise details of data privacy law vary from country to country, but it’s standard for there to be some kind of rule against unauthorized access to data or disclosure of personal information.

Neglecting to meet your responsibilities under the law could lead to your business suffering financial penalties or lawsuits. At the very least, there’s a real danger of your organization taking a severe hit to its reputation. And if that happens, it can be very difficult to rebuild customer trust. So it’s vital to ensure that you take all necessary steps to implement rigorous data privacy protocols.

Differences between data security and data privacy

Although the two are related, they are not the same. There are a couple of important distinctions to be drawn between data security and data privacy.

Different aims in terms of safety

Data security places an emphasis on developing processes and protocols to prevent unauthorized access to data by hackers and other cybercriminals. Meanwhile, data privacy is about controlling who is permitted access to data, what the permitted use cases are, and how to ensure personal information is not misused, by defining policies and creating appropriate controls.

Data security, in other words, is a prerequisite for data privacy. But it doesn’t necessarily work the other way around. Theoretically, you could have data security without data privacy. For example, an online retailer could have very robust data security protecting transactions on the payment side of their online shop. But without a working ecommerce privacy policy in place, there’s nothing to stop a dishonest employee from selling on customer data.

Who is legally responsible can be different

We’ve mentioned that the laws in this area do vary considerably from place to place. In many cases, though, the question of who is responsible for data security and data privacy may not be as simple as it first appears.

In most cases, the legal responsibility for data security lies unambiguously with the company or organization storing the data. However, quite often the user is expected to take a degree of responsibility for data privacy themselves.

Users have a large amount of control over the decision of how and where to share their data, and this is generally reflected in legislation. Of course, organizations the user has shared their information with will still be expected to have strict processes in place to protect the privacy of data shared with them.

Best practice for data security and data privacy

There are a number of things you can do to make sure your organization is meeting its responsibilities in storing and handling data.

Keep up to date with the law

This is the most important and most fundamental element of any data security policy or data privacy program. Being aware of which laws apply to your circumstances and following them to the letter is vital.

One important aspect of this issue to bear in mind is that laws in other jurisdictions can apply to you if you do business in that jurisdiction. For example, any US organization dealing with customers resident in the European Union will need to comply with the General Data Protection Regulation (GDPR) rules, which came into force in May 2018.

The law surrounding data security and privacy can be very complex, which leads onto the next point.

Hire professional experts

It’s best to have dedicated legal and IT experts to consult on and implement your policies. Ideally, this should happen at the beginning of the process, while you develop your solutions. Many larger organizations already have the skilled staff available for this task, of course, but smaller ones may need to outsource it.

This may seem expensive, but it could cost you a lot more in the long run if you get it wrong.

Don’t collect unnecessary data

It may be tempting to ask users to provide all sorts of data just in case. But generally speaking, this is not good practice. The more data you collect, the more can go wrong with handling it. Collect only the minimum data you need for your purposes.

One added advantage of this is that applying this principle at scale could save you money on bandwidth and storage costs. It’s also more pleasing for users, as it cuts down on extra fuss.
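
Data minimization can also be enforced mechanically. Below is a hedged sketch assuming a per-purpose allowlist of fields (the purpose and field names are illustrative): anything the stated purpose does not require is dropped before storage.

```python
# Hypothetical per-purpose allowlists: collect only what each purpose needs.
ALLOWED_FIELDS = {
    "newsletter_signup": {"email"},
    "shipping": {"name", "email", "address", "postcode"},
}

def minimize(purpose: str, submitted: dict) -> dict:
    """Drop any submitted field that the stated purpose does not require."""
    allowed = ALLOWED_FIELDS.get(purpose, set())
    return {k: v for k, v in submitted.items() if k in allowed}

form = {"email": "a@example.com", "phone": "555-0100", "birthdate": "1990-01-01"}
print(minimize("newsletter_signup", form))  # {'email': 'a@example.com'}
```

An unknown purpose yields an empty allowlist, so nothing is kept by default – a fail-closed choice that matches the minimization principle.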

Automate your processes

Whether it’s about traditional PBX phone systems or the most cutting-edge machine learning tools, automating business processes is the perennial efficiency builder. And it applies just as much in this case.

The more of your data security and privacy tasks you can automate, the lower the risk of human error. It’s not always easy for employees to remember all of the compliance rules they have to stick to, so automating as much of the process as possible takes a lot of the burden off them. As a result, you’ll have fewer data breaches and less stressed staff.

Implement rigorous security procedures

There’s a reason why there are so many different types of reports in software testing. Each has a role to play in creating and maintaining standards in the final product. Using an intelligent mixture of software and network access protocols in your organization is key to making your data security setup a success.

Consider safety tools like multi-factor authentication, access control and data encryption, yes – but don’t overlook more basic necessities. For example, do you have robust procedures in place for updating your antivirus software? Do your staff always use a secure private network, without fail?

Even something as simple as social media can catch out the unwary. It’s best to limit the amount of information you share on social sites, since they can be an entry point for malicious actors.

Limit employee access to data

In today’s fast-moving business environment, where we’re using any number of tools like remote working software or an enterprise VoIP system to communicate, it can be easy to get a little careless with this. Too often, information flies from one part of an organization to another without much thought about how it gets there. Or even whether it needs to get there at all.

The fact is, it’s important to give careful consideration to exactly who needs access to data and who doesn’t. Partly, this is a question of our old friend human error again. The more individuals have access to sensitive information, the more likely it is to be leaked accidentally.

Make consistent decisions about who needs access to data, and monitor that access. Training employees on issues like consent and preference management can also be useful in getting everyone on board with the process.
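
A minimal illustration of this principle is a role-based check that logs every access attempt for monitoring. The roles and dataset names below are hypothetical:

```python
# Hypothetical least-privilege sketch: access to customer data is granted
# per role, and every attempt -- allowed or not -- is logged for review.
ROLE_PERMISSIONS = {
    "support": {"customer_contact"},
    "finance": {"customer_contact", "billing"},
    "engineering": set(),  # no customer data by default
}

access_log: list[tuple[str, str, bool]] = []

def can_access(role: str, dataset: str) -> bool:
    """Return whether the role may read the dataset, logging the attempt."""
    allowed = dataset in ROLE_PERMISSIONS.get(role, set())
    access_log.append((role, dataset, allowed))  # monitor every attempt
    return allowed

print(can_access("finance", "billing"))      # True
print(can_access("engineering", "billing"))  # False
print(len(access_log))                       # 2
```

Unknown roles fall through to an empty permission set, so the check fails closed, and the log gives you the monitoring trail to spot unusual access patterns.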

Get your employees on board

In fact, training is a good idea all round. Organize regular training courses on data security and data privacy so that everyone is aware of the importance of good practice.

It’s best if all employees have a good understanding of your organization’s policies, so that they remain front and center during everyday working life. Emphasize the importance of reporting any data breaches early to prevent more serious repercussions later.

Final thoughts

We spend a substantial proportion of our work time dealing with data: recording customer contact details, estimating the cost for an AWS instance type, and generating software test results. It’s easy to lose sight of the fact that safeguarding information is one of the most crucial duties of any modern organization as we focus on day-to-day tasks.

Nevertheless, data is one of the most valuable assets we have. Safeguarding it is not only a legal responsibility, but also key to any business’s reputation. So why not take some time today to review your data policies and make sure they’re the best they can possibly be?

Jessica Day is the Senior Director for Marketing Strategy at Dialpad, a modern business communications platform that takes every kind of conversation to the next level—turning conversations into opportunities. Jessica is an expert in collaborating with multifunctional teams to execute and optimize marketing efforts, for both company and client campaigns. Jessica has also written for other domains such as Data Privacy Manager and Guru.

  • Privacy Law Update

Privacy Law Update: August 1, 2022

Stay up to date with this weekly release covering key developments on data privacy laws, technology, and other hot privacy topics!

Newsworthy Updates

CPPA says preemption must not be in any federal bill

The proposed American Data Privacy and Protection Act feels so close and yet so far away. The comprehensive privacy bill is on the cusp of a U.S. House floor vote, a first for any federal privacy proposal. But the bill’s fragile nature is being tested at a crucial point in the legislative process as California lawmakers and stakeholders prefer the bill fail or be refit in order to preserve the California Consumer Privacy Act and its successor, the California Privacy Rights Act.

A look at Canada’s new federal privacy legislation, Bill C-27

In June 2022, while privacy professionals in Canada were still contemplating Bill C-26 on cybersecurity, the much-anticipated Digital Charter Implementation Act, 2022 — Bill C-27 — was introduced by the federal government. It is a reintroduction and, some may agree, an improvement of Bill C-11, first introduced in 2020, which died on the order paper as a result of the federal election in 2021.

China PIPL: Data export regime starts to take form

China’s Personal Information Protection Law lays out strict limitations on cross-border transfers of personal information (PI). Finally, more than nine months after the PIPL came into effect, three new regulatory developments provide guidance on the administrative procedures and detailed rules implementing the cross-border transfer rules.

Examining the intersection of data privacy and civil rights

For historically marginalized groups, the right to privacy is a matter of survival. Privacy violations have put these groups at risk of ostracization, discrimination, or even active physical danger. These tensions have long pre-dated the digital age. In the 1950s and 1960s, the government used surveillance programs to target Black Americans fighting against structural racism, with the Federal Bureau of Investigation’s (FBI) Counterintelligence Program (COINTELPRO) targeting Dr. Martin Luther King, Jr. and members of the Black Panther Party. During the HIV/AIDs epidemic, LGBTQ+ individuals were fearful that with an employer-based healthcare system, employers would find out about a doctor’s visit for HIV/ AIDS and that individuals would then face stigma at work or risk losing their jobs.

Data privacy is the future of digital marketing: Here’s how to adapt

The world of digital marketing is approaching a new normal. Consumer privacy is no longer just a movement to monitor, but one that is reshaping the industry through regulation and action by the tech giants. Major brands are now coming to realize that the way they organize, invest, think about audiences, and engage with consumers, will be reorganized around people’s privacy preferences – rendering many traditional digital marketing strategies fundamentally different.

Privacy Legislation

Opposition to ADPPA Intensifies
Senate Commerce Chair Cantwell continues to oppose the American Data Privacy and Protection Act (ADPPA), significantly undermining its chances for further progress in this Congress. In comments to The Spokesman, Senator Cantwell stated that “[i]f you’re charitable, you call it ignorance” regarding the House’s approach to privacy enforcement and suggested that civil rights groups supporting ADPPA have “been infiltrated by people who are trying to push them to support a weak bill.” In separate comments to the Washington Post, Cantwell expressed that she is not planning to bring ADPPA to a markup because “I don’t even think Nancy Pelosi has plans to bring it up, so pretty sure we’re not going to be bringing it up.”

Senate Commerce Marks up Child Online Privacy & Safety Legislation 
Mere hours after ADPPA advanced from the House E&C Committee, Senator Cantwell called a Senate Commerce markup of two bills, S.3663, the Kids Online Safety Act (KOSA) (Blumenthal (D-CT) & Blackburn (R-TN)) and COPPA 2.0, the Children and Teens’ Online Privacy Protection Act (Markey (D-MA) & Cassidy (R-LA)). At the July 27 hearing, KOSA advanced unanimously and COPPA 2.0 advanced on a voice vote with some Republicans opposing. Notably, Ranking Member Wicker expressed his support for the ADPPA, opposition to COPPA 2.0, and areas for future improvement in KOSA. Other Republican members expressed concern about the scope of FTC rulemaking authority in the COPPA 2.0 bill.

California Privacy Protection Agency Opposes ADPPA in Special Board Meeting
At a special California Privacy Protection Agency meeting on July 28, the CPPA board unanimously adopted three motions related to federal privacy legislation:

  1. Oppose the American Data Privacy and Protection Act as currently drafted
  2. Oppose any federal bill that seeks to preempt the CCPA or establish weaker privacy protections
  3. Authorize Agency staff to support federal privacy protections that do not preempt the CCPA or that create a true “floor” for privacy protections that states can build on in the future.

Board members raised the following concerns about the ADPPA:

  • Chairperson Urban: Expressed concern that the ADPPA would cause Californians to lose the privacy rights they currently enjoy today. She further expressed concern about losing the CCPA (as amended by the CPRA)’s constitutional floor, which she called a direct response to industry efforts to weaken the bill.
  • Board Member Thompson: Stated that the ADPPA represents a “false choice” by treating privacy rights as if they are limited in supply, wrongly arguing that Californians’ strong privacy rights must be taken away in order to provide weaker rights federally. He further questioned the ADPPA requirement that unified opt-out signals must be authenticated.
  • Board Member de la Torre: Raised concerns that the ADPPA would jeopardize the ability of California to receive a state-specific EU-adequacy determination. She further raised concerns about the ADPPA overruling privacy laws of local municipalities, not just states. She also argued that the preemptive effect of the ADPPA has not yet been fully explored and that it could strike down laws that protect women in the wake of the Dobbs decision.
  • Board Member Sierra: Raised concerns that the ADPPA would limit the enforcement effectiveness of the Agency.
  • Board Member Le: Cited to CPPA Deputy Director of Policy and Legislation Maureen Mahoney’s Staff Memorandum to argue that the ADPPA is weaker than the CCPA because the ADPPA: (1) would deprive Californians of the right to opt out of automated decisionmaking; (2) covers fewer service providers (excluding those that perform work for government entities); (3) does not clearly cover inferences; and (4) requires impact assessments for fewer types of businesses.
  • Executive Director Soltani: Unequivocally stated that the ADPPA would be weaker than the CCPA on substance. He further argued that California’s existing law is better interoperable with other state and international privacy frameworks.

Future proof your privacy program with WireWheel’s Trust Access and Consent Center and WireWheel’s Privacy Operations Manager.

Request Demo
  • Marketing
  • Regulations

Data Governance, Metadata Management, and Consent and Preference Management Software

Consumer data drives business. However, pressures from regulators and consumers often complicate the data landscape, making it difficult for organizations to collect and use this information. Data governance, metadata management, and privacy software can function independently of one another or together to create a robust technology stack that drives regulatory compliance, consumer satisfaction, and effective marketing.

Data Governance Software

What is data governance software?

Data governance is the process of managing and organizing data so that it is available, usable, consistent, accurate, and secure at an organization-wide level. Data governance serves as a foundation for effective data collection, analysis, and decision-making.

Data governance software is a tool that allows organizations to organize, manage, and protect the sensitive information they collect and want to use to enhance their business. Some data governance tools can also automate audits, optimize workflows, and demonstrate compliance.

What does data governance software do?

Data governance software benefits include:

  • Data visibility and quality – By connecting and integrating information across systems, data governance software tears down organizational data silos that can lead to inaccuracies. Improved data visibility allows for holistic reporting and decision-making.
  • Data security and compliance – Data governance software monitors regulatory updates in order to flag risks and provide actionable steps to achieve compliance in real time.
  • Automation – Automating audits and other data management processes helps organizations run efficient data governance programs while also saving time and money

Who uses data governance software?

In large, enterprise organizations there is likely a data governance office with a data governance team. In a typical enterprise, a data governance team could be made up of data analysts, compliance specialists, and data governance architects.

Metadata Management Software

What is metadata management software?

Metadata management is a subset of data governance that involves the details that describe collected consumer data. This information typically relates to attributes around the data collection: its origin, current location, owner, access controls, and audit trails. Metadata can inform organizations about the value of the content they collect from consumers and accordingly govern the appropriate use of this data.

What does metadata management software do?

Metadata management software can increase visibility and understanding of a company’s data across various teams and systems. This information can make decisions about how data may be used easier, promoting efficiency and compliance. The software also lets users edit and oversee data categorization, simplifying the management and retrieval of this information.

Without metadata management software, companies will struggle to determine what content they have and how it can be used. As privacy legislation evolves and more and more regulation is put around data, metadata management systems are becoming a critical piece of any organization’s data infrastructure.
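
The descriptive attributes discussed above – origin, current location, owner, access controls, audit trails – can be pictured as a simple record. The schema below is hypothetical, for illustration only:

```python
from dataclasses import dataclass, field

@dataclass
class DatasetMetadata:
    """Descriptive attributes of a collected dataset (hypothetical schema)."""
    name: str
    origin: str        # where the data was collected
    location: str      # where it currently resides
    owner: str         # the accountable team or person
    access_roles: set = field(default_factory=set)   # who may read it
    audit_trail: list = field(default_factory=list)  # recorded access attempts

    def record_access(self, who: str) -> bool:
        """Check access against the metadata and append to the audit trail."""
        permitted = who in self.access_roles
        self.audit_trail.append((who, permitted))
        return permitted

meta = DatasetMetadata("crm_contacts", origin="web_form",
                       location="warehouse/crm", owner="marketing",
                       access_roles={"marketing", "support"})
print(meta.record_access("support"))  # True
print(meta.record_access("intern"))   # False
```

Even this toy record shows the governance payoff: knowing a dataset's origin and owner is what lets an organization decide what uses of it are appropriate.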

Who uses metadata management software?

Metadata management software helps data engineers and IT teams who need to constantly manage and interact with data. From sales and marketing tools to internal HR, metadata management is critical in understanding how data can be used in an organization.

Consent and Preference Management Software

What is consent and preference management software?

Consent and preference management software can work within a data governance framework to help define the ways that consumer and user data can be used across an organization. Some of the best examples allow users to see what kind of information is stored, how it is used, and where it is transferred. Consent and preference management often falls into the privacy software space and can also help automate privacy assessments and data requests.

What does consent and preference management software do?

Consumers want to know that the companies they give their sensitive information to will not abuse the exchange. Consent and preference management software helps companies build the infrastructure they need to give consumers confidence and control over how their consent is managed.

Who uses consent and preference management software?

Privacy professionals, IT teams, and data governance professionals all play a part in the success of a consent and preference management platform. The software is used as a tool to help implement, automate and enforce the usage of data through an organization.

Privacy Laws are Driving Unification Across Organizations

Privacy laws are making it critical for organizations to truly understand and consolidate the data they are collecting and how they are using it. Data acquisition, creation, storage, and use can no longer be managed team by team or business unit by business unit; they should adhere to centralized guidelines that meet privacy and regulatory obligations.

Combining the power of data governance, metadata management, and consent and preference management enables organizations to unify, regulate, and utilize data without compromising the trust of their users. When these three types of software work together, companies can confidently use consumer data in an effective and reliable manner. Marketing teams can still do marketing, sales teams can still make sales, and consumers can trust that your organization respects their rights.

  • Marketing
  • Regulations

Consent – Beyond Compliance

Consent management is no longer optional. Laws and regulations have made implementation mandatory, with several state laws requiring that consumers be able to opt in to or opt out of targeted marketing. Consent and preference management is top of mind for brands, publishers, and the whole of the adtech and martech ecosystem. The concerns go well beyond compliance.

Two camps seem to be emerging. One camp views these requirements solely as a burden that impedes their go-to-market capabilities, but with which they are forced to comply. The second sees opportunity and is already positioning for what they view as competitive advantage.

To discuss Consent – Beyond Compliance, Ann Smith, WireWheel Director of Demand Generation moderated a discussion with Arnaud Gouachon, Contentsquare Chief Legal Officer, and Kara Larson, 6sense Principal Privacy & Compliance Counsel at the 2022 Summer Spokes Privacy Technology Conference (held June 22-23).

Consent and preference management is a top-of-mind challenge

The overall context in my view, goes beyond the legislation. Legislation may be the consequence or may be the cause, but it’s only one part of the trend that – whether brands like it or not – is coming. As an actor in this ecosystem, you can decide what you want to do with it, but it’s coming.

—Arnaud Gouachon, Contentsquare

“Consent and preference management is top of mind for a lot of our customers,” says Larson. “We get a lot of questions around ‘what do I need to do for a cookie banner, and how do I allow this to fire?’ We’re also starting to see a lot more questions about consent, especially with respect to the CPRA, and based on the just released draft U.S. federal privacy bill (ADPPA).”

The first question 6sense is asked today is about straight-up compliance. Clients want to understand how to use a product and remain compliant, says Larson. Second, they ask about compliance across different jurisdictions. Interestingly, however – at least with respect to the B2B space – the definitions under the ADPPA exclude some B2B data.

Managing consent “goes beyond legislation and regulation,” opines Gouachon. “Obviously, we’ve seen a lot of new laws and regulations that are becoming progressively stricter. But beyond that there are, at least in the short term, some novel challenges.”

“Most striking to our team was the number of activists that are really passionate about privacy topics and that are getting increasingly active,” he notes. “I think everyone has heard of noyb, the Max Schrems organization. They recently launched an AI-based automation tool to automatically file complaints with local data protection authorities.”

Add to this increased scrutiny of marketing practices and increasingly significant fines – particularly concerning cookies, transparency, data controlled by users, and dark patterns – and “with all this,” notes Gouachon, “the tech players, the brands, the tools, and the adtech ecosystem have had some ambiguous responses.

“Some have been very clear about their intent. Others are trying to duck their heads and take a ‘wait and see’ approach.”

A cookieless future

State laws have been creating exemptions for B2B or business-context data. They recognize that there is perhaps a lesser interest in keeping that data private or closely held. It’s the difference between dropping your business card on the ground versus your driver’s license:

In one of those cases, you’re going to run back and try to find it.

—Kara Larson, 6sense

“With respect to the federal privacy legislation,” continues Larson, “currently there is preemptive language in the proposed draft of the bill…so we aren’t looking at a patchwork of different laws and standards.”

“But regardless of what the patchwork of U.S. Privacy law looks like – and whether or not there’s a federal privacy law – we are seeing technology players start to make their own independent moves. Particularly when we get to third-party cookies.”

“We’ve seen Firefox do away with them, Apple and Safari do away with them, and Google has announced plans to also get rid of them. So, when we’re talking about third-party cookies, there’s not even the opportunity to try to collect consent anymore once they go away.”

“I can only agree,” says Gouachon. “The trend is here to stay. It is targeting mostly third-party cookies right now (but first-party cookies may be next).

“Brands and the technology solution providers can wait, or they can try to get ahead of the game. Last year Contentsquare launched our first cookieless solution – it doesn’t rely on any cookie technology at all (third-party or first-party). This is in response to concerns that we see from some customers and brands. They are, in effect, already asking for solutions like these ahead of the legislation.”

Less data, more relevance for a better UX

The advent of cookies was a seismic shift for marketers

The amount of data and tracking you could do was enormous and no one at any point stopped to ask ‘do we need all of this data? If someone is shopping for shoes, do I really need to know that they drive a Kia and have a dog?’

There hasn’t been a lot of internal reflection about how useful it is.

—Kara Larson, 6sense

There is going to be a shift, offers Larson. Not necessarily one-to-one replacement but rather a holistic approach. “Maybe you’re concerned about display ad reach, specifically with targeting,” absent cookies. To compensate you can start to look at “relationships with these so-called ‘walled gardens’ like LinkedIn and Facebook because they are not relying on third-party cookies to do that identity resolution. You’re also looking at alternative identifiers.”

Several solution providers like LiveRamp are trying to solve this, notes Larson. In another approach, 6sense is making contextual marketing available to its customers – ads relevant to what a consumer is currently viewing on a particular webpage. (If someone is shopping for shoes, you can forget about the dog.)

“What resonates most with me is the types of data brands are collecting about users,” responds Gouachon. At Contentsquare, “we think there is a way to personalize your approach and experience without compromising privacy.”

“How much demographic information do you really need about your users?” In a [brick and mortar] store a shopper isn’t asked dozens of questions about their personal information before a salesperson helps them, notes Gouachon. “So, we are really trying to replicate that experience and come as close to that experience as possible.”

There’s a misconception that online visitors are invisible. Not true. They give you real time feedback through all of their digital interactions without having to know anything else.

Understanding what they are trying to achieve online and how they want to go about it is really the key to delivering a superior customer experience.

—Arnaud Gouachon, Contentsquare

This is what our product team calls an intent-based approach, explains Gouachon. “We’re really interested in fixing what’s not working for them in their online journey and addressing those aspects that are frustrating the users and moving them away from their goal.

“While complying with legislation is table stakes – a no brainer – it’s probably much more valuable to innovate solutions that anticipate people’s desire for privacy. That trend is here to stay and it’s not just a legal or regulatory trend…. It’s a desire that most users have.”

Positioning for the future of privacy compliance

As discussed here, marketing and privacy have been on a collision course for some time now. But the deprecation of cookies does not have to be a win-lose scenario. It can be win-win. (Apple and others clearly see it that way.)

Indeed, there are benefits to gain with a cookieless approach. The benefit is really “an opportunity to build a trusted relationship with customers, partners, end users, employees, regulators, and NGOs,” says Gouachon. “What we sometimes call ‘digital trust.’”

While there are many initiatives brands and stakeholders can implement, it begins with transparency, insists Gouachon: “Moving from checkboxes and cookie consent banners to a more centralized and user-friendly ‘privacy center’ approach.”

“It’s not compliance for compliance’s sake,” avers Larson. “It’s about how you want to position yourself and your brand. Do you want to be at the forefront in thinking about these issues before they occur? Before they become a problem? Or do you want to spend all of your time trying to play catch-up?”

Anything you do to increase the transparency of what’s happening with data – how it’s going to be treated, how it is going to be protected – raises that overall trust level.

And that’s how you start to flip privacy from roadblock to asset that builds your brand.

—Kara Larson, 6sense

This requires technology that “can respect and respond to a global privacy setting or ‘do not track’ signal,” cautions Larson. “And we have seen with these proposed draft regulations for the CPRA that they are doubling down on the requirement to respect those signals.” The ability to do that effectively is, for Larson, “a key differentiator for a consent and preference management platform.”

She warns that trying to do this manually is a nonstarter. Automation is a prerequisite, not just to compliance, but to going beyond compliance and thriving in the fast-approaching world of consumer-focused transparency and trust.
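Mechanically, honoring a signal like Global Privacy Control is simple: per the GPC specification, participating browsers attach a `Sec-GPC: 1` header to requests (and expose `navigator.globalPrivacyControl` to page scripts). A minimal server-side sketch of automated handling – the function names and header-dict shape are illustrative, not any particular framework’s API – might look like:

```python
def honor_gpc(headers: dict) -> bool:
    """Return True if the request carries a Global Privacy Control opt-out.

    Per the GPC spec, participating browsers send the header ``Sec-GPC: 1``.
    ``headers`` is assumed to be a plain dict of request headers; lookup is
    made case-insensitive since header casing varies by client.
    """
    normalized = {k.lower(): v.strip() for k, v in headers.items()}
    return normalized.get("sec-gpc") == "1"


def should_fire_targeting_tags(headers: dict, user_opted_out: bool) -> bool:
    """Hypothetical gate before loading targeted-advertising tags.

    Treats either an explicit stored opt-out or a GPC signal as a
    'do not sell/share' instruction.
    """
    return not (user_opted_out or honor_gpc(headers))
```

The point of automating this check at the edge of every request – rather than handling opt-outs manually – is exactly Larson’s: the signal arrives on every page view, so only a programmatic gate can respect it consistently.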

Consent is about more than cookies and brands are looking at ways to get ahead of compliance in order to turn these challenges into opportunities. Looking to have more conversation around consent and the future of privacy? Let’s talk.

Listen to the session audio

  • Privacy
  • Regulations

What to Expect from Privacy Laws in 2023

Moderator Michael Hahn, EVP and General Counsel of IAB and IAB Tech Lab, brought together a panel at the 2022 Summer Spokes Privacy Technology Conference (held June 22-23) to discuss the impact of various state privacy laws on digital advertising activities.

Joining Hahn for the session “What to Expect in 2023,” which offers practical guidance on managing compliance, are WireWheel Founder and CEO Justin Antonipillai; Sundeep Kapur, Senior Associate, Cyber, Privacy & Data Innovation at Orrick; and Crystal Skelton, Senior Corporate Counsel, ZipRecruiter®.

One button or two?

It depends on the ways in which a business is selling or sharing data. “If selling data in ways beyond sharing for targeted advertising,” says Skelton, “then it might make sense for a company to offer two. But if they’re conducting solely targeted advertising, I would expect to see one.”

“But it’s also a balancing act. I often wonder whether having two separate links could make it less likely that consumers will exercise both of their opt-out rights. I do appreciate that the draft CPRA regulations provide an alternative opt-out link option…but, as many of you already know, there are additional requirements that come with that, including the use of an icon.”

“It remains to be seen whether companies will more broadly adopt the icon with the alternative opt-out or choose to offer one or more links on their website. Having this more neutral language helps provide flexibility, especially in states that have similar requirements but aren’t as prescriptive about the language that must be used.”

“Just don’t make it look like GDPR in Europe”

As much as our community spends time understanding what targeted advertising and cross-contextual behavioral advertising are, it’s not the kind of thing where you can just show up at your family reunion and everybody knows what you’re talking about. And we’re seeing a lot of focus on how to make this easy to understand.

—Justin Antonipillai, WireWheel

“There’s a lot of hesitancy about putting on a link that says, ‘do not sell my personal information’ or ‘do not sell or share,’” says Kapur. “Providing privacy options seems a more brand-friendly way of providing that experience.” The link can be used for multi-state compliance, but it depends on what you want to combine.

“If you’re a publisher, maybe you want one link, and you’ll just drop the traffic if they opt-out. Same with advertisers. If you’re using internal data, maybe you want two: one for targeted advertising and one for sale and sharing because you’re relying on first-party data.”

“Request number one,” says Antonipillai, “when implementing consent mechanisms – especially for companies who have experience abroad – is ‘I don’t want this to feel and look like it does in Europe,’” a customer experience that makes it “really difficult to do almost anything when you go to a website or an app.” Companies in the U.S. feel that approach will cause people to disengage.

“We hear a lot of concern around making the user experience too complicated,” relates Antonipillai. “But if you start to abstract a lot of the individual choices, you get to a lot of different individual choices that might need explanation.”

The New Complexity of Consent Choice

The question becomes ‘how do we make it simple and create a frictionless user experience?’

Legal Consent is definitely NOT a one-size-fits-all solution

“How does ‘do not sell’ differ from ‘do not share’ or ‘targeted advertising’ as those terms are used in the CPRA and the other state laws?” asks Hahn. “There is quite a bit of overlap.”

‘Do not share’ covers disclosures specifically for cross-context behavioral advertising – that is to say, targeted advertising based on data obtained across non-affiliated digital properties. Any disclosure for that purpose requires a ‘do not share’ opt-out mechanism.

—Sundeep Kapur, Orrick

Whereas Kapur notes, ‘do not sell’ is “any disclosure for any sort of consideration.”

There are some differences. ‘Do not share’ is narrower. As a simplified example, Kapur offers: if you’re sharing data with a measurement provider – measuring campaign effectiveness – that may not be considered a share but could still be considered a sale if you don’t have a service provider agreement in place.

“For targeted advertising, the CPRA uses ‘do not share my data for cross-context behavioral advertising,’ while the non-CPRA laws have an opt-out for the processing of personal data for targeted advertising. They don’t focus on the disclosure of data for targeted advertising, but more generally the processing of it.”

And that’s not “just disclosing data through the bitstream,” advises Kapur, “which could be a share and also processing for targeted advertising.” It also impacts “publishers that use a combination of third-party data and their own data, and the targeted advertising opt-out would cover that.”

“So, in a bit of an ironic way, the opt-out for targeted advertising under laws like Virginia and Colorado are actually broader than the ‘do not share’ under California,” says Kapur.

“It’s a really important point to emphasize that the scope of targeted advertising is broader than the opt-out for share-in furtherance of cross-context behavioral advertising.”

This answers the contention that I hear all the time, which is ‘California must have the most stringent laws, so if I just comply with that, I must be good everywhere else.’ There is a fallacy in both the premise and the conclusion, because actually the other laws are broader.

—Michael Hahn, IAB

“It’s definitely not one-size-fits-all. That’s the issue with having state-by-state privacy laws,” says Kapur.

Why do privacy laws have such a broad definition of “sale” of information?

“If you have a broad definition of ‘sale,’ with disclosures for monetary or other valuable consideration,” opines Kapur, “one could make an argument that when data is sent to a random ad server or across the pond to an adtech partner,” there is no valuable consideration there. “It is just disclosure.”

Ultimately, the broad definition of “sale” is to ensure that disclosures for cross-contextual advertising are covered and that the business offers some sort of opt-out mechanism.

I recently read an interview with Alastair Mactaggart stating that too many industry attorneys were taking the narrow view of sale (which was really never sustainable). Mactaggart saw that position being taken, so he put into the ballot initiative this new concept, which has been copied into all the other state laws.

It wasn’t the most rational conclusion from a drafting standpoint, but it was a result of not the most rational approach being taken by certain corridors of industry.

—Michael Hahn, IAB

“The primary difference under the definitions is whether it includes valuable consideration in addition to monetary consideration,” notes Skelton.

“In California, Colorado, and Connecticut, valuable consideration is included in the definition, whereas in Virginia and Utah, it’s not.” All States, however, include some sort of separate targeted advertising or cross-contextual behavioral advertising component.

It’s an interesting place to be in right now because you want to potentially have a single mechanism to comply across the board. But you’re essentially playing whack-a-mole when you get these various state laws with different definitions, components, and requirements. It can be a precarious place to find yourself when you’re thinking about across-the-board compliance on a state-by-state basis.

—Crystal Skelton, ZipRecruiter

New contractual obligations under US privacy laws

“It’s not sustainable to have separate contracts for separate jurisdictions,” states Skelton. “For example, often, when you’re doing targeted advertising, you’re targeting consumers nationwide (or you’re using third parties to do so) and not necessarily using a state-by-state approach.

Updating privacy and data security addendum templates to include the greatest common denominator to address all these State requirements may be a good approach, at least to start with, but you are going to have to navigate those specific differences in definitions and compliance requirements.

“Keep in mind,” cautions Skelton, “that under the CPRA draft regulations there are due diligence requirements for service providers and contractors. How can one reasonably do that in order to rely on the liability defense under the CPRA?”

“In some cases, it’s very difficult (though not impossible),” offers Kapur, “to get the right contractual privity – certainly when talking about the adtech ecosystem. For example, under the CPRA there is a requirement that if you are sending data to a third party (i.e., a non-service provider), you need to have a contract in place with that third party describing the nature of the sale/share and other information.”

In some cases, if we take the broad view – which is certainly the view that regulators have been taking – and look into the nitty-gritty of the ecosystem, it can be really difficult to get to a place where everyone can sign on to something without some sort of industry-wide mechanism.

—Sundeep Kapur, Orrick

“For example, when you’re pinging a third-party advertiser ad server, that discloses personal information plus an IP address. If we’re going to take the broad approach and err on the side of caution, how do we get an agreement? There’s definitely an issue there.”

Liability, compliance, and diligence

“Just when you thought you knew the law, you didn’t,” says Hahn. “You thought you didn’t have liability for your partners, unless you had knowledge – or reasonably should have known – what they were doing. But now you don’t have that insulation unless you’re doing diligence.

It took a little digesting just to wrap my head around what does this actually mean in practice. The sheer scope of what is potentially required by this. It not only goes through the procurement process. You’re looking at new vendors and your agencies.

—Crystal Skelton, ZipRecruiter

“You have to set up a regular cadence for review…it’s a tough position to try to be in. How do you start tackling this? Do you put in place these due diligence requirements now, or do you take a wait and see approach? These are draft regulations that may change. And it’s a significant burden,” opines Skelton.

“I have been hearing about a few different approaches,” says Hahn:

  1. An entirely unique experience with respect to each state
  2. Treating California consumers one way and then creating a common experience that complies with all the other laws (as those laws have greater commonality with each other than they do to California), and
  3. Not determining the location or residency of anyone who comes to the site and taking a national approach: trying to create a common set of baselines that will (hopefully) comply with all of the laws.

“A very important voice in this entire process is the CMO and the head of digital marketing who are trying to think through the customer experience,” opines Antonipillai.

Even when one explains what the choices are supposed to be to the consumer, and you start trying to make it simple, it comes across very confusingly. It’s exceptionally hard to explain what the consumer’s choices are. Even to an expert audience.

I see a lot of motion towards simplicity – trying to get a good consumer experience.

—Justin Antonipillai, WireWheel

Antonipillai goes on to note that WireWheel has been helping clients implement due diligence requirements for some time, “but I wouldn’t have guessed that it would have to be for everybody under all circumstances. That’s a huge undertaking.”

Importantly, says Antonipillai, the draft CPRA regulations “suggest that California is generally an opt-out place. However, if you’re using data in a way that’s not reasonable and proportional to the way that the consumer believed it would be used, it almost starts to suggest that it becomes opt-in.”

This too makes the consumer experience very tricky. And tricky for business.

Looking to learn more about what is coming in 2023? Let us help you in your compliance journey.

Listen to the session audio

  • Privacy Law Update

Privacy Law Update: July 25, 2022

Stay up to date with this weekly release covering key developments on data privacy laws, technology, and other hot privacy topics!

Newsworthy Updates

American Data Privacy and Protection Act heads for US House Floor

Despite facing a time crunch, a flood of stakeholder feedback and unforeseen Congressional opposition, the proposed American Data Privacy and Protection Act keeps on chugging. The bill’s next act will come on the U.S. House floor after the House Committee on Energy and Commerce markup July 20 resulted in a 53-2 vote to advance the bill to full House consideration. The vote to advance marks the first time a comprehensive privacy bill will be made available for a full chamber vote in either the House or the Senate.

State Attorneys General Oppose Preemption In Proposed American Data Privacy And Protection Act

California Attorney General Rob Bonta today led a coalition of ten attorneys general in urging Congress to respect the role of states to enforce and provide for strong consumer privacy laws while advancing legislation enacting long-overdue privacy protections nationwide. The states call on Congress to create a baseline of consumer privacy laws that do not preempt states’ ability to respond with legislation to address changing technology and data protection practices. Numerous states already have strong privacy protections in place — including California — and state laws and enforcement are critical to protect consumers and their data.

How UK Data Protection Bill Stacks Up With EU GDPR, ePrivacy Framework

On July 18, 2022, the UK government introduced the Data Protection and Digital Information Bill ‘DPDI Bill’ to Parliament. Previously known as the Data Reform Bill, it is the result of a consultation from 2021 and its aim is to update and simplify the U.K.’s data protection framework. According to the U.K. government, the new legal framework created by the DPDI Bill will reduce burdens on organizations while maintaining high data protection standards.

How Canada’s CPPA Differs From PIPEDA

In 2020, Shaun Brown wrote about what he considered a significant flaw under the proposed Consumer Privacy Protection Act in Bill C-11, which was tabled in November 2020, and then died when the federal election was called in 2021. Bill C-11 retained the definition of personal information — information about an identifiable individual — but introduced a new concept of “deidentify.” This seemed to, by implication, alter the concept of personal information, expanding the scope of federal privacy legislation and tossing away years of judicial guidance in the process. Bill C-27 would do this as well, though in a slightly more complicated way.

CAC Readies $1b Fine For Data Security Violations

The Cyberspace Administration of China plans to fine Chinese ride-hailing company Didi Chuxing more than $1 billion in relation to alleged insufficient data security practices, The Wall Street Journal reports. The fine is the last remedial step Didi faces as part of a yearlong investigation by the CAC, which removed the company’s mobile applications from China’s app stores over data security concerns in July 2021. Payment of the fine would restore Didi apps and allow the company to begin a new share listing in Hong Kong.

Different Approaches to Data Privacy: Why EU-US Privacy Alignment in the Months To Come Is Inevitable

Even though it is hardly disputable that the origins of modern data privacy, as well as of computer technology, are to be found in the US, it is currently the EU, with its GDPR, that sets the global tone on what is the generally accepted privacy standard, especially for multinational companies operating worldwide.

Examining the Intersection of Data Privacy and Civil Rights

For historically marginalized groups, the right to privacy is a matter of survival. Privacy violations have put these groups at risk of ostracization, discrimination, or even active physical danger. These tensions long pre-date the digital age. In the 1950s and 1960s, the government used surveillance programs to target Black Americans fighting against structural racism, with the Federal Bureau of Investigation’s (FBI) Counterintelligence Program (COINTELPRO) targeting Dr. Martin Luther King, Jr. and members of the Black Panther Party. During the HIV/AIDS epidemic, LGBTQ+ individuals were fearful that, with an employer-based healthcare system, employers would find out about a doctor’s visit for HIV/AIDS and that individuals would then face stigma at work or risk losing their jobs.

Privacy Legislation

American Data Privacy and Protection Act: On July 20, the House Energy & Commerce Committee voted to advance the American Data Privacy and Protection Act to the full House by a 53-2 vote. The only nays were California Representatives Eshoo (D-CA) and Barragán (D-CA).

The Committee considered a number of amendments to the ADPPA, summarized below (in order of appearance):

  • The overarching Amendment in the Nature of a Substitute from Chair Pallone and Ranking Member McMorris Rodgers (discussed in yesterday’s message). The AINS was adopted by voice vote.
  • An amendment from Reps Lesko (R-AZ) and Kuster (D-NH) to exclude NCMEC (National Center for Missing & Exploited Children) from the Act was adopted by voice vote.
  • An amendment from Reps Trahan (D-MA) and Bucshon (R-IN) intended to clarify the ‘permissible purpose’ for sharing data for conducting public interest research was adopted by voice vote.
  • An amendment from Reps Castor (D-FL) and Walberg (R-MI) expanding ADPPA’s ‘Privacy by Design’ requirements to identify, assess, and mitigate privacy risks to minors in an age-appropriate way was adopted by voice vote.
  • An amendment from Reps McNerney (D-CA) and Curtis (R-UT) authorizing the FTC to promulgate regulations (in consultation with NIST) establishing processes for complying with the ADPPA’s data security requirements was adopted by voice vote.
  • An amendment from Reps Carter (R-GA) and Craig (D-MN) reinserting requirements for covered entities to appoint data privacy and security officers (but exempting businesses with under 15 employees) was adopted by voice vote.
  • An amendment from Reps Hudson (R-NC) and O’Halleran (D-AZ) reinserting revised language on service providers and third parties was adopted by voice vote.
  • An amendment from Rep. Eshoo (D-CA) that would limit ADPPA’s preemptive effect to only provisions of state laws inconsistent with the Act failed by an 8–48 vote.
  • An amendment from Rep. Walberg (R-MI) that would expand ADPPA carveouts applicable to small businesses was offered and withdrawn.
  • An amendment from Rep. Hudson (R-NC) that would explicitly provide that ADPPA covered entities will not be covered by FCC privacy laws and regulations was offered and withdrawn.
  • An amendment from Rep. Curtis (R-UT) focused on advertising that would provide, in part, that the definition of “targeted advertising” does not include “first party advertising or marketing” was offered and withdrawn.
  • An amendment from Rep. Long (R-MO) that would strike the ADPPA’s explicit grant of enforcement authority to the California Privacy Protection Agency (seemingly based on a concern that it could provide California a preeminent role in the interpretation and implementation of the ADPPA) was offered and withdrawn.

Future proof your privacy program with WireWheel’s Trust Access and Consent Center and WireWheel’s Privacy Operations Manager.

  • Regulations

Preparing for Federal Privacy Law Regulations Coming Down the Pipeline

We’ve all discussed the five state laws, but there is, for business, a great desire for a federal law, just because it’s way too complicated to manage so many different states, never mind the extra-territorial stuff. There is a desire for things to be easier and more favorable to business. It is a push-pull and it’s a very difficult line to walk.

—Susan Raab, Customer Data Platform Institute

Indeed. There has been much discussion concerning state privacy law, their similarities, differences, and strategies for managing what’s coming in 2023 and beyond. Here we turn our attention to the emerging federal privacy law which is gaining unprecedented momentum. To discuss Preparing for Federal Regulations Coming Down the Pipeline, a panel of experts joined moderator Cobun Zweifel-Keegan, D.C. Managing Director, IAPP at the Spokes Technology Conference (held June 22-23).

Zweifel-Keegan was joined by BDO Privacy and Data Protection Director Jeremy Berkowitz; Susan Raab, Managing Partner, Customer Data Platform Institute; and Jessica L. Rich, Of Counsel at Kelley Drye, who was previously with the FTC for 26 years.

Federal privacy law backdrop

“Privacy has been on this country’s radar in a big way since the late 90s. We don’t have a comprehensive federal privacy law. Instead, we have sector-specific laws that apply to particular market sectors, entities, or data – like COPPA, Gramm-Leach-Bliley, FCRA, and FERPA,” notes Rich.

“The law with the broadest coverage is the Federal Trade Commission Act, which broadly prohibits unfair or deceptive practices, including in privacy and data security, and which applies to a very broad array of entities.”

What many don’t realize is that the Federal Trade Commission (FTC) also has jurisdiction in areas where other agencies play a role, notes Rich. “For example, HIPAA is enforced by HHS and covers health entities, but the FTC has jurisdiction over many of those same entities. The same is true regarding COPPA- and FERPA-covered entities.”

But, the FTC Act, which is the main privacy law in this country, doesn’t set forth privacy standards. It only allows the agency to act after the fact and determine whether something’s unfair and deceptive. There are gaps in jurisdiction, gaps in remedies, and the FTC has very limited rulemaking capacity.

—Jessica L. Rich, Kelley Drye

“The result is 20-years of ongoing Congressional debate about whether to pass a federal privacy law.”

Rich notes that the FTC is considering rulemaking using its “very cumbersome authority,” viewing privacy through both a competition and a consumer protection lens with a lot of focus on leveling the playing field between big and small companies. And using more substantive (and prescriptive) provisions like limiting the use of data (rather than the notice and choice approach).

What are we likely to see in terms of enforcement for federal privacy laws?

“I want to bring in the context of what we’re now calling the ADPPA: the American Data Privacy and Protection Act” (H.R. 8152). “That’s where all the attention has been in Congress over the last couple of weeks,” says Berkowitz, “and it is remarkable to see a bill right now where there seems to be a large consensus between both parties on what should be in this bill.”

That said, the history of attempts by the States, as noted here, demonstrates both cross-party cooperation and “intra-party strife. Both blue states and red states were not able to reach accord on this issue despite a strong desire from all concerned to pass legislation.”

And indeed, as Berkowitz goes on to note, “there seems to be a lot of consensus, particularly on some issues around preemption and the private right of action. However, in the Senate, committee chairperson, Senator Maria Cantwell (State of Washington) is not yet on board with this bill. She hasn’t come out against it, but of all the players in both houses on the relevant committees, she’s the only one who has not signed off on this yet.”

New authority and consensus for the FTC

…getting money, money, money whatever way they can. By partnering with the states or alleging rule violations in creative ways, because you can get money when you allege rule violations, not just Section 5.

—Jessica Rich, Kelley Drye

Notably, the ADPPA provides a lot of new authority to the FTC, including 1) a new bureau and staff to enforce the act, 2) the authority to promulgate rules, particularly around data minimization and consumer request requirements, and 3) a requirement for companies to certify once a year that they have a CPO and DPO.

“This bill – at least in its current form – is going to provide a lot of that much-needed authority,” opines Rich, who notes that FTC Chair Khan now has more power “to push through her agenda. Now that she has a majority, you’re probably going to see some more aggressive action over the coming months, regardless of whether this law gets passed or not.”

Rich further notes that FTC enforcement is both expanding and taking broad interpretations of existing laws. For example, “they basically took a very narrow rule – the health breach notification rule – and said it applied to every health app.

One of the things that I would emphasize, though, is so far, this has all been through settlements. There are a lot of very good arguments that companies could make as to why some of this stuff goes beyond the FTC’s authority.

But most companies, given the cost of litigation, are going to settle.

—Jessica Rich, Kelley Drye

“But I think it’ll get a lot more interesting,” suggests Rich, “if companies start pushing back on some of these aggressive remedies.”

Children’s privacy likely to play key role in new federal privacy law

Zweifel-Keegan notes that historically, the FTC has been highly active on children’s privacy. The Children’s Online Privacy Protection Act granted specific rulemaking and enforcement authority over children’s privacy issues. And in 2019 the FTC began the process of updating COPPA.

“Children’s privacy is its own animal,” says Raab. “It’s a sector, but it’s also a kind of microcosm of privacy because it ties in with everything: education, healthcare, sports, all of it.”

But what is particularly unique about children’s privacy is “who’s allowed to give consent and what’s allowed to be held.”

As any parent knows, children’s data starts to be put out there before a child is born and on through their life. Data for which they had no input and were incapable of giving consent. At what point does the child or an individual get control of their own personal data that’s been out in the universe?

—Susan Raab, Customer Data Platform Institute

“The reason Education Technology (EdTech) is so important is that it is a giant black hole when it comes to protecting children’s data. There are lots of ways in, and a lot of people know it. It’s the weakest link in the data ecosystem.”

Importantly, reminds Raab, “if you can get children’s data, you can get the whole family’s data as well. A lot of places think they don’t need to worry because they don’t deal with children. But in fact, every company holds children’s data, if in no other place but in human resources, so it’s very complicated.”

“From a legislative perspective,” says Berkowitz, Senate Commerce Committee members Richard Blumenthal and Marsha Blackburn could not be further apart, but they have come together on the Kids Online Safety Act (S.3663), introduced in February.

Raab points out that Senators Markey and Cassidy also have their own bills that look to evolve COPPA. “It’s going to be interesting to see where things go from here.”

Teens don’t care about privacy

Part of the Children and Teens’ Online Privacy Protection Act (S.1628), notes Raab, is to “capture the tween category, the 13-to-15 category.” But the privacy needs of young people at one age differ from their needs over time.

“Once you start to hit ages where youth can participate in it, where the children give consent, and you know this migrates on through. With children, you always have different gatekeepers whether it’s a teacher or a parent or caregiver. Plus, you have the children themselves.

I was involved in COPPA in its early stages and later. The FTC – not that it had control over this – when asked whether COPPA should be extended to teens, said the consent model doesn’t really work well with teens because they don’t care.

—Jessica Rich, Kelley Drye

“They certainly don’t want their parents giving consent for them,” continues Rich, “so when it came to actually extending COPPA, there wasn’t a lot of support for it. Now we have these bills that are more about age-appropriate codes: don’t serve up toxic content to teens, and needing to ‘know’ who you’re dealing with.

“Maybe get consent from them or their parents for certain uses of data, but it’s complicated and it’s not easy when it comes to teens. I do think it’s important that any protections that apply to adults in a large bill like the ADPPA apply to everyone, and that there are additional protections for teens and kids.”

The knowledge standard

“There was a hearing…an active discussion about the ‘knowledge standard’ and how a company can know they’re dealing with a child or a teen,” relates Rich.

“And what they put out [in the ADPPA] is a draft that settled on ‘knowing.’” What knowing means exactly is a concept, as Rich notes, on which the courts have opined.

“This gets to what companies can know, what do they want to know, and, sometimes, what they want to know is not so much if it gets in the way of what they want to do,” said Jessica Rich.

When you’re thinking about the children’s components and some of the new rules that are going to be around duty of loyalty and data minimization, there is a consensus that this is a growing problem. We also need to think about how we want to manage it from a risk perspective.

—Jeremy Berkowitz, BDO

However all of this plays out, and whenever (and if) federal privacy law finally arrives, one thing is certain. “In terms of the trajectory of how far we’ve come, it’s incredible,” opines Rich. “Privacy has arrived for Republicans and Democrats. For businesses and for consumers, it’s a huge shift.”

Listen to the session audio

  • Privacy
  • Regulations

How To Become A Chief Privacy Officer (CPO)

The privacy job market continues to accelerate. There are more entrants every day and robust movement between companies. Privacy is now recognized by many companies not simply as a compliance requirement, but increasingly as an area of competitive differentiation critical to business success. As such, those accumulating requisite experience and know-how are sought after and moving up the ranks as programs mature and capture increased funding.

One thing is clear: there has been no single pathway to privacy or its top slot. The road for today’s leading privacy professionals is typically a winding, forked, anything-but-straight path. This was evidenced by the many panelists and attendees at the Summer 2022 Spokes Privacy Technology Conference (held June 22-23) and every year since its inception.

So, what does it take to secure the top slot? How does one become a Chief Privacy Officer (CPO)?

Host George Ratcliffe, Head of Data Privacy and GRC Search at executive search firm Stott & May, was joined by VMware VP and CPO Stuart Lee and Noga Rosenthal, CPO and General Counsel of adtech company Ampersand, to discuss how to become a CPO and provide sought-after insight.

Key CPO attributes

“Privacy is one of those fields where you really need to have a broad understanding. It’s a renaissance field in every sense of the term. As a privacy leader, you will be pulled into meetings from HR, sales, and marketing to IT and Customer Success. You really need to be able to understand the different needs and areas of the business.

—Stuart Lee, VMware

As a result, a successful privacy leader needs to be able to speak the language of the business and industry that you’re in to really help get your point across.”

And of course, as Lee notes, you need a deep understanding of what the privacy requirements are, how they play out globally, and how they impact your business. “There really is a lot there.”

Importantly, “it’s not just enough to be able to recite law verbatim. You need to understand how you can communicate those requirements back to your stakeholders” while maximizing value for the company and doing right by the customer.

“I think a lot of us here today, and indeed anybody who’s a CPO, DPO, or privacy leader in any way, walks that tightrope every single day: how do we make sure we’re meeting [customer] requirements and expectations, while helping our business to do what it needs to do.”

For Rosenthal, the soft skills are critical. “Being able to speak clearly and make things understandable to everyone, and not speaking in such a high level that nobody has any idea what you’re talking about.

I remember the first time I said to my marketing team ‘Hey, I don’t want to use the word anonymous; can we use the word pseudonymous?’ And they all just looked at me like ‘I can’t even spell that, what are you talking about?’

—Noga Rosenthal, Ampersand
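Rosenthal’s anonymous-versus-pseudonymous distinction is worth making concrete: pseudonymized data swaps identifiers for tokens that can still link records together (and, with the secret, be tied back to a person), while anonymized data severs that link entirely. A minimal Python sketch – not from the session; the salt, field, and token length are illustrative:

```python
import hashlib

SALT = b"rotate-me-regularly"  # illustrative secret; protect and rotate in practice

def pseudonymize(email: str) -> str:
    """Replace an identifier with a stable token.

    The same input always yields the same token, so records can still be
    linked -- which is exactly why this is pseudonymous, not anonymous."""
    return hashlib.sha256(SALT + email.lower().encode()).hexdigest()[:16]

# Two events from the same user still join on the token...
t1 = pseudonymize("Alice@example.com")
t2 = pseudonymize("alice@example.com")
assert t1 == t2

# ...whereas truly anonymous data would drop the identifier entirely,
# making that join (and any re-identification) impossible.
```

Under GDPR-style analysis, the holder of the salt can re-link tokens to people, which is why pseudonymized data is still treated as personal data.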

The ability to be flexible and to deal with ambiguity is next on her list of critical “soft skills.” She cautions that what can make the job so difficult is that the laws aren’t very clear. “We have to use our instincts, we have to use benchmarking, we have to look at risk. And sometimes people want a job that’s very clear cut with drawn lines. This is not that job.”

Lastly and perhaps most importantly, it’s about building trust.

Are certifications really necessary?

Rosenthal has a significant history as a privacy professional, so “it was interesting,” she relates, that even as “I was going to DC, speaking on panels,” an employer she was applying to asked her for her CIPP certification. “I thought, ‘really?’” To be clear, she does think certifications are helpful. “It’s a checkbox.”

“There’s no replacing grey hair,” says Lee. “You just have to have the experience of going through a lot of the [privacy] exercises, because privacy has long been principle-based, and when you’re interpreting principles and applying them to what you do, that’s often based on the experience of what you’ve seen work and fail.”

—Stuart Lee, VMware

Lee tells the people he works with that “it’s great to have the certifications, but if you don’t have the experience to show that you know what that means, it doesn’t really help. That’s the key part.”

Lee notes that years ago it was the Data Privacy Officer (DPO) that was “the first person invited to any party, but in the event of a regulatory investigation or an incident your DPO became the host of the party.”

The privacy lead is the one involved in incident response and working with regulators. “Trust is incredibly important, and it’s a 360-degree relationship with your stakeholders, customers, and the regulators. There is no replacing experience.”

Ultimately, while credentialing may help at the start, becoming a Chief Privacy Officer (CPO) is about experience.

What is the executive recruiter looking for?

“The all-arounder,” says Ratcliffe. “We get a lot of questions about the lawyer v. non-lawyer debate, but setting that question aside for the moment, it’s absolutely somebody that can cover all bases. And somebody that is an excellent communicator and can build that trust.” Interestingly, says Ratcliffe, “the conversations around the softer skills go on for far longer and are far higher up the agenda than the harder skills that come to you later on around the technology side.”

The softer skills and the cultural alignment between the company and the individual takes up 75% of the conversation. And is the role right for you or is it just the title?

—George Ratcliffe, Stott & May

It’s not enough just to say I’ve managed DSARs, and I’ve done X number of reviews. It’s how have you actually made the process something that is really a strategic initiative. Really understanding how those are going to impact the business.

Ultimately, “when we’re looking at executives, whatever the industry, being able to tie in business objectives and goals [in terms relatable to the various stakeholders] is super important,” says Ratcliffe.

Clearly, the CPO role requires the ability to relate the arcana (and nuances) of privacy to stakeholders across the organization – from sales and marketing to IT and operations, from executive management to consumers – to gain the buy-in necessary to be successful. This takes exceptional communication skill.

“The harder skills,” opines Ratcliffe, “are easier to develop with certification training courses and the other types of things that you should be picking up naturally as you develop throughout your career.”

Lawyer versus non-lawyer

A legal background has “certainly been the path of least resistance for filling the role of a privacy officer,” offers Lee.

“Where it becomes super interesting is when you really think about what your chief privacy officer is charged to do, versus what a data protection officer is charged to do, and what counsel is provided to do.

You’ve done the education on the CIPP; you have the experience of doing privacy risk management and DSARs – often working directly with business. None of those things that you picked up along the way you’ve learned by completing a legal education. You’ve got it through your experience in the field.

—Stuart Lee, VMware

That said, continues Lee, you absolutely have to have a legal counterpart. “I would also argue that if you’re a CPO, who is a lawyer, you should have a really good business operations person with you as well,” suggests Lee.

In fact, when you look at Data Privacy Officers (DPOs) – who have been around much longer than CPOs – research showed that approximately 28% were lawyers and another 28% were IT professionals, notes Lee. “There’s a huge kind of balancing act, and it’s often determined based on the industry you’re in. If you’re in a very highly regulated industry, it will likely lean more towards favoring lawyers as a CPO.”

As a lawyer, and playing devil’s advocate, Rosenthal counters, “of course, it should be an attorney because you’re taking laws, interpreting them, and that’s what legal should be doing” rather than breaking the role in two and having privacy go to legal to get the interpretation.

“Another piece to consider” offers Rosenthal “is you’re doing contract negotiations. You want it to be an attorney. When you’re negotiating, you’re usually negotiating with another attorney, so you need two attorneys talking to each other, though that’s not always the case.”

However, some of the strongest privacy CPOs out there are not attorneys.

—Noga Rosenthal, Ampersand

So, asks Ratcliffe, are there challenges for someone coming from a legal background?

“I’ve had attorneys from the commercial side work for me and switch over to privacy, and the greatest struggle,” says Rosenthal, is the lack of clarity in the law. “They don’t get that.” She further points to the disadvantage stemming from a lack of knowledge regarding things like cookies and browsers.

“Your IT guy knows all your networks, all your systems, and that’s a huge advantage. You can jump right in and understand where the data is coming from.” Conversely, this is where the lawyer does have to go elsewhere.

Lawyer or not – privilege, contract negotiations, and interpretation of the law notwithstanding – it is fair to say that the CPO can’t go it alone, all-arounder or not. But the emerging law in the U.S. is clear: whatever your background, a CPO must be “qualified.”

Listen to the session audio

  • Privacy Tech

Privacy Law Impacts to AI and High-Risk Processing

AI and High-Risk Processing

Day one of WireWheel’s Summer 2022 Spokes Data Privacy Technology Conference (held June 2-3) featured the discussion “AI and High-Risk Processing,” focusing on issues concerning the regulation and development of privacy law around artificial intelligence (AI) and automated decision making.

Moderator Lee Matheson, Senior Counsel for Global Privacy at the Future of Privacy Forum, was joined by several leading experts, including his colleague Bertram Lee, Sr. Counsel on Data and AI at the Future of Privacy Forum. Notably, Lee very recently gave testimony before the House Commerce Committee on the newly introduced proposal for an American federal data privacy law.

Also joining was King & Spalding LLP Data Privacy & Security Partner Jarno Vanto and Christina Montgomery, Vice President, CPO, and Chair of the AI Ethics Board at IBM.

Montgomery also serves as a member of the Commission for Artificial Intelligence Competitiveness and Inclusiveness as part of the U.S. Chamber of Commerce and has recently been appointed to the United States Department of Commerce’s National Artificial Intelligence Advisory Committee.

The Artificial Intelligence (AI) regulatory state of play

In terms of how artificial intelligence is starting to get regulated we’re seeing the world fall into three different camps:

  1. The prescriptive approach, adopted by the European Union, where regulation specifies what types of AI cannot exist, what types of AI are high risk, and what types of AI are low risk.
  2. A self-regulatory environment, coupled with government enforcement when the self-regulatory frameworks fail.
  3. The sectoral approach, currently prevalent in the United States, with different government entities issuing rules and statements about AI in the sectors they administer.

One strategy companies are using to deal with the “sectoral approach” is looking at which regime to follow. “What I’m starting to see is that it’s just easier for companies to comply with the strictest regime.”

In some ways, the EU has emerged as a global regulator of artificial intelligence, at least with the initial steps they’ve taken. And unless companies are willing to build regional AI tools that comply with the local regulations (which would be enormously costly and difficult to manage) it would mean that standards around [prohibited] AI, or judgment calls around what constitutes “high risk,” would be adopted globally.

—Jarno Vanto, King & Spalding LLP

Of course, U.S. companies may view these EU efforts as mandates on U.S. companies, given the fact that most of these AI tools are currently being developed in the United States.

How to approach artificial intelligence (AI) governance?

Our approach – even in the absence of regulation – is to start with your values and your principles. I think that’s the only way to design a governance program around AI, because so much of it has to be values led.

—Christina Montgomery, IBM

“The core of internal governance over AI regardless of the regulatory environment is the ethics and principles that govern development,” offers Matheson. What the law says matters of course, but certainly there are broader ethical considerations “like what do we want the company to stand for?”

These considerations are not solely tied “to how algorithms make use of data, but more broadly with use of data generally,” opines Vanto. “When developing these tools, the first thing companies should keep an eye out for is the purpose for which the tools will be used.”

Many companies have implemented data use-case ethics boards and similar bodies to contemplate this and will say no to the potential monetary gains if they view the use-case as unethical or inconsistent with their approach to use the personal information.

—Jarno Vanto, King & Spalding LLP

“With AI, it is the very same assessment,” continues Vanto. “Is the purpose consistent with the values of your company?”

Civil rights advocacy and artificial intelligence (AI)

You can’t just rely on civil society groups to do the work that companies should be doing themselves. Ideally you would want civil rights feedback to come from inside the company…so that when these products and services are presented to advocacy organizations, the thoughts and considerations of those communities have already been thought through.

—Bertram Lee, Future of Privacy Forum

“From a policy perspective,” continues Lee, “one thing that might be helpful to think about with respect to the civil and human rights community is that Civil Rights law has prohibited discrimination in a variety of contexts, and in a variety of ways for the better part of 60 years.

“That context is important because when I hear from companies that the law isn’t clear [the question then becomes] how are you compliant in spirit? What is your best effort…with respect to non-discrimination?”

There is coming regulation on privacy. No doubt there is going to be some form of algorithmic mandate or accountability from Colorado or California, maybe even Virginia…The algorithmic biases issue is on its way.

For everyone involved, it makes the most sense to think about how to test for nondiscrimination. How your data sets are discriminatory, and how you’re fighting against that actively. What are the ways in which this AI could be used that could be discriminatory?

—Bertram Lee, Future of Privacy Forum

“All should be asked and answered before even thinking about deployment, and there should be a clear reasoning behind it,” asserts Lee. “The recent settlement with HUD is an example, and folks are slowly waking up to their liabilities in this space.”
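One concrete starting point for the nondiscrimination testing Lee calls for is a selection-rate comparison across groups, such as the EEOC-style “four-fifths” screen. The toy data and helper names below are invented for illustration; real algorithmic audits use richer metrics and legal review:

```python
from collections import defaultdict

def selection_rates(decisions):
    """Positive-outcome rate per group, from (group, approved) pairs."""
    totals, positives = defaultdict(int), defaultdict(int)
    for group, approved in decisions:
        totals[group] += 1
        positives[group] += approved
    return {g: positives[g] / totals[g] for g in totals}

def passes_four_fifths_rule(rates):
    """EEOC-style 'four-fifths' screen: the lowest group's selection
    rate should be at least 80% of the highest group's rate."""
    lo, hi = min(rates.values()), max(rates.values())
    return lo / hi >= 0.8

# Toy audit data: (group label, did the model approve?)
decisions = [("A", 1), ("A", 1), ("A", 0), ("A", 1),
             ("B", 1), ("B", 0), ("B", 0), ("B", 0)]
rates = selection_rates(decisions)
print(rates)                            # {'A': 0.75, 'B': 0.25}
print(passes_four_fifths_rule(rates))   # False -- flag for review
```

A failed screen is not a legal finding by itself, but it is the kind of pre-deployment check that makes “best effort with respect to non-discrimination” demonstrable.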

Managing artificial intelligence (AI) principles at scale

“How do we affect real transparency for a complex algorithmic system?” asks Matheson. “How do we regulate data quality for training data sets that have billions of data points – especially on the scale of IBM?”

“We have to make it easy for our clients,” says Montgomery. “That the tools we deploy are giving them the capabilities they need to be confident that the uses they’re putting AI to are fair, transparent, explainable, and nondiscriminatory.

The IBM Ethics Board and the Project Office that supports it is within my purview of responsibility. We designed our program to be very much top-down bottom-up because we wanted the tone from the top…helping set the risk for the company, instill its importance, and hold us accountable…But importantly, also ensure that multiple voices are heard. That we’re incorporating the voices of marginalized communities as well.”

Montgomery further notes that IBM has approximately 250,000 employees globally and has created a network of multidisciplinary “focal points” throughout the company that comprises both formal roles and an advocacy network of volunteers to support this effort. The result is a culture of trustworthiness.

The how is where it becomes really tricky. It’s one thing to have principles. It’s one thing to have a governance process which is really central to holding ourselves accountable and helping to operationalize those principles. But we have to tell people how.

—Christina Montgomery, IBM

IBM has a living document called Tech Ethics by Design, explains Montgomery. It walks designers and developers through the questions they should be asking and the tools they should be using to test for bias. And it gives them data sheets to document the data being used throughout every stage of the lifecycle.

But IBM doesn’t go it alone says Montgomery. IBM also collaborates with external organizations like the open-source community and is currently funding a lab at the University of Notre Dame.

Will regulation help with the issue of artificial intelligence (AI) bias?

“We often see – whether it’s a prescriptive regulation or voluntary self-regulatory framework, or even just a statement of principles – people get lost in the weeds of how to do the compliance,” says Matheson.

“I would love to see a co-regulatory approach,” says Montgomery, “and IBM has been calling for precision regulation of artificial intelligence to regulate the high-risk uses, not the technology itself. We’re supportive of guidance, frameworks, and regulation in this space, but it’s important to have that regulation be informed by what businesses are seeing, balancing innovation and risk.”

“I agree,” says Vanto. “Actual business practices should be factored in so regulatory work doesn’t happen in a vacuum. But it’s interesting: if you look, for example, at the list of high-risk and prohibited systems, they’re value-based judgments.” These shouldn’t be set in stone, as there are use cases we can’t even conceive of today, and “having that in regulation that takes years to change may be challenging.”

Instead of setting in stone what constitutes a high-risk activity now – the prescriptive approach – we should have certain criteria based on which certain systems or use cases should be considered high risk or prohibited altogether…because there might be others down the line very quickly as these things develop.

—Jarno Vanto, King & Spalding LLP
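Vanto’s suggestion – criteria rather than an enumerated list – can be sketched as a toy classifier: each use case is scored against stated criteria, so a novel use case needs no amendment to a fixed list. The criteria, weights, and threshold below are invented for illustration, not drawn from any regulation:

```python
# Illustrative criteria only; a real framework would come from regulators.
HIGH_RISK_CRITERIA = {
    "affects_legal_rights": 3,      # e.g. credit, housing, employment decisions
    "biometric_identification": 3,  # e.g. facial recognition
    "vulnerable_population": 2,     # e.g. children, patients
    "no_human_review": 1,           # fully automated decision making
}

def risk_level(use_case: set[str], high_threshold: int = 3) -> str:
    """Score a use case against the criteria it triggers.

    Because classification is criteria-based, a use case invented
    tomorrow is handled without rewriting an enumerated list."""
    score = sum(w for c, w in HIGH_RISK_CRITERIA.items() if c in use_case)
    return "high" if score >= high_threshold else "low"

print(risk_level({"biometric_identification"}))             # high
print(risk_level({"no_human_review"}))                      # low
print(risk_level({"vulnerable_population", "no_human_review"}))  # high
```

The design point is the one Vanto makes: the regulation would fix the criteria and threshold, while the set of systems they capture evolves with the technology.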

Self-regulating artificial intelligence (AI)

“While I agree that we don’t want to necessarily stifle innovation – there are ways in which these technologies could be used to benefit all of society – we have to understand that the data sets that underlie a lot of these systems are all based in discrimination,” contends Lee.

“If folks could self-regulate, we wouldn’t be having some of the same problems that we’re having right now, because there would have been someone in the room saying, ‘let’s reevaluate.’”

As an example of internal evaluation working, Matheson notes IBM’s publicly announced decision not to offer APIs for facial recognition software.

It comes back to the first point I made: underpinning our governance framework are the values and the principles that we align ourselves to, beginning with data should augment, not replace, human decision making. It should be fair, transparent, explainable, privacy-preserving, secure and robust.

—Christina Montgomery, IBM

Montgomery notes that IBM had a number of proposals come forward during COVID-19 to, for example, use facial recognition for mask detection or fever detection for deployment in various airports or businesses. The concern was how guard rails could be put around the different technology types. The details of IBM’s decision making process were published by IBM in the report “Responsible Data Use During a Global Pandemic.”

“Ultimately, facial recognition (at least at the time) presented concerns regarding accuracy, fairness, and the potential for it to be used for mass surveillance or racial profiling. That, coupled with the questions around the technology itself, led IBM to the decision [not to deploy] facial recognition.”

“We wanted to be very clear that we weren’t making different decisions, just because we were faced with this exigent circumstance. We were still relying on our governance process and still adhering to our values and our principles,” declares Montgomery.

This last point cannot be stressed enough. To jettison principles and values, even in exigent circumstances (the rallying cry of a long line of malefactors), renders the very concept of values and principles nothing more than expediencies to be used or tossed as circumstances “require” – the very antithesis of “principles.”

Listen to the session audio

  • Company
  • Privacy Tech

WireWheel DSAR Connector for Drupal & WordPress

As new privacy laws continue to emerge in the United States, businesses are looking for increasingly straightforward ways to support consumers’ appetite for privacy. WireWheel is excited to announce the launch of our first set of WireWheel integrations for Drupal and WordPress. These connectors make it easier than ever to integrate data subject access request (DSAR) intake forms into your existing user privacy experience.


As DSAR requirements continue to be upheld under laws like CCPA, CPRA, and GDPR, privacy teams can anticipate similar requirements to emerge as new legislation passes.

The open-source WireWheel connectors allow organizations to seamlessly integrate WireWheel’s Privacy Studio for DSAR intake directly into their existing WordPress or Drupal websites. Setting up and launching the WireWheel DSAR Connector for a WordPress or Drupal site is simple and fast.

DSAR Connector Benefits

  1. Plug & Play Integration
    Enables a simple and easy integration of WireWheel’s DSAR management solution (Trust Access and Consent Center) with any existing Drupal or WordPress website
  2. Fully Integrated User Experience
    Take full control over how DSAR request intake forms are displayed to users and integrated with your existing website experience.
  3. Flexibility & Control
    Provides full control and flexibility over the DSAR intake user experience on your website

Who can use the WireWheel Connector?

The WireWheel Connector can be used by anyone who has a Drupal or WordPress website and has purchased the WireWheel Trust Access & Consent product for data subject access requests (DSAR). The connector itself is free and open-source but requires a WireWheel subscription to utilize its functionality.

DSAR Connector Features

Once installed and configured onto a website, WireWheel’s DSAR Connector offers Drupal and WordPress users these key features:

  • Easy integration with WireWheel’s Trust Access and Consent Center for DSAR automation, processing, and management
  • Fully customizable intake forms to fit your needs (styling, placement, number of forms, supplemental content, etc.)
  • Unlimited creation & placement of forms to handle complex flows, multi-form processes and regional intake requirements

Taking advantage of the power of WireWheel’s data subject access request management is now easier than ever for customers using Drupal or WordPress.