  • Privacy

Data Security vs Data Privacy: Is There a Difference?

Could you sum up the difference between data security and data privacy in a single sentence?

It’s not impossible, but the sentence would have to be quite long. So instead, we’ve written this article, which explains the difference in some detail.

We’ll look at what’s involved in data security and data privacy, note a couple of key distinctions between them, and give you a few tips for best practice for the implementation of data policies.

What is data security?

Data security – also known as data protection – concerns the prevention of unauthorized access to data. This includes malicious access by third parties such as cybercriminals and hackers, as well as abuse by internal employees or contractors. But it’s also about reducing the risk of human error leading to data breaches.

With so many of today’s businesses pursuing a digital enterprise transformation, there’s never been a more important time to get it right. The typical processes used when implementing data security policies include:

  • Multi-factor authentication
  • Access control
  • Network security
  • Data encryption
  • Activity monitoring
  • Data masking
  • Data erasure
  • Breach response protocols

With robust data security and protection strategies in place, your business can ensure that it avoids data corruption, loss or theft. It can help prevent some of the cloud security mistakes we regularly see that can severely damage an organization’s reputation.
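Some of the techniques listed above, such as data masking, can be illustrated in a few lines of code. The sketch below is a minimal, hypothetical example of masking personally identifiable fields before they are logged or displayed; the function names and masking rules are illustrative assumptions, not a standard:

```python
import re

def mask_email(email: str) -> str:
    """Mask the local part of an email address, keeping the first and last characters."""
    local, _, domain = email.partition("@")
    if len(local) <= 2:
        masked = "*" * len(local)
    else:
        masked = local[0] + "*" * (len(local) - 2) + local[-1]
    return f"{masked}@{domain}"

def mask_card(number: str) -> str:
    """Show only the last four digits of a payment card number."""
    digits = re.sub(r"\D", "", number)  # strip spaces and dashes
    return "*" * (len(digits) - 4) + digits[-4:]

print(mask_email("jane.doe@example.com"))  # j******e@example.com
print(mask_card("4111 1111 1111 1111"))    # ************1111
```

The idea is that downstream systems (logs, dashboards, support tools) see only the masked values, reducing the blast radius of any leak.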

What is data privacy?

Data privacy, on the other hand, is about how the information collected by an organization is used. To be more specific, it relates to the collection, processing, storage and handling of data.

Data privacy management is crucial because businesses and other organizations have a legal responsibility to handle information about customers, employees, and other stakeholders in a secure and sensitive way.

The precise details of data privacy law vary from country to country, but it’s standard for there to be some kind of rule against unauthorized access to data or disclosure of personal information.

Neglecting to meet your responsibilities under the law could lead to your business suffering financial penalties or lawsuits. At the very least, there’s a real danger of your organization taking a severe hit to its reputation. And if that happens, it can be very difficult to rebuild customer trust. So it’s vital to ensure that you take all necessary steps to implement rigorous data privacy protocols.

Differences between data security and data privacy

Although the two are related, they are not the same. There are a couple of important distinctions to be drawn between data security and data privacy.

Different aims in terms of safety

Data security places an emphasis on developing processes and protocols to prevent unauthorized access to data by hackers and other cybercriminals. Data privacy, meanwhile, is about controlling who is permitted to access data, what the permitted use cases are, and how to ensure personal information is not misused, by defining policies and creating appropriate controls.

Data security, in other words, is a prerequisite for data privacy. But it doesn’t necessarily work the other way around. Theoretically, you could have data security without data privacy. For example, an online retailer could have very robust data security protecting transactions on the payment side of their online shop. But without a working ecommerce privacy policy in place, there’s nothing to stop a dishonest employee from selling on customer data.

Who is legally responsible can be different

We’ve mentioned that the laws in this area do vary considerably from place to place. In many cases, though, the question of who is responsible for data security and data privacy may not be as simple as it first appears.

In most cases, the legal responsibility for data security lies unambiguously with the company or organization storing the data. However, quite often the user is expected to take a degree of responsibility for data privacy themselves.

Users have a large amount of control over the decision of how and where to share their data, and this is generally reflected in legislation. Of course, organizations the user has shared their information with will still be expected to have strict processes in place to protect the privacy of data shared with them.

Best practice for data security and data privacy

There are a number of things you can do to make sure your organization is meeting its responsibilities in storing and handling data.

Keep up to date with the law

This is the most important and most fundamental element of any data security policy or data privacy program. Being aware of which laws apply to your circumstances and following them to the letter is vital.

One important aspect of this issue to bear in mind is that laws in other jurisdictions can apply to you if you do business in that jurisdiction. For example, any US organization dealing with customers resident in the European Union will need to comply with the General Data Protection Regulation (GDPR) rules, which came into force in May 2018.

The law surrounding data security and privacy can be very complex, which leads to the next point.

Hire professional experts

It’s best to have dedicated legal and IT experts to consult on and implement your policies. Ideally, this should happen at the beginning of the process, while you develop your solutions. Many larger organizations already have the skilled staff available for this task, of course, but smaller ones may need to outsource it.

This may seem expensive, but it could cost you a lot more in the long run if you get it wrong.

Don’t collect unnecessary data

It may be tempting to ask users to provide all sorts of data just in case. But generally speaking, this is not good practice. The more data you collect, the more can go wrong with handling it. Collect only the minimum data you need for your purposes.

One added advantage of this is that applying this principle at scale could save you money on bandwidth and storage costs. It’s also more pleasing for users, as it cuts down on extra fuss.

Automate your processes

Whether it’s about traditional PBX phone systems or the most cutting-edge machine learning tools, automating business processes is the perennial efficiency builder. And it applies just as much in this case.

The more of your data security and privacy tasks you can automate, the lower the risk of human error. It’s not always easy for employees to remember all of the compliance rules they have to stick to, so automating as much of the process as possible takes a lot of the burden off them. As a result, you’ll have fewer data breaches and less stressed staff.

Implement rigorous security procedures

There’s a reason why there are so many different types of reports in software testing. Each has a role to play in creating and maintaining standards in the final product. Using an intelligent mixture of software and network access protocols in your organization is key to making your data security setup a success.

Consider safety tools like multi-factor authentication, access control and data encryption, yes – but don’t overlook more basic necessities. For example, do you have robust procedures in place for updating your antivirus software? Do your staff always use a secure private network, without fail?

Even something as simple as social media can catch out the unwary. It’s best to limit the amount of information you share on social sites, since they can be an entry point for malicious actors.

Limit employee access to data

In today’s fast-moving business environment, where we’re using any number of tools like remote working software or an enterprise VoIP system to communicate, it can be easy to get a little careless with this. Too often, information flies from one part of an organization to another without much thought about how it gets there. Or even whether it needs to get there at all.

The fact is, it’s important to give careful consideration to exactly who needs access to data and who doesn’t. Partly, this is a question of our old friend human error again. The more individuals who have access to sensitive information, the more likely it is to be leaked accidentally.

Make consistent decisions about who needs access to data, and monitor that access. Training employees on issues like consent and preference management can also be useful in getting everyone on board with the process.
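As a rough illustration of the least-privilege principle described above, here is a minimal, deny-by-default role check in Python. The roles, permission names, and helper functions are hypothetical; a production system would typically delegate this to an identity provider or a dedicated access-management service:

```python
# Hypothetical role-to-permission mapping; a real system would back this
# with an identity provider or an access-management service.
ROLE_PERMISSIONS = {
    "support":   {"read:customer_contact"},
    "finance":   {"read:customer_contact", "read:payment_history"},
    "marketing": {"read:aggregated_stats"},
}

def can_access(role: str, permission: str) -> bool:
    """Deny by default: grant access only if the role explicitly holds the permission."""
    return permission in ROLE_PERMISSIONS.get(role, set())

def fetch_payment_history(role: str, customer_id: str) -> list:
    """Guard a sensitive query behind an explicit permission check."""
    if not can_access(role, "read:payment_history"):
        # Denied attempts should also be logged, so that access can be monitored.
        raise PermissionError(f"role {role!r} may not read payment history")
    return []  # placeholder for the actual database query
```

The deny-by-default shape matters: an unknown role or a typo in a permission name fails closed rather than open.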

Get your employees on board

In fact, training is a good idea all round. Organize regular training courses on data security and data privacy so that everyone is aware of the importance of good practice.

It’s best if all employees have a good understanding of your organization’s policies, so that they remain front and center during everyday working life. Emphasize the importance of reporting any data breaches early to prevent more serious repercussions later.

Final thoughts

We spend a substantial proportion of our work time dealing with data: recording customer contact details, estimating the cost for an AWS instance type, and generating software test results. It’s easy to lose sight of the fact that safeguarding information is one of the most crucial duties of any modern organization as we focus on day-to-day tasks.

Nevertheless, data is one of the most valuable assets we have. Safeguarding it is not only a legal responsibility, but also key to any business’s reputation. So why not take some time today to review your data policies and make sure they’re the best they can possibly be?

Jessica Day is the Senior Director for Marketing Strategy at Dialpad, a modern business communications platform that takes every kind of conversation to the next level—turning conversations into opportunities. Jessica is an expert in collaborating with multifunctional teams to execute and optimize marketing efforts, for both company and client campaigns. Jessica has also written for other domains such as Data Privacy Manager and Guru.

  • Privacy Law Update

Privacy Law Update: August 1, 2022

Stay up to date with this weekly release covering key developments on data privacy laws, technology, and other hot privacy topics!

Newsworthy Updates

CPPA says preemption must not be in any federal bill

The proposed American Data Privacy and Protection Act feels so close and yet so far away. The comprehensive privacy bill is on the cusp of a U.S. House floor vote, a first for any federal privacy proposal. But the bill’s fragile nature is being tested at a crucial point in the legislative process, as California lawmakers and stakeholders would prefer the bill fail or be reworked in order to preserve the California Consumer Privacy Act and its successor, the California Privacy Rights Act.

A look at Canada’s new federal privacy legislation, Bill C-27

In June 2022, while privacy professionals in Canada were still contemplating Bill C-26 on cybersecurity, the much-anticipated Digital Charter Implementation Act, 2022 — Bill C-27 — was introduced by the federal government. It is a reintroduction and, some may agree, an improvement on Bill C-11, which was first introduced in 2020 and died on the order paper as a result of the federal election in 2021.

China PIPL: Data export regime starts to take form

China’s Personal Information Protection Law lays out strict limitations on cross-border transfers of personal information (PI). Finally, more than nine months after the PIPL came into effect, three new regulatory developments will provide guidance on the administrative procedures and detailed rules to implement the cross-border transfer rules.

Examining the intersection of data privacy and civil rights

For historically marginalized groups, the right to privacy is a matter of survival. Privacy violations have put these groups at risk of ostracization, discrimination, or even active physical danger. These tensions have long pre-dated the digital age. In the 1950s and 1960s, the government used surveillance programs to target Black Americans fighting against structural racism, with the Federal Bureau of Investigation’s (FBI) Counterintelligence Program (COINTELPRO) targeting Dr. Martin Luther King, Jr. and members of the Black Panther Party. During the HIV/AIDS epidemic, LGBTQ+ individuals were fearful that, with an employer-based healthcare system, employers would find out about a doctor’s visit for HIV/AIDS and that individuals would then face stigma at work or risk losing their jobs.

Data privacy is the future of digital marketing: Here’s how to adapt

The world of digital marketing is approaching a new normal. Consumer privacy is no longer just a movement to monitor, but one that is reshaping the industry through regulation and action by the tech giants. Major brands are now coming to realize that the way they organize, invest, think about audiences, and engage with consumers will be reorganized around people’s privacy preferences, forcing fundamental changes to many traditional digital marketing strategies.

Privacy Legislation

Opposition to ADPPA Intensifies
Senate Commerce Chair Cantwell continues to oppose the American Data Privacy and Protection Act (ADPPA), significantly undermining its chances for further progress in this Congress. In comments to The Spokesman, Senator Cantwell stated that “[i]f you’re charitable, you call it ignorance” regarding the House’s approach to privacy enforcement and suggested that civil rights groups supporting ADPPA have “been infiltrated by people who are trying to push them to support a weak bill.” In separate comments to the Washington Post Cantwell expressed that she is not planning to bring ADPPA to a markup because “I don’t even think Nancy Pelosi has plans to bring it up, so pretty sure we’re not going to be bringing it up.”

Senate Commerce Marks up Child Online Privacy & Safety Legislation 
Mere hours after ADPPA advanced from the House E&C Committee, Senator Cantwell called a Senate Commerce markup of two bills, S.3663, the Kids Online Safety Act (KOSA) (Blumenthal (D-CT) & Blackburn (R-TN)) and COPPA 2.0, the Children and Teens’ Online Privacy Protection Act (Markey (D-MA) & Cassidy (R-LA)). At the July 27 hearing, KOSA advanced unanimously and COPPA 2.0 advanced on a voice vote with some Republicans opposing. Notably, Ranking Member Wicker expressed his support for the ADPPA, opposition to COPPA 2.0, and areas for future improvement in KOSA. Other Republican members expressed concern about the scope of FTC rulemaking authority in the COPPA 2.0 bill.

California Privacy Protection Agency Opposes ADPPA in Special Board Meeting
At a special California Privacy Protection Agency meeting on July 28, the CPPA board unanimously adopted three motions related to federal privacy legislation:

  1. Oppose the American Data Privacy and Protection Act as currently drafted
  2. Oppose any federal bill that seeks to preempt the CCPA or establish weaker privacy protections
  3. Authorize Agency staff to support federal privacy protections that do not preempt the CCPA or that create a true “floor” for privacy protections that states can build on in the future.

Board members raised the following concerns about the ADPPA:

  • Chairperson Urban: Expressed concern that the ADPPA would cause Californians to lose the privacy rights they currently enjoy today. She further expressed concern about losing the CCPA (as amended by the CPRA)’s constitutional floor, which she called a direct response to industry efforts to weaken the bill.
  • Board Member Thompson: Stated that the ADPPA represents a “false choice” by treating privacy rights as if they are limited in supply, and rests on the flawed argument that Californians’ strong privacy rights must be taken away in order to provide weaker rights federally. He further questioned the ADPPA requirement that unified opt-out signals must be authenticated.
  • Board Member de la Torre: Raised concerns that the ADPPA would jeopardize the ability of California to receive a state-specific EU-adequacy determination. She further raised concerns about the ADPPA overruling privacy laws of local municipalities, not just states. She also argued that the preemptive effect of the ADPPA has not yet been fully explored and that it could strike down laws that protect women in the wake of the Dobbs decision.
  • Board Member Sierra: Raised concerns that the ADPPA would limit the enforcement effectiveness of the Agency.
  • Board Member Le: Cited CPPA Deputy Director of Policy and Legislation Maureen Mahoney’s staff memorandum to argue that the ADPPA is weaker than the CCPA because the ADPPA: (1) would deprive Californians of the right to opt out of automated decision-making; (2) covers fewer service providers (excluding those that perform work for government entities); (3) does not clearly cover inferences; and (4) requires impact assessments for fewer types of businesses.
  • Executive Director Soltani: Unequivocally stated that the ADPPA would be weaker than the CCPA on substance. He further argued that California’s existing law is more interoperable with other state and international privacy frameworks.

Future proof your privacy program with WireWheel’s Trust Access and Consent Center and WireWheel’s Privacy Operations Manager.

  • Marketing
  • Regulations

Data Governance, Metadata Management, and Consent and Preference Management Software

Consumer data drives business. However, pressures from regulators and consumers often complicate the data landscape, making it difficult for organizations to collect and use this information. Data governance, metadata management, and privacy software can function independently of one another or together to create a robust technology stack that drives regulatory compliance, consumer satisfaction, and effective marketing.

Data Governance Software

What is data governance software?

Data governance is the process of managing and organizing data so that it is available, usable, consistent, accurate, and secure at an organization-wide level. Data governance serves as a foundation for effective data collection, analysis, and decision-making.

Data governance software is a tool that allows organizations to organize, manage, and protect the sensitive information they collect and want to use to enhance their business. Some data governance tools can also automate audits, optimize workflows, and demonstrate compliance.

What does data governance software do?

Data governance software benefits include:

  • Data visibility and quality – By connecting and integrating information across systems, data governance software tears down organizational data silos that can lead to inaccuracies. Improved data visibility allows for holistic reporting and decision-making.
  • Data security and compliance – Data governance software monitors regulatory updates in order to flag risks and provide actionable steps to achieve compliance in real time.
  • Automation – Automating audits and other data management processes helps organizations run efficient data governance programs while also saving time and money.

Who uses data governance software?

In large, enterprise organizations there is likely a data governance office with a data governance team. In a typical enterprise, a data governance team could be made up of data analysts, compliance specialists, and data governance architects.

Metadata Management Software

What is metadata management software?

Metadata management is a subset of data governance that involves the details that describe collected consumer data. This information typically relates to attributes around the data collection: its origin, current location, owner, access controls, and audit trails. Metadata can inform organizations about the value of the content they collect from consumers and accordingly govern the appropriate use of this data.

What does metadata management software do?

Metadata management software can increase visibility and understanding of a company’s data across various teams and systems. Metadata can make decisions about how data is used easier, promoting efficiency and compliance. The software also lets users edit and oversee data categorization, and can simplify the management and retrieval of this information.

Without metadata management software, companies will struggle to determine what content they have and how it can be used. As privacy legislation evolves and more regulation is placed around data, metadata management systems are becoming a critical piece of any organization’s data infrastructure.

Who uses metadata management software?

Metadata management software helps data engineers and IT teams who need to constantly manage and interact with data. From sales and marketing tools to internal HR, metadata management is critical in understanding how data can be used in an organization.

Consent and Preference Management Software

What is consent and preference management software?

Consent and preference management software can work within a data governance framework to help define the ways that consumer and user data can be used across an organization. Some of the best examples allow users to see what kind of information is stored, how it is used, and where it is transferred. Consent and preference management often falls into the privacy software space and can also help automate privacy assessments and data requests.

What does consent and preference management software do?

Consumers want to know that the companies they entrust with sensitive information will not abuse the exchange. Consent and preference management software helps companies build the infrastructure needed to give consumers the confidence and control to manage that consent.
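As a sketch of what such software manages under the hood, the minimal Python example below models a per-purpose consent record and looks up a user’s most recent decision. The field names and purposes are illustrative assumptions, not any vendor’s schema:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    """Illustrative consent record; field names are assumptions, not a standard."""
    user_id: str
    purpose: str          # e.g. "targeted_advertising", "analytics"
    granted: bool
    recorded_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

def latest_decision(records: list, user_id: str, purpose: str):
    """Return the user's most recent decision for a purpose, or None if never asked."""
    matches = [r for r in records if r.user_id == user_id and r.purpose == purpose]
    return max(matches, key=lambda r: r.recorded_at, default=None)
```

In this sketch the absence of a record means consent was never requested, which an opt-in regime would treat the same as a refusal; keeping the full decision history, rather than only the latest flag, is what makes consent demonstrable to a regulator.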

Who uses consent and preference management software?

Privacy professionals, IT teams, and data governance professionals all play a part in the success of a consent and preference management platform. The software is used as a tool to help implement, automate, and enforce data usage policies throughout an organization.

Privacy Laws are Driving Unification Across Organizations

Privacy laws are making it critical for organizations to truly understand and consolidate the data they are collecting and how they are using it. Data acquisition, creation, storage, and use can no longer be managed on a team-by-team or business-unit-by-business-unit basis; they should adhere to centralized guidelines that meet privacy and regulatory obligations.

Combining the power of data governance, metadata management, and consent and preference management enables organizations to unify, regulate, and utilize data without compromising the trust of their users. When these three types of software work together, companies can confidently use consumer data in an effective and reliable manner. Marketing teams can still do marketing, sales teams can still make sales, and consumers can trust that your organization respects their rights.

  • Marketing
  • Regulations

Consent – Beyond Compliance

Consent management is no longer optional. Law and regulation have made implementation mandatory with several state laws requiring consumers to opt-in/opt-out of targeted marketing. Consent and preference management is top of mind for brands, publishers, and the whole of the adtech and martech ecosystem. The concerns go well beyond compliance.

Two camps seem to be emerging. One camp views these requirements solely as a burden that impedes their go-to-market capabilities, but with which they are forced to comply. The second sees opportunity and is already positioning for what they view as competitive advantage.

To discuss Consent – Beyond Compliance, Ann Smith, WireWheel Director of Demand Generation moderated a discussion with Arnaud Gouachon, Contentsquare Chief Legal Officer, and Kara Larson, 6sense Principal Privacy & Compliance Counsel at the 2022 Summer Spokes Privacy Technology Conference (held June 22-23).

Consent and preference management is a top-of-mind challenge

The overall context, in my view, goes beyond the legislation. Legislation may be the consequence or may be the cause, but it’s only one part of the trend that – whether brands like it or not – is coming. As an actor in this ecosystem, you can decide what you want to do with it, but it’s coming.

—Arnaud Gouachon, Contentsquare

“Consent and preference management is top of mind for a lot of our customers,” says Larson. “We get a lot of questions around ‘what do I need to do for a cookie banner, and how do I allow this to fire?’ We’re also starting to see a lot more questions about consent, especially with respect to the CPRA, and based on the just released draft U.S. federal privacy bill (ADPPA).”

The first question 6sense is asked today is about straight-up compliance. Clients want to understand how to use a product and remain compliant, says Larson. Second, they ask about compliance across different jurisdictions. Interestingly, however – at least with respect to the B2B space – the definitions under the ADPPA exclude some B2B data.

Managing consent “goes beyond legislation and regulation,” opines Gouachon. “Obviously, we’ve seen a lot of new laws and regulations that are becoming progressively stricter. But beyond that there are, at least in the short term, some novel challenges.”

“Most striking to our team was the number of activists that are really passionate about privacy topics and that are getting increasingly active. I think everyone has heard of noyb, the Max Schrems organization. They recently launched an AI-based automation tool to automatically file complaints with local data protection authorities.”

Add to this increased scrutiny of marketing practices and ever more significant fines – particularly concerning cookies, transparency, user control of data, and dark patterns – and “with all this,” notes Gouachon, “the tech players, the brands, the tools, and the ad tech ecosystem have had some ambiguous responses.

“Some have been very clear about their intent. Others are trying to duck their heads and take a ‘wait and see’ approach.”

A cookieless future

State laws have been creating exemptions for B2B or business-context data. They recognize that there is perhaps a lesser interest in keeping that data private or closely held. It’s the difference between dropping your business card on the ground versus your driver’s license:

In one of those cases, you’re going to run back and try to find it.

—Kara Larson, 6sense

“With respect to the federal privacy legislation,” continues Larson, “currently there is preemptive language in the proposed draft of the bill…so we aren’t looking at a patchwork of different laws and standards.”

“But regardless of what the patchwork of U.S. Privacy law looks like – and whether or not there’s a federal privacy law – we are seeing technology players start to make their own independent moves. Particularly when we get to third-party cookies.”

“We’ve seen Firefox do away with them, Apple and Safari do away with them, and Google has announced plans to also get rid of them. So, when we’re talking about third-party cookies, there’s not even the opportunity to try to collect consent anymore once they go away.”

“I can only agree,” says Gouachon. “The trend is here to stay. It is targeting mostly third-party cookies right now (but first-party cookies may be next).

“Brands and the technology solution providers can wait, or they can try to get ahead of the game. Last year Contentsquare launched our first cookieless solution – it’s not relying on any cookie technology at all (third-party or first-party cookie). This is in response to concerns that we see from some customers and brands. They are in effect already asking for solutions like these ahead of the legislation.”

Less data, more relevance for a better UX

The advent of cookies was a seismic shift for marketers

The amount of data and tracking you could do was enormous and no one at any point stopped to ask ‘do we need all of this data? If someone is shopping for shoes, do I really need to know that they drive a Kia and have a dog?’

There hasn’t been a lot of internal reflection about how useful it is.

—Kara Larson, 6sense

There is going to be a shift, offers Larson. Not necessarily one-to-one replacement but rather a holistic approach. “Maybe you’re concerned about display ad reach, specifically with targeting,” absent cookies. To compensate you can start to look at “relationships with these so-called ‘walled gardens’ like LinkedIn and Facebook because they are not relying on third-party cookies to do that identity resolution. You’re also looking at alternative identifiers.”

Several solution providers like LiveRamp are trying to solve this, notes Larson. In another approach, 6sense is making contextual marketing available to its customers – ads relevant based on what a consumer is currently viewing on a particular webpage. (If someone is shopping for shoes, you can forget about the dog.)

“What resonates most with me is the types of data brands are collecting about users,” responds Gouachon. At Contentsquare, “we think there is a way to personalize your approach and experience without compromising privacy.”

“How much demographic information do you really need about your users?” In a [brick and mortar] store a shopper isn’t asked dozens of questions about their personal information before a salesperson helps them, notes Gouachon. “So, we are really trying to replicate that experience and come as close to that experience as possible.”

There’s a misconception that online visitors are invisible. Not true. They give you real time feedback through all of their digital interactions without having to know anything else.

Understanding what they are trying to achieve online and how they want to go about it is really the key to delivering a superior customer experience.

—Arnaud Gouachon, Contentsquare

This is what our product team calls an intent-based approach, explains Gouachon. “We’re really interested in fixing what’s not working for them in their online journey and addressing those aspects that are frustrating the users and moving them away from their goal.

“While complying with legislation is table stakes – a no brainer – it’s probably much more valuable to innovate solutions that anticipate people’s desire for privacy. That trend is here to stay and it’s not just a legal or regulatory trend…. It’s a desire that most users have.”

Positioning for the future of privacy compliance

As discussed here, marketing and privacy have been on a collision course for some time now. But the deprecation of cookies does not have to be a win-lose scenario. It can be win-win. (Apple and others clearly see it that way.)

Indeed, there are benefits to be gained from a cookieless approach. The benefit is really “an opportunity to build a trusted relationship with customers, partners, end users, employees, regulators, and NGOs,” says Gouachon. “What we sometimes call ‘digital trust.’”

While there are many initiatives brands and stakeholders can implement, it begins with transparency, insists Gouachon: “Moving from checkboxes and cookie consent banners to a more centralized and user-friendly ‘privacy center’ approach.”

“It’s not compliance for compliance’s sake,” avers Larson. “It’s about how you want to position yourself and your brand. Do you want to be at the forefront in thinking about these issues before they occur? Before they become a problem? Or do you want to spend all of your time trying to play catch up?”

Anything you do to increase the transparency of what’s happening with data – how it’s going to be treated, how it is going to be protected – raises that overall trust level.

And that’s how you start to flip privacy from roadblock to asset that builds your brand.

—Kara Larson, 6sense

This requires technology that “can respect and respond to a global privacy setting or ‘do not track’ signal,” cautions Larson. “And we have seen with these proposed draft regulations for the CPRA that they are doubling down on the requirement to respect those signals.” The ability to do that effectively is, for Larson, “a key differentiator for a consent and preference management platform.”
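While the panel didn’t get into implementation details, the kind of signal Larson describes is exemplified by the Global Privacy Control (GPC) proposal, under which a participating browser attaches a `Sec-GPC: 1` request header (and exposes `navigator.globalPrivacyControl` to page scripts). A minimal server-side check might look like the Python sketch below; the function name and dictionary-based header handling are our own illustration, not tied to any particular framework.

```python
def honors_gpc(headers: dict[str, str]) -> bool:
    """Return True if the request carries a Global Privacy Control opt-out.

    Under the GPC proposal, a user agent signals the opt-out preference
    by sending the `Sec-GPC: 1` request header.
    """
    # HTTP header names are case-insensitive, so normalize keys first.
    normalized = {k.lower(): v.strip() for k, v in headers.items()}
    return normalized.get("sec-gpc") == "1"


# A GPC-bearing request should be treated as a 'do not sell or share' opt-out.
print(honors_gpc({"Sec-GPC": "1"}))        # True
print(honors_gpc({"User-Agent": "test"}))  # False
```

In practice, a consent and preference management platform would feed this boolean into its opt-out logic alongside stored user preferences, since the draft CPRA regulations treat such signals as valid opt-out requests.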

She warns that trying to do this manually is a nonstarter. Automation is a prerequisite not just for compliance, but for going beyond compliance and thriving in the fast-approaching world of consumer-focused transparency and trust.

Consent is about more than cookies, and brands are looking for ways to get ahead of compliance in order to turn these challenges into opportunities. Looking to have more conversation around consent and the future of privacy? Let’s talk.

Listen to the session audio

  • Privacy
  • Regulations

What to Expect from Privacy Laws in 2023

Moderator Michael Hahn, EVP and General Counsel of IAB and IAB Tech Lab, brought together a panel at the 2022 Summer Spokes Privacy Technology Conference (held June 22-23) to discuss the impact of various state privacy laws on digital advertising activities.

Joining Hahn for the session “What to Expect in 2023,” which offers practical guidance on managing compliance, are WireWheel Founder and CEO Justin Antonipillai; Sundeep Kapur, Senior Associate, Cyber, Privacy & Data Innovation at Orrick; and Crystal Skelton, Senior Corporate Counsel, ZipRecruiter®.

One button or two?

It depends on the ways in which a business is selling or sharing data. “If selling data in ways beyond the sharing for targeted advertising,” says Skelton, “then it might make sense for a company to offer two. But if they’re conducting solely targeted advertising, I would expect to see one.”

“But it’s also a balancing act. I often wonder whether having two separate links could make it less likely that consumers will exercise both of their opt-out rights. I do appreciate that the draft CPRA regulations provide an alternative opt-out link option…but, as many of you already know, there are additional requirements that come with that, including the use of an icon.”

“It remains to be seen whether companies will more broadly adopt the icon with the alternative opt-out or choose to offer one or more links on their website. Having this more neutral language helps provide flexibility, especially in states that have similar requirements but aren’t as prescriptive about the language that must be used.”

“Just don’t make it look like GDPR in Europe”

As much as our community spends time understanding what targeted advertising and cross-contextual behavioral advertising is, it’s not the kind of thing you can just show up at your family reunion and everybody knows what you’re talking about. And we’re seeing a lot of focus on how to make this easy to understand.

—Justin Antonipillai, WireWheel

“There’s a lot of hesitancy about putting on a link that says, ‘do not sell my personal information’ or ‘do not sell or share,’” says Kapur. “Providing privacy options seems a more brand-friendly way of providing that experience.” The link can be used for multi-state compliance, but it depends on what you want to combine.

“If you’re a publisher, maybe you want one link, and you’ll just drop the traffic if they opt-out. Same with advertisers. If you’re using internal data, maybe you want two: one for targeted advertising and one for sale and sharing because you’re relying on first-party data.”

“Request number one,” says Antonipillai, “when implementing consent mechanisms – especially for companies who have experience abroad – is ‘I don’t want this to feel and look like it does in Europe’”: a customer experience where it is “really difficult to do almost anything when you go to a website or an app.” Companies in the U.S. feel that approach will cause people to disengage.

“We hear a lot of concern about making the user experience too complicated,” relates Antonipillai. “But if you start to abstract a lot of the individual choices, you get to a lot of different individual choices that might need explanation.”

The New Complexity of Consent Choice

The question becomes ‘how do we make it simple and create a frictionless user experience?’

Legal Consent is definitely NOT a one-size-fits-all solution

“How does ‘do not sell’ differ from ‘do not share’ or ‘targeted advertising’ as those terms are used in CPRA and the other state laws?” asks Hahn. “There is quite a bit of overlap.”

‘Do not share’ covers disclosures specifically for cross-context behavioral advertising. That is to say, targeted advertising based on data obtained across non-affiliated digital properties. And any disclosure for that purpose requires a ‘do not share’ opt-out mechanism.

—Sundeep Kapur, Orrick

Whereas, Kapur notes, ‘do not sell’ is “any disclosure for any sort of consideration.”

There are some differences. ‘Do not sell’ is narrower. As a simplified example, Kapur offers this: if you’re sharing data with a measurement provider – measuring campaign effectiveness – that may not be considered a share, but could still be considered a sale if you don’t have a service provider agreement in place.

“For targeted advertising, the CPRA uses ‘do not share my data for cross-context behavioral advertising,’ while the non-CPRA laws have an opt-out for the processing of personal data for targeted advertising. They don’t focus on the disclosure of data for targeted advertising, but more generally the processing of it.”

And that’s not “just disclosing data through the bitstream,” advises Kapur, “which could be a share and also processing for targeted advertising.” It also impacts “publishers that use a combination of third-party data and their own data, and the targeted advertising opt-out would cover that.”

“So, in a bit of an ironic way, the opt-out for targeted advertising under laws like Virginia’s and Colorado’s is actually broader than the ‘do not share’ under California’s,” says Kapur.

“It’s a really important point to emphasize that the scope of targeted advertising is broader than the opt-out for share-in furtherance of cross-context behavioral advertising.”

This answers the contention that I hear all the time, which is ‘California must have the most stringent laws, so if I just comply with those, I must be good everywhere else.’ There is a fallacy in both the premise and the conclusion, because actually the other laws are broader.

—Michael Hahn, IAB

“It’s definitely not one-size-fits-all. That’s the issue with having state-by-state privacy laws,” says Kapur.

Why do privacy laws have such a broad definition of “sale” of information?

“If you have a broad definition of ‘sale,’ with disclosures for monetary or other valuable consideration,” opines Kapur, “one could make an argument that when data is sent to a random ad server or across the pond to an adtech partner,” there is no valuable consideration there. “It is just disclosure.”

Ultimately, the broad definition of “sale” is there to ensure that disclosures for cross-contextual advertising are covered and that the business has some sort of opt-out mechanism.

I recently read an interview with Alastair Mactaggart stating that too many industry attorneys were taking the narrow view of sale (which was really never sustainable). Mactaggart saw that position being taken, so he put into the ballot initiative this new concept, which has been copied into all the other state laws.

It wasn’t the most rational conclusion from a drafting standpoint, but it was a result of not the most rational approach being taken in certain quarters of industry.

—Michael Hahn, IAB

“The primary difference under the definitions is whether it includes valuable consideration in addition to monetary consideration,” notes Skelton.

“In California, Colorado, and Connecticut, valuable consideration is included in the definition, whereas in Virginia and Utah, it’s not.” All States, however, include some sort of separate targeted advertising or cross-contextual behavioral advertising component.

It’s an interesting place to be in right now because you want to potentially have a single mechanism to comply across the board. But you’re essentially playing whack-a-mole when you get these various state laws with different definitions, components, and requirements. It can be a precarious place to find yourself when you’re thinking about across-the-board compliance on a state-by-state basis.

—Crystal Skelton, ZipRecruiter

New contractual obligations under US privacy laws

“It’s not sustainable to have separate contracts for separate jurisdictions,” states Skelton. “For example, often, when you’re doing targeted advertising, you’re targeting consumers nationwide (or you’re using third parties to do so) and not necessarily using a state-by-state approach.

“Updating privacy and data security addendum templates to include the greatest common denominator to address all these state requirements may be a good approach, at least to start with, but you are going to have to navigate those specific differences in definitions and compliance requirements.

“Keep in mind,” cautions Skelton, “that the CPRA draft regulations include due diligence requirements for service providers and contractors. How can one reasonably do that in order to rely on the liability defense under the CPRA?”

“In some cases, it’s very difficult (though not impossible),” offers Kapur, “to get the right contractual privity. Certainly, when talking about the adtech ecosystem. For example, under the CPRA there is a requirement whereby if you are sending data to a third party (i.e., a non-service provider), you need to have a contract in place with that third party describing the nature of the sale/share and other information.”

In some cases, if we take the broad view – which is certainly the view that regulators have been taking – looking into the nitty-gritty of the ecosystem, it can be really difficult to get to where everyone can sign on to something without some sort of industry-wide mechanism.

—Sundeep Kapur, Orrick

“For example, when you’re pinging a third-party ad server, that discloses personal information plus an IP address. If we’re going to take the broad approach and err on the side of caution, how do we get an agreement? There’s definitely an issue there.”

Liability, compliance, and diligence

“Just when you thought you knew the law, you didn’t,” says Hahn. “You thought you didn’t have liability for your partners unless you had knowledge – or reasonably should have known – what they were doing. But now you don’t have that insulation unless you’re doing diligence.”

It took a little digesting just to wrap my head around what does this actually mean in practice. The sheer scope of what is potentially required by this. It not only goes through the procurement process. You’re looking at new vendors and your agencies.

—Crystal Skelton, ZipRecruiter

“You have to set up a regular cadence for review…it’s a tough position to be in. How do you start tackling this? Do you put in place these due diligence requirements now, or do you take a wait-and-see approach? These are draft regulations that may change. And it’s a significant burden,” opines Skelton.

“I have been hearing about a few different approaches,” says Hahn:

  1. An entirely unique experience with respect to each state
  2. Treating California consumers one way and then creating a common experience that complies with all the other laws (as those laws have greater commonality with each other than they do to California), and
  3. Not determining the location or residency of anyone who comes to the site and taking a national approach, trying to create a common set of baselines that will (hopefully) comply with all of the laws.

“A very important voice in this entire process is the CMO and the head of digital marketing who are trying to think through the customer experience,” opines Antonipillai.

Even when one explains what the choices are supposed to be to the consumer, and you start trying to make it simple, it comes across as very confusing. It’s exceptionally hard to explain what the consumer’s choices are, even to an expert audience.

I see a lot of motion towards simplicity – trying to get a good consumer experience.

—Justin Antonipillai, WireWheel

Antonipillai goes on to note that WireWheel has been helping clients implement due diligence requirements for some time, “but I wouldn’t have guessed that it would have to be for everybody under all circumstances. That’s a huge undertaking.”

Importantly, says Antonipillai, the draft CPRA regulations “suggest that California is generally an opt-out place. However, if you’re using data in a way that’s not reasonable and proportional to the way that the consumer believed it would be used, it almost starts to suggest that it becomes opt-in.”

This too makes the consumer experience very tricky. And tricky for business.

Looking to learn more about what is coming in 2023? Let us help you in your compliance journey.

Listen to the session audio

  • Privacy Law Update

Privacy Law Update: July 25, 2022

Stay up to date with this weekly release covering key developments on data privacy laws, technology, and other hot privacy topics!

Newsworthy Updates

American Data Privacy and Protection Act heads for US House Floor

Despite facing a time crunch, a flood of stakeholder feedback and unforeseen Congressional opposition, the proposed American Data Privacy and Protection Act keeps on chugging. The bill’s next act will come on the U.S. House floor after the House Committee on Energy and Commerce markup July 20 resulted in a 53-2 vote to advance the bill to full House consideration. The vote to advance marks the first time a comprehensive privacy bill will be made available for a full chamber vote in either the House or the Senate.

State Attorneys General Oppose Preemption In Proposed American Data Privacy And Protection Act

California Attorney General Rob Bonta led a coalition of ten attorneys general in urging Congress to respect the role of states to enforce and provide for strong consumer privacy laws while advancing legislation enacting long-overdue privacy protections nationwide. The states call on Congress to create a baseline of consumer privacy laws that does not preempt states’ ability to respond with legislation to address changing technology and data protection practices. Numerous states already have strong privacy protections in place — including California — and state laws and enforcement are critical to protect consumers and their data.

How UK Data Protection Bill Stacks Up With EU GDPR, ePrivacy Framework

On July 18, 2022, the UK government introduced the Data Protection and Digital Information Bill (‘DPDI Bill’) to Parliament. Previously known as the Data Reform Bill, it is the result of a consultation from 2021, and its aim is to update and simplify the U.K.’s data protection framework. According to the U.K. government, the new legal framework created by the DPDI Bill will reduce burdens on organizations while maintaining high data protection standards.

How Canada’s CPPA Differs From PIPEDA

In 2020, Shaun Brown wrote about what he considered a significant flaw under the proposed Consumer Privacy Protection Act in Bill C-11, which was tabled in November 2020, and then died when the federal election was called in 2021. Bill C-11 retained the definition of personal information — information about an identifiable individual — but introduced a new concept of “deidentify.” This seemed to, by implication, alter the concept of personal information, expanding the scope of federal privacy legislation and tossing away years of judicial guidance in the process. Bill C-27 would do this as well, though in a slightly more complicated way.

CAC Readies $1B Fine For Data Security Violations

The Cyberspace Administration of China plans to fine Chinese ride-hailing company Didi Chuxing more than $1 billion in relation to alleged insufficient data security practices, The Wall Street Journal reports. The fine is the last remedial step Didi faces as part of a yearlong investigation by the CAC, which removed the company’s mobile applications from China’s app stores over data security concerns in July 2021. Payment of the fine would restore Didi apps and allow the company to begin a new share listing in Hong Kong.

Different Approaches to Data Privacy: Why EU-US Privacy Alignment in the Months To Come Is Inevitable

Even though it is hardly disputable that the origins of modern data privacy, as well as of computer technology, are to be found in the US, it is currently the EU, with its GDPR, that sets the global tone for what is the generally accepted privacy standard, especially for multinational companies operating worldwide.

Examining the Intersection of Data Privacy and Civil Rights

For historically marginalized groups, the right to privacy is a matter of survival. Privacy violations have put these groups at risk of ostracization, discrimination, or even active physical danger. These tensions have long pre-dated the digital age. In the 1950s and 1960s, the government used surveillance programs to target Black Americans fighting against structural racism, with the Federal Bureau of Investigation’s (FBI) Counterintelligence Program (COINTELPRO) targeting Dr. Martin Luther King, Jr. and members of the Black Panther Party. During the HIV/AIDS epidemic, LGBTQ+ individuals were fearful that, with an employer-based healthcare system, employers would find out about a doctor’s visit for HIV/AIDS and that individuals would then face stigma at work or risk losing their jobs.

Privacy Legislation

American Data Privacy and Protection Act: On July 20, the House Energy & Commerce Committee voted to advance the American Data Privacy and Protection Act to the full House by a 53-2 vote. The only nays were California Representatives Eshoo (D-CA) and Barragán (D-CA).

The Committee considered a number of amendments to the ADPPA, summarized below (in order of appearance):

  • The overarching Amendment in the Nature of a Substitute from Chair Pallone and Ranking Member McMorris Rodgers (discussed in yesterday’s message). The AINS was adopted by voice vote.
  • An amendment from Reps Lesko (R-AZ) and Kuster (D-NH) to exclude NCMEC (National Center for Missing & Exploited Children) from the Act was adopted by voice vote.
  • An amendment from Reps Trahan (D-MA) and Bucshon (R-IN) intended to clarify the ‘permissible purpose’ for sharing data for conducting public interest research was adopted by voice vote.
  • An amendment from Reps Castor (D-FL) and Walberg (R-MI) expanding ADPPA’s ‘Privacy by Design’ requirements to identify, assess, and mitigate privacy risks to minors in an age-appropriate way was adopted by voice vote.
  • An amendment from Reps McNerney (D-CA) and Curtis (R-UT) authorizing the FTC to promulgate regulations (in consultation with NIST) establishing processes for complying with the ADPPA’s data security requirements was adopted by voice vote.
  • An amendment from Reps Carter (R-GA) and Craig (D-MN) reinserting requirements for covered entities to appoint data privacy and security officers (but exempting businesses with under 15 employees) was adopted by voice vote.
  • An amendment from Reps Hudson (R-NC) and O’Halleran (D-AZ) reinserting revised language on service providers and third parties was adopted by voice vote.
  • An amendment from Rep. Eshoo (D-CA) that would limit ADPPA’s preemptive effect to only provisions of state laws inconsistent with the Act failed by an 8–48 vote.
  • An amendment from Rep. Walberg (R-MI) that would expand ADPPA carveouts applicable to small businesses was offered and withdrawn.
  • An amendment from Rep. Hudson (R-NC) that would explicitly provide that ADPPA covered entities will not be covered by FCC privacy laws and regulations was offered and withdrawn.
  • An amendment from Rep. Curtis (R-UT) focused on advertising that would provide, in part, that the definition of “targeted advertising” does not include “first party advertising or marketing” was offered and withdrawn.
  • An amendment from Rep. Long (R-MO) that would strike the ADPPA’s explicit grant of enforcement authority to the California Privacy Protection Agency (seemingly based on a concern that it could provide California a preeminent role in the interpretation and implementation of the ADPPA) was offered and withdrawn.

Future proof your privacy program with WireWheel’s Trust Access and Consent Center and WireWheel’s Privacy Operations Manager.

Request Demo
  • Regulations

Preparing for Federal Privacy Law Regulations Coming Down the Pipeline

We’ve all discussed the five state laws, but there is, for business, a great desire for a federal law, just because it’s way too complicated to manage so many different states, never mind the extra-territorial stuff. There is a desire for things to be easier and more favorable to business. It is a push-pull and it’s a very difficult line to walk.

—Susan Raab, Customer Data Platform Institute

Indeed, there has been much discussion concerning state privacy laws, their similarities, differences, and strategies for managing what’s coming in 2023 and beyond. Here we turn our attention to the emerging federal privacy law, which is gaining unprecedented momentum. To discuss “Preparing for Federal Regulations Coming Down the Pipeline,” a panel of experts joined moderator Cobun Zweifel-Keegan, D.C. Managing Director, IAPP, at the Spokes Technology Conference (held June 22-23).

Zweifel-Keegan was joined by BDO Privacy and Data Protection Director Jeremy Berkowitz; Susan Raab, Managing Partner, Customer Data Platform Institute; and Jessica L. Rich, Of Counsel at Kelley Drye, who was previously with the FTC for 26 years.

Federal privacy law backdrop

“Privacy has been on this country’s radar in a big way since the late 90s. We don’t have a comprehensive federal privacy law. Instead, we have sector-specific laws that apply to particular market sectors, entities, or data, like COPPA, Gramm-Leach-Bliley, FCRA, and FERPA,” notes Rich.

“The law with the broadest coverage is the Federal Trade Commission Act, which broadly prohibits unfair or deceptive practices, including in privacy and data security, and applies to a very broad array of entities.”

What many don’t realize is that the Federal Trade Commission (FTC) also has jurisdiction in areas where other agencies play a role, notes Rich. “For example, HIPAA is enforced by HHS and covers health entities, but the FTC has jurisdiction over many of those same entities. The same is true regarding COPPA- and FERPA-covered entities.”

But, the FTC Act, which is the main privacy law in this country, doesn’t set forth privacy standards. It only allows the agency to act after the fact and determine whether something’s unfair and deceptive. There are gaps in jurisdiction, gaps in remedies, and the FTC has very limited rulemaking capacity.

—Jessica L. Rich, Kelley Drye

“The result is 20 years of ongoing Congressional debate about whether to pass a federal privacy law.”

Rich notes that the FTC is considering rulemaking using its “very cumbersome authority,” viewing privacy through both a competition and a consumer protection lens, with a lot of focus on leveling the playing field between big and small companies, and using more substantive (and prescriptive) provisions like limiting the use of data (rather than the notice-and-choice approach).

What are we likely to see in terms of enforcement for federal privacy laws?

“I want to bring in the context of what we’re now calling the ADPPA: the American Data Privacy and Protection Act (H.R. 8152). That’s where all the attention has been in Congress over the last couple of weeks,” says Berkowitz, “and it is remarkable to see a bill right now where there seems to be a large consensus between both parties on what should be in this bill.”

That said, the history of attempts by the States, as noted here, demonstrates both cross-party cooperation and “intra-party strife. Both blue states and red states were not able to reach accord on this issue despite a strong desire from all concerned to pass legislation.”

And indeed, as Berkowitz goes on to note, “there seems to be a lot of consensus, particularly on some issues around preemption and the private right of action. However, in the Senate, Commerce Committee chair Senator Maria Cantwell (D-WA) is not yet on board with this bill. She hasn’t come out against it, but of all the players in both houses on the relevant committees, she’s the only one who has not signed off on this yet.”

New authority and consensus for the FTC

…getting money, money, money whatever way they can. By partnering with the states or alleging rule violations in creative ways, because you can get money when you allege rule violations, not just Section 5.

—Jessica Rich, Kelley Drye

Notably, the ADPPA provides a lot of new authority to the FTC, including: 1) a new bureau and staff to enforce the act; 2) the ability to promulgate rules, particularly around data minimization and consumer request requirements; and 3) a requirement for companies to certify once a year that they have a CPO and DPO.

“This bill – at least in its current form – is going to provide a lot of that much-needed authority,” opines Rich, who notes that FTC Chair Khan now has more power “to push through her agenda. Now that she has a majority, you’re probably going to see some more aggressive action over the coming months, regardless of whether this law gets passed or not.”

Rich further notes that FTC enforcement is both expanding and taking broad interpretations of existing laws. For example, “they basically took a very narrow rule – the health breach notification rule – and said it applied to every health app.”

One of the things that I would emphasize, though, is so far, this has all been through settlements. There are a lot of very good arguments that companies could make as to why some of this stuff goes beyond the FTC’s authority.

But most companies, given the cost of litigation, are going to settle.

—Jessica Rich, Kelley Drye

“But I think it’ll get a lot more interesting,” suggests Rich, “if companies start pushing back on some of these aggressive remedies.”

Children’s privacy likely to play key role in new federal privacy law

Zweifel-Keegan notes that historically, the FTC has been highly active on children’s privacy. The Children’s Online Privacy Protection Act granted specific rulemaking and enforcement authority over children’s privacy issues. And in 2019 the FTC began the process of updating COPPA.

“Children’s privacy is its own animal,” says Raab. “It’s a sector, but it’s also a kind of microcosm of privacy because it ties in with everything: education, healthcare, sports, all of it.”

But what is particularly unique about children’s privacy is “who’s allowed to give consent and what’s allowed to be held.”

As any parent knows, children’s data starts to be put out there before a child is born and on through their life. Data for which they had no input and were incapable of giving consent. At what point does the child or an individual get control of their own personal data that’s been out in the universe?

—Susan Raab, Customer Data Platform Institute

“The reason Education Technology (EdTech) is so important is that it is a giant black hole when it comes to protecting children’s data. There are lots of ways in, and a lot of people know it. It’s the weakest link in the data ecosystem.”

Importantly, reminds Raab, “if you can get children’s data, you can get the whole family’s data as well. A lot of places think they don’t need to worry because they don’t deal with children. But in fact, every company holds children’s data, if in no other place but in human resources, so it’s very complicated.”

“From a legislative perspective,” says Berkowitz, Senate Commerce Committee members Richard Blumenthal and Marsha Blackburn “could not be further apart…but have come together on the Kids Online Safety Act (S.3663) that was introduced in February.”

Raab points out that Senators Markey and Cassidy also have their own bill that looks to evolve COPPA. “It’s going to be interesting to see where things go from here.”

Teens don’t care about privacy

Part of the Children and Teens’ Online Privacy Protection Act (S.1628), notes Raab, is to “capture the tween category, the 13-to-15 category.” But even the privacy needs of young people at one age differ from their needs over time.

“Once you start to hit ages where youth can participate in it, where the children give consent, this migrates on through. With children, you always have different gatekeepers, whether it’s a teacher or a parent or caregiver. Plus, you have the children themselves.

I was involved in COPPA in its early stages and later. The FTC – not that it had control over this – when asked whether COPPA should be extended to teens, said the consent model doesn’t really work well with teens because they don’t care.

—Jessica Rich, Kelley Drye

“They certainly don’t want their parents giving consent for them,” continues Rich, “so when it came to actually extending COPPA, there wasn’t a lot of support for it. Now we have these bills that are more about age-appropriate codes: don’t serve up toxic content to teens, and needing to ‘know’ who you’re dealing with.

“Maybe get consent from them or their parents for certain uses of data, but it’s complicated, and it’s not easy when it comes to teens. I do think it’s important that any protections that apply to adults in a large bill like the ADPPA apply to everyone, and that there are additional protections for teens and kids.”

The knowledge standard

“There was a hearing…an active discussion about the ‘knowledge standard’ and how a company can know they’re dealing with a child or a teen,” relates Rich.

“And what they put out [in the ADPPA] is a draft that settled on ‘knowing.’” What knowing means exactly is a concept, as Rich notes, on which the courts have opined.

“This gets to what companies can know, what do they want to know, and, sometimes, what they want to know is not so much if it gets in the way of what they want to do,” said Jessica Rich.

When you’re thinking about the children’s components and some of the new rules that are going to be around duty of loyalty and data minimization, there is a consensus that this is a growing problem. We also need to think about how we want to manage it from a risk perspective.

—Jeremy Berkowitz, BDO

However all of this plays out, and whenever (and if) a federal privacy law finally arrives, one thing is certain. “In terms of the trajectory of how far we’ve come, it’s incredible,” opines Rich. “Privacy has arrived for Republicans and Democrats. For businesses and for consumers, it’s a huge shift.”

Listen to the session audio

  • Privacy
  • Regulations

How To Become A Chief Privacy Officer (CPO)

The privacy job market continues to accelerate. There are more entrants every day and robust movement between companies. Privacy is now recognized by many companies not simply as a compliance requirement, but increasingly as an area of competitive differentiation critical to business success. As such, those accumulating requisite experience and know-how are sought after and moving up the ranks as programs mature and capture increased funding.

One thing is clear: there has been no single pathway into privacy or to its top slot. The road for today’s leading privacy professionals is typically a winding, forked, anything-but-straight path. This is evidenced by the many panelists and attendees at the Summer 2022 Spokes Privacy Technology Conference (held June 22-23), and every year since its inception.

So, what does it take to secure the top slot? How does one become a Chief Privacy Officer (CPO)?

Host George Ratcliffe, Head of Data Privacy and GRC Search at executive search firm Stott & May, was joined by VMware’s VP and CPO Stuart Lee and by Noga Rosenthal, CPO and General Counsel of adtech company Ampersand, to discuss how to become a CPO and provide sought-after insight.

Key CPO attributes

“Privacy is one of those fields where you really need to have a broad understanding. It’s a renaissance field in every sense of the term. As a privacy leader, you will be pulled into meetings from HR, sales, and marketing to IT and Customer Success. You really need to be able to understand the different needs and areas of the business.

—Stuart Lee, VMware

As a result, a successful privacy leader needs to be able to speak the language of the business and industry that you’re in to really help get your point across.”

And of course, as Lee notes, you need a deep understanding of what the privacy requirements are, how they play out globally, and how they impact your business. “There really is a lot there.”

Importantly, “it’s not just enough to be able to recite law verbatim. You need to understand how you can communicate those requirements back to your stakeholders” while maximizing value for the company and doing right by the customer.

“I think a lot of us here today, and indeed anybody who’s a CPO, DPO, or privacy leader in any way, walks that tightrope every single day: how do we make sure we’re meeting [customer] requirements and expectations, while helping our business to do what it needs to do.”

For Rosenthal, the soft skills are critical. “Being able to speak clearly and make things understandable to everyone, and not speaking in such a high level that nobody has any idea what you’re talking about.

I remember the first time I said to my marketing team, ‘Hey, I don’t want to use the word anonymous; can we use the word pseudonymous?’ And they all just looked at me like, ‘I can’t even spell that, what are you talking about?’

—Noga Rosenthal, Ampersand

The ability to be flexible and to deal with ambiguity comes next on her list of critical “soft skills.” She cautions that what can make the job so difficult is that the laws aren’t very clear. “We have to use our instincts, we have to use benchmarking, we have to look at risk. And sometimes people want a job that’s very clear cut with drawn lines. This is not that job.”

Lastly and perhaps most importantly, it’s about building trust.

Are certifications really necessary?

Rosenthal has a significant history as a privacy professional, and the question of certifications “was interesting to me.” She relates: “I was going to DC, speaking on panels, I was applying for a job, and they asked me for my CIPP certification. I thought, ‘really?’” To be clear, she does think certifications are helpful. “It’s a checkbox.”

“There’s no replacing grey hair,” says Lee. “You just have to have the experience of going through a lot of the [privacy] exercises, because privacy has long been principle based, and when you’re interpreting principles and applying them to what you do, that’s often based on the experience of what you’ve seen work and fail.”

—Stuart Lee, VMware

Lee tells the people he works with that “it’s great to have the certifications, but if you don’t have the experience to show that you know what that means it doesn’t really help. That’s the key part.”

Stuart notes that years ago it was the Data Privacy Officer (DPO) that was “the first person invited to any party, but in the event of a regulatory investigation or an incident your DPO became the host of the party.”

The privacy lead is the one involved in incident response and working with regulators. “Trust is incredibly important and it’s a 360-degree relationship with your stakeholders, customers, and the regulators. There is no replacing experience.”

Ultimately, while credentialing may help at the start, becoming a Chief Privacy Officer (CPO) is about experience.

What is the executive recruiter looking for?

“The all-arounder,” says Ratcliffe. “We get a lot of questions about the lawyer vs. non-lawyer debate, but setting that question aside for the moment, it’s absolutely somebody who can cover all the bases. And somebody who is an excellent communicator and can build that trust.” Interestingly, says Ratcliffe, “the conversations around the softer skills go on for far longer and are far higher up the agenda than the harder skills that come to you later on around the technology side.

The softer skills and the cultural alignment between the company and the individual take up 75% of the conversation. And is the role right for you, or is it just the title?

—George Ratcliffe, Stott & May

It’s not enough just to say, ‘I’ve managed DSARs, and I’ve done X number of reviews.’ It’s how you have actually made the process a genuinely strategic initiative, and how well you understand how those activities are going to impact the business.

Ultimately, “when we’re looking at executives, whatever the industry, being able to tie in business objectives and goals [in terms relatable to the various stakeholders] is super important,” says Ratcliffe.

Clearly, the CPO role requires the ability to relate the arcana (and nuances) of privacy to stakeholders across the organization – from sales and marketing to IT and operations, from executive management to your consumers – to gain the buy-in necessary to be successful. This takes exceptional communication skill.

“The harder skills,” opines Ratcliffe, “are easier to develop with certification training courses and the other types of things that you should be picking up naturally as you develop throughout your career.”

Lawyer versus non-lawyer

A legal background has “certainly been the path of least resistance for filling the role of a privacy officer,” offers Lee.

“Where it becomes super interesting is when you really think about what your chief privacy officer is charged to do, versus what a data protection officer is charged to do, and what counsel is provided to do.

You’ve done the education on the CIPP; you have the experience of doing privacy risk management and DSARs – often working directly with the business. None of those things you picked up along the way were learned by completing a legal education. You got them through your experience in the field.

—Stuart Lee, VMware

That said, continues Lee, you absolutely have to have a legal counterpart. “I would also argue that if you’re a CPO, who is a lawyer, you should have a really good business operations person with you as well,” suggests Lee.

In fact, when you look at Data Privacy Officers (DPOs), who have been around much longer than CPOs, research showed that approximately 28% were lawyers and another 28% were IT professionals, notes Lee. “There’s a huge kind of balancing act and it’s often determined based on the industry you’re in. If you’re in a very highly regulated industry, it will likely lean more towards favoring lawyers as a CPO.”

As a lawyer, and playing devil’s advocate, Rosenthal counters, “of course, it should be an attorney because you’re taking laws, interpreting them, and that’s what legal should be doing” rather than breaking the role in two and having privacy go to legal to get the interpretation.

“Another piece to consider” offers Rosenthal “is you’re doing contract negotiations. You want it to be an attorney. When you’re negotiating, you’re usually negotiating with another attorney, so you need two attorneys talking to each other, though that’s not always the case.”

However, some of the strongest CPOs out there are not attorneys.

—Noga Rosenthal, Ampersand

So, asks Ratcliffe, are there challenges for someone coming from a legal background?

“I’ve had attorneys who work for me on the commercial side switch over to privacy, and the greatest struggle,” says Rosenthal, is the lack of clarity in the law. “They don’t get that.” She further points to the disadvantage stemming from a lack of knowledge regarding things like cookies and browsers.

“Your IT guy knows all your networks, all your systems, and that’s a huge advantage. You can jump right in and understand where the data is coming from.” Conversely, this is where the lawyer has to go elsewhere for that knowledge.

Lawyer or not – privilege, contract negotiations, and interpretation of the law notwithstanding – it is fair to say that the CPO can’t go it alone, all-rounder or not. But the emerging law in the U.S. is clear: whatever your background, a CPO must be “qualified.”

Listen to the session audio

  • Privacy Tech

Privacy Law Impacts to AI and High-Risk Processing

AI and High-Risk Processing

Day one of WireWheel’s Summer 2022 Spokes Data Privacy Technology Conference featured the discussion AI and High-Risk Processing, focusing on the regulation and development of privacy law concerning artificial intelligence (AI) and automated decision-making.

Moderator Lee Matheson, Senior Counsel for Global Privacy at the Future of Privacy Forum, was joined by several leading experts including his colleague Bertram Lee, Sr. Counsel on Data and AI at the Future of Privacy Forum. Notably, Lee very recently gave testimony before the House Commerce Committee on the newly introduced proposal for an American federal data privacy law.

Also joining were King & Spalding LLP Data Privacy & Security Partner Jarno Vanto and Christina Montgomery, Vice President, CPO, and Chair of the AI Ethics Board at IBM.

Montgomery also serves as a member of the Commission for Artificial Intelligence Competitiveness and Inclusiveness as part of the U.S. Chamber of Commerce and has recently been appointed to the United States Department of Commerce’s National Artificial Intelligence Advisory Committee.

The Artificial Intelligence (AI) regulatory state of play

In terms of how artificial intelligence is starting to be regulated, we’re seeing the world fall into three different camps:

  1. The more prescriptive approach, adopted by the European Union, where you have a regulation on what types of AI cannot exist, what types of AI are high risk, and what types of AI are low risk.
  2. A self-regulatory environment coupled with government enforcement when the self-regulatory frameworks fail.
  3. The sectoral approach, currently prevalent in the United States, with different government entities issuing rules and statements about AI in the sectors that they administer.

One strategy companies are using to deal with the “sectoral approach” is looking at which regime to follow. “What I’m starting to see is that it’s just easier for companies to comply with the strictest regime.”

In some ways, the EU has emerged as a global regulator of artificial intelligence, at least with the initial steps they’ve taken. And unless companies are willing to build regional AI tools that comply with the local regulations (which would be enormously costly and difficult to manage) it would mean that standards around [prohibited] AI, or judgment calls around what constitutes “high risk,” would be adopted globally.

—Jarno Vanto, King & Spalding LLP

Of course, U.S. companies may view these EU efforts as mandates on U.S. companies, given the fact that most of these AI tools are currently being developed in the United States.

How to approach artificial intelligence (AI) governance?

Our approach – even in the absence of regulation – is to start with your values and your principles. I think that’s the only way to design a governance program around AI, because so much of it has to be values led.

—Christina Montgomery, IBM

“The core of internal governance over AI regardless of the regulatory environment is the ethics and principles that govern development,” offers Matheson. What the law says matters of course, but certainly there are broader ethical considerations “like what do we want the company to stand for?”

These considerations are not solely tied “to how algorithms make use of data, but more broadly with use of data generally,” opines Vanto. “When developing these tools, the first thing companies should keep an eye out for is the purpose for which the tools will be used.”

Many companies have implemented data use-case ethics boards and similar bodies to contemplate this and will say no to the potential monetary gains if they view the use-case as unethical or inconsistent with their approach to use the personal information.

—Jarno Vanto, King & Spalding LLP

“With AI, it is the very same assessment,” continues Vanto. “Is the purpose consistent with the values of your company?”

Civil rights advocacy and artificial intelligence (AI)

You can’t just rely on civil society groups to do the work that companies should be doing themselves. Ideally you would want civil rights feedback to come from inside the company…so that when these products and services are presented to advocacy organizations, the thoughts and considerations of those communities have already been thought through.

—Bertram Lee, Future of Privacy Forum

“From a policy perspective,” continues Lee, “one thing that might be helpful to think about with respect to the civil and human rights community is that Civil Rights law has prohibited discrimination in a variety of contexts, and in a variety of ways for the better part of 60 years.

“That context is important because when I hear from companies that the law isn’t clear [the question then becomes] how are you compliant in spirit? What is your best effort…with respect to non-discrimination?”

There is coming regulation on privacy. No doubt there is going to be some form of algorithmic mandate or accountability from Colorado or California, maybe even Virginia…The algorithmic biases issue is on its way.

For everyone involved, it makes the most sense to think about how to test for nondiscrimination. How your data sets are discriminatory, and how you’re fighting against that actively. What are the ways in which this AI could be used that could be discriminatory?

—Bertram Lee, Future of Privacy Forum

“All should be asked and answered before even thinking about deployment, and there should be a clear reasoning behind it,” asserts Lee. “The recent settlement with HUD is an example, and folks are slowly waking up to their liabilities in this space.”

Managing artificial intelligence (AI) principles at scale

“How do we affect real transparency for a complex algorithmic system?” asks Matheson. “How do we regulate data quality for training data sets that have billions of data points – especially on the scale of IBM?”

“We have to make it easy for our clients,” says Montgomery. “That the tools we deploy are giving them the capabilities they need to be confident that the uses they’re putting AI to are fair, transparent, explainable, and nondiscriminatory.

The IBM Ethics Board and the Project Office that supports it is within my purview of responsibility. We designed our program to be very much top-down bottom-up because we wanted the tone from the top…helping set the risk for the company, instill its importance, and hold us accountable…But importantly, also ensure that multiple voices are heard. That we’re incorporating the voices of marginalized communities as well.”

Montgomery further notes that IBM has approximately 250,000 employees globally and has created a network of multidisciplinary “focal points” throughout the company that comprises both formal roles and an advocacy network of volunteers to support this effort. The result is a culture of trustworthiness.

The how is where it becomes really tricky. It’s one thing to have principles. It’s one thing to have a governance process which is really central to holding ourselves accountable and helping to operationalize those principles. But we have to tell people how.

—Christina Montgomery, IBM

IBM has a living document called Tech Ethics by Design explains Montgomery. It walks designers and developers through the questions they should be asking and through the tools they should be using to test for bias. And it gives them data sheets to document the data being used throughout every stage of the lifecycle.

But IBM doesn’t go it alone says Montgomery. IBM also collaborates with external organizations like the open-source community and is currently funding a lab at the University of Notre Dame.

Will regulation help with the issue of artificial intelligence (AI) bias?

“We often see – whether it’s a prescriptive regulation or voluntary self-regulatory framework, or even just a statement of principles – people get lost in the weeds of how to do the compliance,” says Matheson.

“I would love to see a co-regulatory approach,” says Montgomery, “and IBM has been calling for precision regulation of artificial intelligence to regulate the high-risk uses, not regulate the technology itself. We’re supportive of guidance, frameworks, and regulation in this space, but it’s important to have that regulation be informed by what businesses are seeing, balancing innovation and risk.”

“I agree,” says Vanto. “Actual business practices should be factored in so regulatory work doesn’t happen in a vacuum. But it’s interesting: if you look, for example, at the list of high-risk and prohibited systems, they’re value-based judgments.” This shouldn’t be set in stone, as there are use cases that we can’t even conceive of today, and “having that in regulation that takes years to change may be challenging.”

Instead of setting in stone what constitutes a high-risk activity now – the prescriptive approach – we should have certain criteria based on which certain systems or use cases should be considered high risk or prohibited altogether…because there might be others down the line very quickly as these things develop.

—Jarno Vanto, King & Spalding LLP

Self-regulating artificial intelligence (AI)

“While I agree that we don’t want to necessarily stifle innovation – there are ways in which these technologies could be used to benefit all of society – we have to understand that the data sets that underlie a lot of these systems are all based in discrimination,” contends Lee.

“If folks could self-regulate, we wouldn’t be having some of the same problems that we’re having right now, because there would have been someone who was in the room saying, let’s reevaluate.”

As an example of internal evaluation working, Matheson notes that IBM publicly announced its decision not to offer APIs for facial recognition software.

It comes back to the first point I made: underpinning our governance framework are the values and the principles that we align ourselves to, beginning with the principle that data should augment, not replace, human decision making. It should be fair, transparent, explicable, privacy-preserving, secure, and robust.

—Christina Montgomery, IBM

Montgomery notes that IBM had a number of proposals come forward during COVID-19 to, for example, use facial recognition for mask detection or fever detection for deployment in various airports or businesses. The concern was how guardrails could be put around the different technology types. IBM published the details of its decision-making process in the report “Responsible Data Use During a Global Pandemic.”

“Ultimately, facial recognition (at least at the time) presented concerns regarding accuracy, fairness, and the potential for it to be used for mass surveillance or racial profiling. That coupled with the questions around the technology itself led IBM to the decision [not to deploy] facial recognition.

“We wanted to be very clear that we weren’t making different decisions, just because we were faced with this exigent circumstance. We were still relying on our governance process and still adhering to our values and our principles,” declares Montgomery.

This last point cannot be stressed enough. To jettison principles and values, even in exigent circumstances (the rallying cry of a long line of malefactors), renders the very concept of values and principles nothing more than expediencies to be used or tossed as circumstances “require.” That is the very antithesis of “principles.”

Listen to the session audio

  • Company
  • Privacy Tech

WireWheel DSAR Connector for Drupal & WordPress

As new privacy laws continue to emerge in the United States, businesses are looking for increasingly straightforward ways to support consumers’ appetite for privacy. WireWheel is excited to announce the launch of our first set of WireWheel integrations for Drupal and WordPress. These connectors make it easier than ever to integrate data subject access request (DSAR) intake forms into your existing user privacy experience.


As DSAR requirements continue to be upheld under laws like CCPA, CPRA, and GDPR, privacy teams can anticipate similar requirements to emerge in newly passed legislation.

The open-source WireWheel connectors allow organizations to seamlessly integrate WireWheel’s Privacy Studio for DSAR intake directly into their existing WordPress or Drupal websites. Setting up and launching the WireWheel DSAR Connector for a WordPress or Drupal site is simple and fast.

DSAR Connector Benefits

  1. Plug & Play Integration
    Enables a simple and easy integration of WireWheel’s DSAR management solution (Trust Access and Consent Center) with any existing Drupal or WordPress website
  2. Fully Integrated User Experience
    Gives you full control over how DSAR intake forms are displayed to users and integrated with your existing website experience
  3. Flexibility & Control
    Provides full control and flexibility over the DSAR intake user experience on your website

Who can use the WireWheel Connector?

The WireWheel Connector can be used by anyone who has a Drupal or WordPress website and has purchased the WireWheel Trust Access & Consent product for data subject access requests (DSAR). The connector itself is free and open-source but requires a WireWheel subscription to utilize its functionality.

DSAR Connector Features

Once installed and configured onto a website, WireWheel’s DSAR Connector offers Drupal and WordPress users these key features:

  • Easy integration with WireWheel’s Trust Access and Consent Center for DSAR automation, processing, and management
  • Fully customizable intake forms to fit your needs (styling, placement, number of forms, supplemental content, etc)
  • Unlimited creation & placement of forms to handle complex flows, multi-form processes and regional intake requirements

Taking advantage of the power of WireWheel data subject access request management is now easier than ever for customers using Drupal or WordPress.
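To make the embedding model concrete, here is a rough client-side sketch of what a hosted-form connector like this can do: build the URL for a hosted intake form and drop it into a page as an iframe. The function names, host, and query parameters below are illustrative assumptions for this sketch, not WireWheel’s actual connector API.

```javascript
// Hypothetical sketch only: names, host, and parameters are illustrative
// assumptions, not WireWheel's actual connector API.

// Build the URL for a hosted DSAR intake form, adding optional region and
// locale parameters so regional intake requirements can route to the right form.
function buildDsarEmbedUrl(baseUrl, formId, options = {}) {
  const url = new URL(`${baseUrl}/forms/${encodeURIComponent(formId)}`);
  if (options.region) url.searchParams.set("region", options.region);
  if (options.locale) url.searchParams.set("locale", options.locale);
  return url.toString();
}

// On a WordPress or Drupal page, a connector plugin would typically insert an
// iframe like this into a container element chosen by the site owner, so the
// intake form appears inside the site's existing layout and styling.
function renderDsarForm(containerId, embedUrl) {
  const iframe = document.createElement("iframe");
  iframe.src = embedUrl;
  iframe.style.width = "100%";
  iframe.style.border = "none";
  iframe.title = "Data subject access request form";
  document.getElementById(containerId).appendChild(iframe);
}
```

Because the form itself is hosted, the site owner’s only decisions are where the container sits on the page and which regional form each page should load, which is what makes this style of integration “plug and play.”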

  • Regulations

Colorado Attorney General Phil Weiser on Data Privacy

As we all know, Colorado is among the states leading the country in thinking about consumer data and data privacy. The Colorado Privacy Act (CPA) is one of the leading laws, blazing a trail for how consumer data should be protected by the states here in the U.S. As such, Colorado is one of the three main states that virtually every company in the country is trying to think about.

To provide some insight on what to expect from Colorado, the Day-One Keynote of the 2022 Summer Spokes Privacy Technology Conference featured Colorado Attorney General Phil Weiser.

The following are excerpts from Mr. Weiser’s comments. They have been lightly edited and quotation marks omitted for ease of readability.

Who is Philip J. Weiser?

I have been a student, a practitioner, a teacher, and scholar on the regulation of emerging technologies… I’ve been involved as a federal official, as a state official, as someone who has worked on issues from the public policy side. My true north is how do we best serve and protect consumers in the midst of technological change. Privacy is obviously a core part of this effort… I now serve as Colorado Attorney General.

After I clerked for a couple of years, I went to work for Joel Klein, who was the head of the antitrust division at the U.S. Department of Justice (DOJ) at the time. This was at the dawn of the Internet age. It was 1996: the Telecommunications Act of 1996; efforts to allow commercialization of Internet technologies; the Microsoft case involving the browser wars; and the advent of broadband.

It is in this era that I was involved as a federal official and as a state official, and as someone who has worked on these issues from the public policy side. My true north is how do we best serve and protect consumers in the midst of this technological change, and privacy is obviously a core part of this effort.

In 2009 I rejoined the federal government after a decade in academia. I had worked for Joel Klein for a couple of years and then was in the telecommunications program at the University of Colorado, whose law school founded a center for law, technology, and entrepreneurship known as the Silicon Flatirons Center; worked as head of the Colorado Innovation Council; worked on Obama’s transition for the Federal Trade Commission (FTC); then went to work for the DOJ; then the White House to work on issues around technology, competition, and innovation; and afterwards, back to Colorado, where I served as the dean of the law school for five years.

I now serve as Colorado Attorney General.

How the rubber meets the road

Technology is a core part of what I focused on, and when I ran for Attorney General the consumer protection mission and the impact of changing technology was on my mind.

As soon as I got in, we had this question about Colorado passing its own data privacy law. I’ve referred to this as a ‘second best solution.’ In the best of all worlds Congress would adopt a federal privacy law. I had worked on an effort that many – Danny Weitzner most notably – helped champion in the Obama White House: A privacy Bill of Rights concept.

The tools we have now include data security laws in Colorado, which cover data breaches as well as requirements for companies to take reasonable precautions to make sure your data is protected, and now this data privacy law that we’re in the midst of implementing. We’re currently in a consultative period. We will have some public sessions coming up, and then we’re going to put out a formal rulemaking this fall.

How do we ensure the rubber hits the road in the right way and we’re actually protecting consumers?

Part of the core effort is to make sure consumers know what their rights are, have a sense of when they’re not being honored, and can let us know. How do we tell businesses what their obligations are, what they need to do, and create enough space for that compliance to happen?

This is a complicated puzzle…it’s going to take time.

We want to be thoughtful. We want to make sure that we’re focusing on what really matters. And we will make sure that we’re not being overly prescriptive in assuming there’s only one way to do something.

What can companies anticipate from the Colorado Privacy Act?

We need to allow a period for compliance and make sure we give businesses the information and tools to get into compliance and we’re not going to play games or gotcha…We’re really trying to get it right. We want to work with you.

Stage One: Over the course of the summer months, it is a more informal engagement process. If you are an actor in this ecosystem and you’re asking what you need to know or how to get engaged, we have set up a website with a comment forum to give you more visibility and a chance to be heard.

Stage Two: We will put out a call for comments with specifics for people to comment against. For example, a universal opt-out mechanism is something that’s going to get a lot of attention that we’re going to be wrestling with. Another one we want to hear from people about is the concept of ‘dark patterns.’

This is ‘the fall process.’ The law calls on us to complete a rulemaking by May of 2023 and our vision is to finish it well in advance of that.

Stage Three: There will be an implementation period. We know that compliance is not going to necessarily happen immediately. We need to allow a period for compliance and make sure we give businesses the information and tools needed to get into compliance.

I’m interested in making sure that the enforcement we do is directed toward those bad actors who are willfully non-complying. For the people who are really trying to get it right, we want to work with you. We’re not going to play games or gotcha.

Harmonizing the states’ privacy laws

I have a lot of thoughts on harmonizing with the other states that are thinking about this. The key concept is interoperability or compatibility. If our law is interoperable and compatible with California’s, and we’ve given people tools so they can readily comply with both, then we have succeeded.

If, by contrast, our law is incompatible with some other states’ laws…then we have made life impossible for companies, who can comply with one or the other, but not both. (An extreme example would be requiring a specific form of technology to implement certain requirements.)

It’s on us to make sure that we work with our fellow states. That we are thoughtful about how we enable sound compliance. I believe we can do that.

We need to be able to build trust and protect consumers, but also not stop the development of new products that can benefit consumers and that consumers want.

I do have a general awareness that we as enforcers need to be careful about being overly prescriptive. It’s not that I would be averse to ever seeing a need for a technological standard, but even many technological standards will leave implementation choices so that you’re not endorsing specific technologies. There’s a lot of work to be done in this area, and we’d love people’s feedback on it.

A federal data privacy framework is needed

What would be best is if the ideas that are getting generated through this process – the experimentation at the state level – find their way into national legislation.

There has been a cost to the lack of federal leadership in data privacy. The U.S. Government, which developed fair information privacy practices in the 1970s, was the leader in data privacy. That leadership has been ceded over the last 20 years…and now we are part of an increasingly small list of countries that have not developed their own data privacy frameworks.

We need a federal privacy framework, and it is important that we do things based on rigor, based on careful analysis, and not be overly prescriptive from the standpoint of preventing technological development and innovation.

This brings me back to my point about the ‘second best.’ The second best we have in the U.S. is the states. But because Colorado, California, and others will have an alternative to GDPR, it can enable dialogue and learning.

What would be best is if the ideas that are getting generated through this process…find their way into national legislation.

Avoid dark patterns and use privacy laws to build trust with customers

Every company knows that one of its core value propositions is trust: do customers trust you? Do your business partners trust you? Do your regulators trust you? When you engage in behavior that is unworthy of trust, that can do great damage to your brand. Think really carefully about how you approach this issue.

First piece of advice: you’re hearing more about design thinking and user-centered design. Companies that ask ‘how does this look to the customer?’ will avoid behavior that is going to get them in trouble.

In the dark-patterns conversation, the basic point is if companies are really trying to give users awareness about their data, give them visibility on what data they have, and help them make informed choices, they are going to be more readily able to comply.

Whereas the companies that ask the opposite question – that want to trick their users, to use data in ways users don’t really understand and hope they don’t notice – are playing a dangerous game. And it’s not only a dangerous game vis-à-vis compliance and enforcement consequences. It can do great damage to your brand.

My second piece of advice: Constant vigilance. When collecting, storing, and managing data, we’re vulnerable to all sorts of risks. No company should comfort themselves with check-box compliance. You need to develop ways in which you’re constantly vigilant and giving customers awareness because there’s a lot of room for error and mistakes.

Looking to learn more about the Colorado Privacy Act?
Contact WireWheel today and let us help you through your compliance journey.

Listen to the session audio

  • Privacy
  • Regulations

Privacy Operations in Practice: Practical Tips

I’m trying to be as proactive as possible rather than reactive. A cross-functional approach makes it possible for a small team to be proactive. For example, our dedicated customer experience team is a key stakeholder in our privacy operations. They’re the frontline of defense. The ones who are dealing with customers who may be raising privacy issues.

—Kelly Peterson Miranda, Grindr

Whether you’re a small company or a large one – well into your privacy journey or just setting out – establishing privacy in practice presents complex and dynamic technical and cultural challenges. Those challenges grow more demanding as additional state regulations that significantly impact consent, advertising, and notice rapidly approach.

To offer practical guidance, Grindr’s director of global business and regulatory affairs, Kelly Peterson Miranda, and Melanie Ensign, CEO & Founder of Discernible, sat down with WireWheel’s senior engagement manager, Sheridan Clemens, at this year’s Summer Spokes Conference to discuss Privacy Operations in Practice.

A cross-functional approach to privacy operations

“Grindr has a dedicated privacy team led by our Chief Privacy Officer (CPO),” says Miranda, who is a stakeholder in legal and an advisor to the privacy team as she interacts with regulators. “But I am also focused on creating a proactive strategy for handling upcoming compliance obligations.” Grindr is a small company of about 160 people, so privacy is decidedly a cross-functional effort.

Using a cross-functional approach to implement privacy in practice makes it possible for a small team to be proactive. “For example, we have a dedicated customer experience team,” she says. “They’re the frontline of defense. The ones who are dealing with customers who may be raising privacy issues.”

We’re seeing even those companies that have dedicated privacy engineers putting a lot more resources into teaching and evangelizing so that everybody becomes a privacy engineer in some regard. Even if it’s not in your title you are working on a privacy project. You’re thinking about privacy.

—Melanie Ensign, Discernible

“What we’ve seen at Discernible is that we have traditional big tech clients with very large privacy engineering teams, and clients doing things similar to what Grindr is doing, such as teaching privacy to software engineers, SREs, infrastructure engineers, and the other technical folks that own and operate all of the systems on which we need to apply and deploy privacy controls.”

That said, “we’re seeing even those companies that have dedicated privacy engineers putting a lot more resources into teaching and evangelizing so that everybody becomes a privacy engineer in some regard. Even if it’s not in your title, you are working on a privacy project. You are thinking about privacy.”

Managing the tension between privacy engineering and privacy operations

We need to be realistic. That tension will probably always exist. There’s only so much bandwidth. You have the core products or services that you’re trying to deliver, and then you have the compliance obligations.

Oftentimes it comes down to people who sit outside of the legal, privacy, or compliance function who need to know: ‘What do we have to do? What are the black letter law requirements?’

—Kelly Peterson Miranda, Grindr

“The message we need to get across internally – and this is a long game – is that we are at a point right now (especially domestically, for those in the U.S.) where solely focusing on black letter law compliance obligations is only going to put you in debt for the long term: you’re always going to be playing catch-up and you’re always going to be playing a high-risk game.”

“You have to meet the engineers where they’re at and explain, yes, the law says X and my advice to you is that we need to do X+Y to enhance it because we are essentially future-proofing,” urges Miranda. “It will pay dividends in the long run. And we, as compliance professionals, have to do a better job of storytelling about the why behind the work we’re doing rather than simply stating, ‘it’s the law.’”

Organizational benefits of privacy operations

There is no company in the world that’s going to get credit for just operating at the legal minimum. You do not build a reputation and you do not build benefit of the doubt by constantly hitting the bare minimum.

—Melanie Ensign, Discernible

“And privacy is not solely a legal decision,” continues Ensign. “There’s other types of risk that are involved, and sometimes, those other types of risk may be more compelling to the business than the legal risk.”

“In a communications role, I’m spending all day worrying about reputation and public perception. Prior to founding Discernible, I was leading security, privacy, and engineering communications at Uber, where we viewed legal requirements as the floor, not the ceiling.”

“You need to bring your cross-functional partners together to talk about the different types of risks that exist and what future proofing looks like for the organization,” suggests Ensign. “Then you can go to the business and say, ‘Here is the legal risk. Here’s the reputation risk. The financial, market, and competitive risks.'”

“But nobody wants to be at a disadvantage, and it seems everybody’s waiting around, not advancing privacy, because they worry that if they’re not exploiting people on the marketing side, their competitors still are,” observes Ensign.

Approaching risk cross-functionally, you can go to leadership “with a 360-degree view of the risk and present recommendations to protect the business for the long term. And Kelly’s a hundred percent right. It’s about the long game and not giving the business whiplash every six months when a new privacy law comes into effect.” And as Miranda points out, “reputational risk can speak loudest to the business and especially to the C-Suite.”

“Everyone in the world sends a message to their customers saying, ‘we take privacy seriously,’” continues Miranda, “so you need to double-check with your comms and marketing teams about what statements were made in the past, and whether the move we’re getting ready to make is antithetical to that. If it is, your competitors will call you out on it.”

“The question is not just are we breaking any laws, but also, are we breaking customer trust?”

“The other function that I recommend that folks check in with is your sales teams,” suggests Ensign, “which also provides an opportunity to communicate how privacy investments are directly impacting the bottom line.”

Getting budget and scaling up

With CPRA going into effect in mere months, and Colorado and Virginia following, privacy teams will need both budget and scale to cope. Achieving this, particularly in resource-constrained environments, can be difficult. So what is the approach?

Teach existing functions how to improve their own workflows rather than trying to build a separate privacy silo that’s not part of anybody else’s existing performance reviews or performance ladder.

—Melanie Ensign, Discernible

“If you’re on a small team, don’t try to create everything from scratch,” continues Ensign. “In my experience, investment in time and energy is better spent building relationships.” Get support from engineering, marketing, operations, et al. “You’ll get more bang for your buck. When you’re small, you can’t do it by yourself. And when you’re big, you’re just wasting resources.”

Technology is a critical factor as well. “Less reliance on manual processes, and more reliance on automation, that’s either built in house or through a third party is key,” stresses Miranda. “If you’re manually fulfilling your compliance obligations, that’s good intentions, not best practice. Process and tools are the way to scale your privacy operations. Understand the tipping point of when a centralized privacy function may not work anymore.”

Measuring privacy compliance success

We’re seeing a push towards transparency overall: how compliance obligations help support the bottom line, and the view of privacy as an asset to, rather than a deficit to, doing business.

—Kelly Peterson Miranda, Grindr

There are numerous metrics related to meeting legal requirements: the number of DSARS or number of deletion requests being processed. The timeliness of the response. How many high-risk processing activities are undertaken and for which you’ve put in sufficient controls. Number of DPAs. How many privacy incidents and their time to resolution.

Beyond that, there are a myriad of additional and equally important metrics, notes Miranda, such as how much, when, and what kind of training you are offering; the number and value of deals privacy helps to close; DSO rates; and whether there is a positive impact on reputation.
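As a simple illustration of the timeliness metric the panel describes, the sketch below computes DSAR counts and the on-time completion rate. The 45-day window reflects the CCPA/CPRA response deadline; the data structure and field layout are hypothetical, not drawn from any real program.

```python
# Hypothetical sketch: DSAR volume and timeliness metrics.
# The 45-day deadline mirrors the CCPA/CPRA response window.
from datetime import date

def dsar_metrics(requests, deadline_days=45):
    """requests: list of (received, completed) date pairs;
    completed is None for still-open requests.
    Returns total count, completed count, and share completed on time."""
    total = len(requests)
    completed = [(r, c) for r, c in requests if c is not None]
    on_time = sum(1 for r, c in completed if (c - r).days <= deadline_days)
    return {
        "total": total,
        "completed": len(completed),
        "on_time_rate": (on_time / len(completed)) if completed else 0.0,
    }

reqs = [
    (date(2022, 1, 3), date(2022, 1, 30)),   # completed in 27 days: on time
    (date(2022, 1, 10), date(2022, 3, 15)),  # completed in 64 days: late
    (date(2022, 2, 1), None),                # still open
]
print(dsar_metrics(reqs))  # {'total': 3, 'completed': 2, 'on_time_rate': 0.5}
```

The same shape of calculation extends naturally to deletion requests or incident time-to-resolution.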

Metrics are going to mean different things to different companies, depending on the context. DSAR spikes are a negative thing if you are going through a crisis [like a cyber breach]. But being able to respond quickly [and effectively] is a win you may want to communicate publicly.

—Kelly Peterson Miranda, Grindr

Importantly, “when you’re dealing with consent, you need to look at it as a measure of the trustworthiness of your company,” says Miranda. “I think you also need to look at, for example, the consent rates for different things, or perhaps help articles related to privacy: how many views are you getting?”

“Tell a total story: the wide breadth of things that a privacy function can actually do.” Some of it will be shared with the C-suite, some company-wide. “But also look for stories to tell publicly with metrics that help engender trust.”

“We’re seeing a shift to where customers are demanding more information about privacy and security as they evaluate whether they’re going to do business with an entity,” says Miranda. “This means marketing and sales need to know more so they have the privacy fluency needed to close the deal.”

In short, privacy needs to be a compelling story internally (told in the context of what is important to your partners in sales, marketing, engineering, government affairs, the C-suite) and externally to customers and regulators.

Compelling stories need good storytellers.

Listen to the session audio

  • Privacy
  • Regulations

Consent and Advertising in 2023

Looking back, the General Data Protection Regulation (GDPR) really set the bar for notice, choice, and consent, and, unbeknownst to us at the time, it gave us a look into the future of how privacy legislation would evolve. Years later, California provided the first interpretation of what privacy law was going to look like in the United States at the state level. Evolving from the California Consumer Privacy Act (CCPA) to the California Privacy Rights Act (CPRA), four states have now followed: Virginia, Colorado, Connecticut, and Utah, with several others in discussion.

Today, organizations are mobilizing to devise consent management and notice strategies – the common themes across all the legislation – across multiple channels, brands, and devices from phones to smart TVs and connected appliances.

Joining WireWheel CPO Rick Buck at the SPOKES Privacy Technology Conference (held June 22-23) to discuss Consent and Advertising in 2023 are Jennifer Harkins Garone, Sr. Director, Privacy at Carnival Corp.; IAB and IAB Tech Lab EVP and General Counsel, Michael Hahn; and Gary Kibel, a Davis + Gilbert LLP Partner.

This seasoned group of privacy experts has seen the concepts of notice, choice, and consent go from non-existent to front-and-center issues.

A lot to unpack

With all these new laws, all the different state laws, and the lack of a federal law in the U.S., it is very challenging because definitions in the laws do not line up and obligations in the laws do not line up. This leads to the big question: what sort of solutions should you implement?

—Gary Kibel, Davis + Gilbert LLP

Table: opt-in vs. opt-out consent requirements under the 2023 state privacy laws

“Do you implement a state-by-state solution, or a one-size-fits-all solution based on the strictest standards brought together from multiple jurisdictions?” asks Kibel. “You can’t simply say I’m going to follow the one strictest state, because there are unique differences.”

“Some of the laws have definitions of sensitive personal information (PI) which do not line up exactly,” notes Kibel. “Most impactful is that Virginia, Colorado, and Connecticut require an opt-in to process ‘sensitive personal information,’ while California’s CPRA and Utah require an opt-out.”

Importantly, most of the sensitive personal information definitions include precise geolocation. While Apple now requires apps to ask permission to collect your location information, “this is not a common practice on the web, where IP addresses and precise geolocation information are collected without an express opt-in. This is one of the unique things that’s going to change in 2023,” advises Kibel.

This requires some big decisions: as opt-in is not a requirement in every state, will the prompt appear for visitors from Virginia, Colorado, and Connecticut only, or will it be shown to everyone regardless of location?
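The strategic choice Kibel describes – state-by-state versus one-size-fits-all – can be modeled as a simple lookup. This is only an illustrative sketch, not legal advice: the rule table paraphrases the article (Virginia, Colorado, and Connecticut opt-in; California and Utah opt-out), and the function names are ours.

```python
# Hypothetical sketch of the sensitive-PI gating decision per state.
# Rules paraphrase the article; this is illustration, not legal advice.
SENSITIVE_PI_RULES = {
    "VA": "opt-in",
    "CO": "opt-in",
    "CT": "opt-in",
    "CA": "opt-out",
    "UT": "opt-out",
}

def needs_optin_prompt(state: str, strictest_everywhere: bool = False) -> bool:
    """Return True if an opt-in prompt should be shown before processing
    sensitive personal information for a visitor from `state`.

    strictest_everywhere=True models the one-size-fits-all strategy:
    apply the strictest rule (opt-in) to every visitor regardless of state.
    """
    if strictest_everywhere:
        return True
    return SENSITIVE_PI_RULES.get(state) == "opt-in"

print(needs_optin_prompt("VA"))                            # True
print(needs_optin_prompt("CA"))                            # False
print(needs_optin_prompt("CA", strictest_everywhere=True)) # True
```

Either branch is defensible; the point of the sketch is that the choice must be made explicitly and encoded somewhere, because as Kibel notes, the definitions themselves differ too much to simply "follow the one strictest state."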

The weeds of the law

Consider just a small sampling of consumer rights across the states, and the complexity of implementing consent – in terms of both policy and technical implementation – becomes clear:

  • The privacy laws have opt-out rights for targeted advertising, but the states define them differently (California calls it cross-context behavioral advertising).
  • The California Privacy Rights Act (CPRA) extends the California Consumer Privacy Act’s (CCPA) right to opt-out of sale to include sharing and limiting the use and disclosure of sensitive PI. “The CPRA requires there to be a new link which has the words ‘limit the use of my sensitive personal information,’” notes Kibel.
  • Other laws take a similar approach with a right to opt out of targeted advertising, sale, and profiling, with some requiring opt-in for use of sensitive personal information.

Once we dig into the weeds of the law, there are other disconnects, even between the same concepts. The CPRA treats as a sale any transfer of data to a third party for monetary or other valuable consideration, while in Virginia it is only ‘monetary consideration,’ which again necessitates deciding how to apply this right differently to consumers in different locations.

—Gary Kibel, Davis + Gilbert LLP

It is a lot to unpack. To help navigate the ever-changing privacy law landscape, WireWheel has created a Privacy Law Comparison Matrix.

The network-based approach to consent

“Do you need to obtain consent on the publisher or advertiser’s page? Or can you obtain consent on a single page in a broader network that bands together?” asks IAB’s Michael Hahn.

To look at an issue like this, we have to start with ‘what’s the basic standard?’ And while there are undoubtedly nuances in what it means to consent, for those thinking about this in a multi-state approach, you’re going to end up with what is typically the most rigorous version of consent to apply it everywhere.

—Michael Hahn, IAB

That most “rigorous version of consent” is the GDPR version, says Hahn, and one which also appears in some of the state laws. The CPRA adds that “a business must adhere to the following principles when designing its consent method, and any method that fails to meet these requirements may be considered a dark pattern and does not constitute valid consent” (Cal. Proposed Regs. § 7004(b)). The principles include:

  • Easy to understand
  • Symmetry in choice
  • Avoid language or interactive elements that are confusing, and
  • Avoid manipulative language or choice architectures

Infographic: defining consent in 2023

Does a network-based approach to consent work?

“When referring to a network-based opt-in approach”, says Hahn, “what we’re really talking about is providing consent to a large number of, let’s say publishers and ad tech companies to undertake cross-site tracking. In other words, you are being asked to opt-in to cross-site tracking for the network participants. However, when tested against the CPRA consent standard this network-based approach falls short.”

It’s tough to imagine making a strong argument to a regulator that when I go to publisher number one as a consumer, that I was sufficiently informed about what could be a large number of other publishers in the network…that providing this bulk consent is for a narrowly defined particular purpose.

—Michael Hahn, IAB

And indeed, the CPRA states that any opt-in link applies only to the business with which the consumer intends to interact (Cal. Civ. Code § 1798.185(a)(20)).

While the concept of multiple independent or joined controllers exists in state law, “generally speaking, state laws encumber the entity with whom the consumer has a direct relationship with a broad set of direct responsibilities and distinguish them from third parties: a concept that does not exist in Europe,” notes Hahn.

Hahn also notes that the draft regulations have an entirely new concept: “The business needs to either a) disclose the third parties to whom they have sold personal information and such third parties control the collection of the information, or b) provide information about their business practices.”

“I don’t know what the second half of that means,” confesses Hahn, “but whatever it does mean it suggests to me that it’s impossible to fulfill that requirement” in a network-based approach.

Operationalizing privacy law requirements

When asked if the ‘do not share’ prohibition under CPRA is tantamount to the right to opt out of behavioral targeted advertising, two or three years ago most of us would have said no, they’re different. But the state AG has started to say they’re the same, because with a lot of the behavioral and targeted advertising done through cookies (such as with Facebook), somebody is exchanging money in order to get that data.

—Jennifer Harkins Garone, Carnival Corp.

The question becomes “If my company website has a Facebook cookie who is then selling that information, how does somebody who is running a program handle it?”

For Garone, the answer is “you have to apply the ePrivacy Directive to everybody. Or apply California to everybody. And it is going to be available on January 1, 2023, as opposed to 2025, because if you iterate, it costs a lot of money. Heck, if you don’t iterate it can cost a lot of money, so it is a very challenging decision.”

You have to look to technology. “In one part of our business we have a homegrown tool and we started to find it costs too much in money and lead-time to make the necessary changes with the constant parade of new laws. So, we’re looking at what the right technology stack is for us to manage it,” says Garone.

If the cookie banner is your opt-out vehicle, how do you make that work together with do not share? How do you bring do not share into your cookie and tags? It’s taking us a while to figure out because of all the nuances.

One of the ways we can make it easier on ourselves is getting the right technology.

There are still a lot of questions around the new privacy laws

You have senior executives who have goals to meet. They want to do targeted advertising. You need to have conversations with them and there are a lot of questions to ask, says Garone:

Who are you sharing data with? Under what definition? What is the agreement that you have if somebody says, I don’t want to process sensitive information? What are the contracts with third parties telling you? Are you making so much on targeted advertising that it is worth a potential fine? Can you cure in 30 days if necessary and what would be your plan to cure? Do you even have an internal process to get those regulatory letters to the right people?

In the end, operationalizing privacy, whether it is multi-state, single state, or globally, comes down to the basics. Questions need to be answered, competing requirements resolved, and decisions made.

“I was looking at an agreement with a third party,” relates Garone, “that we’re buying information from for prospecting. Going through the list of data elements…I found one was latitude and longitude. Why are we asking for precise geolocation? What are you going to do with that that you are not already getting? You’re going to have to protect it like you protect a credit card number. So it was struck.”

There are a lot of new common-denominator obligations:

Notice at point of collection. Updating privacy notices. Third-party due diligence for those with whom you share data. Contracting those third parties so they uphold your data the same way you’re obligated to. The ability to provide and honor rights – not only effectuate those rights and collect consent – but to pass those signals throughout the ecosystem.

All that is really hard to do.

—Rick Buck, WireWheel

Listen to the session audio

  • Marketing
  • Privacy

Preference Management and Customer Experience

For any organization operating in the current data privacy climate, managing consumer consent is just one piece of the puzzle. In order to maximize transparency and build consumer trust, it is critical to consider how to manage and adhere to the preferences that consumers have communicated.

Companies impacted by the emergence of recent data privacy laws understand that a proactive approach to transparency can be beneficial, since building consumer relationships on trust drives customer loyalty and impacts buying decisions. Additionally, companies today are incorporating personalization into their marketing tactics to maximize consumer engagement and support sales. Teams that are unable to embrace preference management in their customer experience strategy risk losing market share to competitors that do.

Introducing Customer Experience

What is customer experience?

Customer experience is how customers view their direct and indirect interactions with companies. Direct interactions include anything customers actively do to purchase and use a product or service. Indirect interactions are unintentional encounters with companies. For instance, seeing an ad for a product before playing a YouTube video is an indirect interaction.

Everything companies offer – advertising, products, customer service, and more – contributes to the customer experience.

Why is customer experience important?

Organizations care about customer experience because it can be the “X factor” that drives tremendous revenue and growth potential. A recent Zippia survey found that companies that provide a superior customer experience can find their revenue increase by up to 15%.

Top Preference Management Factors that Influence Customer Experience


It is critical to create streamlined preference management experiences that allow consumers to quickly and easily find and manage their preferences. Complex user experiences can result in user frustration and abandonment. Over time, negative user experiences can turn into negative brand sentiment. Eliminating high-friction scenarios can improve the overall user experience.


Continued advancements in technology have allowed consumers to grow accustomed to having options. One of those options includes communication preferences. It is important to understand where, when, and how often your consumers want to hear from your brand.

If a consumer indicates that they prefer communicating through text message but your brand continues sending unwanted emails, your organization is wasting resources and alienating potential or existing customers. When consumer preferences are effectively managed and respected, organizations have the ability to maximize engagement with their audience. In turn, satisfied consumers may be more likely to buy.

Response Time & Consistency

Users expect to see digital updates reflected immediately. In order to ensure that communication preference updates can be made in real-time, a system must be in place to automatically store, update, and delete data. When a consumer changes their preferences, a preference management system should be able to accurately reflect those updates across the entire organization’s preference management system. Inconsistencies and lag time can lead to negative user experiences.
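As a minimal sketch of the pattern described above (all class and function names are ours, purely illustrative), a single preference store can push each update to every downstream system the moment it changes, rather than relying on a nightly batch sync:

```python
# Hypothetical sketch: one preference store notifies downstream systems
# immediately on every update, keeping them consistent in real time.

class PreferenceStore:
    def __init__(self):
        self._prefs = {}        # (user_id, channel) -> allowed (bool)
        self._subscribers = []  # downstream sync callbacks

    def subscribe(self, callback):
        """Register a downstream system to receive preference updates."""
        self._subscribers.append(callback)

    def set_preference(self, user_id, channel, allowed):
        self._prefs[(user_id, channel)] = allowed
        # Push the change to every downstream system immediately.
        for notify in self._subscribers:
            notify(user_id, channel, allowed)

    def get_preference(self, user_id, channel, default=False):
        return self._prefs.get((user_id, channel), default)

# Example downstream system: an email platform mirroring the store.
email_suppression = set()

def sync_email(user_id, channel, allowed):
    if channel == "email":
        if allowed:
            email_suppression.discard(user_id)
        else:
            email_suppression.add(user_id)

store = PreferenceStore()
store.subscribe(sync_email)
store.set_preference("u1", "email", False)  # u1 opts out of email
print("u1" in email_suppression)            # True
```

In a production system the callbacks would typically be message-queue consumers or API calls rather than in-process functions, but the design point is the same: one source of truth, with changes propagated outward rather than copied on a schedule.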


Preference management is something that all brands should seriously consider if they want to increase positive engagement with consumers. Managing consumer consents and preferences is difficult, but it doesn’t have to be as daunting a task as it may seem. To tackle this challenge, many organizations use a consent and preference management platform to help with both consent management and preference management.

Looking for a solution to help manage consumer consent and preferences? Schedule a demo to learn more about WireWheel’s Trust Access and Consent Center.

  • Privacy

Multi-State Legislation Operational Readiness

I lean to the side of operations because the relationship between the privacy office and the business is critical. We look at these laws and we want to ensure that we can operationalize them for the business.

—Lisa Barksdale, Zillow

Beginning January 2023, three comprehensive state laws go into effect: California (expanding on previous privacy regulation), Colorado, and Virginia. And right behind them, Connecticut and Utah.

The plenary session of the SPOKES Privacy Technology Conference (held June 22-23) – Multi-State Legislation: An Operational Readiness Discussion – brings together expert privacy practitioners who are experienced in crossing the legal, business, and technology divides necessary to translate what the law says into privacy programs that work in practice.

Lisa Barksdale, Director of Privacy at Zillow, Tara Jones, Yahoo! Legal Services Senior Manager, Global Privacy, and Katie Pimental, AGC, Global Privacy, Yahoo! joined WireWheel Founder and CEO Justin Antonipillai for this widely requested discussion.

The Challenge of Multichannel Consent 

It is necessary to think about a central source of truth and the ability to update downstream to your critical systems to understand what [consent] status is at any given time so that you can market in an ethical and legal way. A complicated set of challenges.

—Justin Antonipillai, WireWheel

The three major laws and most of the follow-on laws coming have two critical components that are the highest priority: 1) All of the new legal choices that brands and publishers have to make available to consumers across every channel and 2) the state law requirements concerning privacy risk assessments and filing requirements.

While the initial focus of privacy regulation emanating from Europe was cookie consent (resulting in the proliferation of website banners), in the years that followed, California required companies that sold data to provide a clear opt-out choice. And technically, you couldn’t finish that job just in the browser. The opt-out signal must update your databases.

Now in California, Colorado, and Virginia, you have a whole slew of opt-out choices including, for example, targeted advertising and, under Colorado law, profile creation. Adding to the complexity is the proliferation of IoT advertising channels such as smart TVs and connected cars, further necessitating a “central source of truth” to understand consumer consent status at any given time across all channels.
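A hedged illustration of what such a central record might look like (the purpose names and API below are ours, not from the panel): each consumer's opt-out choices are captured once, and every channel checks the same record before using the data.

```python
# Hypothetical sketch: a central consent record shared by all channels.
# Purpose names are illustrative examples of state-law opt-out rights.

OPT_OUT_PURPOSES = {"sale", "targeted_advertising", "profiling"}

class ConsentRecord:
    def __init__(self):
        self._opt_outs = {}  # user_id -> set of opted-out purposes

    def record_opt_out(self, user_id, purpose):
        """Record an opt-out signal, whatever channel it arrived from."""
        if purpose not in OPT_OUT_PURPOSES:
            raise ValueError(f"unknown purpose: {purpose}")
        self._opt_outs.setdefault(user_id, set()).add(purpose)

    def may_process(self, user_id, purpose):
        """Processing is allowed unless the consumer opted out of it."""
        return purpose not in self._opt_outs.get(user_id, set())

consents = ConsentRecord()
# Opt-out signal arrives from, say, a smart TV app...
consents.record_opt_out("u42", "targeted_advertising")
# ...and the web channel later consults the same record:
print(consents.may_process("u42", "targeted_advertising"))  # False
print(consents.may_process("u42", "sale"))                  # True
```

The essential property is that the browser banner, the mobile app, and the connected device all read and write one record, so an opt-out made in any channel is honored in every other.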

Privacy assessments, while not new for many, now include California’s requirement to actually file assessments with the regulator on a regular basis. And recently proposed California regulations include assessing third parties with whom you share data.

Ultimately, it comes down to the choices and legal frameworks that have to be put in place, and how you think about the critical assessments and privacy systems you need to have.

Solving the Challenge of Multi-Channel Consent 

Leveraging GDPR Experience

You’re going to have to understand what information you are collecting, processing, and storing (and where and how you are storing it) to ensure your mechanisms for consent, notice, and transparency actually reflect what your systems are doing.

—Katie Pimental, Yahoo!

“Yahoo is very fortunate in that a lot of the framework processes and technologies that we were required to build out for GDPR, we are now able to leverage for what we’re seeing come down in CPRA and other States when it comes to consent,” says Pimental.

She recommends looking at what your organization has already done. “Odds are you already have either a third-party or homegrown consent framework and mechanisms within your website for GDPR consent.”

“One thing to keep in mind,” cautions Pimental, “is the notion of sensitive personal information and its nuances within each of the States. You’re going to have to…understand what information are you collecting, processing, and storing – and where and how you are storing it – to ensure that your mechanisms for consent, notice, and transparency actually reflect what your systems are doing. There’s a lot of third parties out there that can assist with that.”

The U.S. has traditionally been an opt-out regime, where the default is that you are opted in. Now, the CPRA and the regs are bringing the opt-in concept to the U.S. for the first time in areas like online behavioral advertising.

—Katie Pimental, Yahoo!

“We didn’t have the legal or statutory obligations to provide these types of options to users,” notes Pimental. “It’s interesting because the technology hasn’t quite caught up with what the statutes are requiring today,” which has resulted in some of these statutes’ start dates and requirements being pushed back. “We definitely need to keep our ears to the ground in terms of how the laws are coming online,” she advises.

Managing Privacy Policy

While it is painstaking, the most efficient way to look at what needs updating and what changes to the privacy policy need to be made is to literally go line-by-line and State-by-State: [so it is] absolutely clear and transparent and the consumer can understand how and what we’re doing with their data.

—Tara Jones, Yahoo!

Jones notes that “unlike other pieces of the various regulations coming out, privacy policy notification and transparency is not one size fits all. You can’t use ‘the most common denominator.’”

This makes for a significant management challenge. And while it is painstaking, the most efficient way is to go line-by-line and state-by-state, she offers.

“And then there is a completely separate operations team that manages the updates and sends all of it out for translation,” explains Jones. “It is painstaking, but this ensures that it is absolutely clear and transparent. This is not just a one- or two-person job; the whole team is involved in what is a multi-level process.”

Managing Consent

We need to just start looking at where we can be at the top of the funnel from a consent perspective – how many clicks does that represent? At what points do we need to add additional consents? And not just plug the holes.

—Lisa Barksdale, Zillow

“It’s challenging because there are so many different pathways for the user experience,” continues Barksdale. “You always want to think about the impact to the user…about how we achieve a more centralized way of establishing preferences and consents, while avoiding what could be perceived as dark patterns.”

She notes that the consumer is smarter today and “having consent choices at every point of data capture is becoming a nuisance to them. To counter this, we need to start looking at where we can be at the top of the funnel from a consent perspective. How many clicks does that represent, and at what points do we need to continue to add consent capture?”

“If we just look to plug the holes, it’s not going to be good for the consumer and it’s definitely not going to be good for adoption rates – particularly as additional regulations come along.”

From opt-out to opt-in? 

What really caught my eye, and I’m not the only one, is that draft regulations in California have language about reasonable and proportional use of data, consistent with the perspective of the consumer. If it isn’t, the proposed language suggests that in those areas you might have to enable somebody to opt-in instead of opting out.

—Justin Antonipillai, WireWheel

As consumers are indeed much savvier, they now have “expected uses” of their data. “The question then becomes, what’s expected?” opines Pimental. “Is a free website expected to have advertising? Is advertising expected to result in the sale and sharing of your data?

We are really on the cusp of a fundamental shift in how we have to notify consumers to get consent.”

Interestingly, Pimental proposes that when considering what is reasonable and proportional, the GDPR provides a solid framework emanating from legitimate interest or contractual requirements. This, she suggests, may be a helpful baseline for what may be “potentially reasonable and proportional within the new state laws.”

Ultimately, the evolution of consent and privacy assessment requirements is a complex set of legal and technological challenges. And as Barksdale suggests, “traditional concepts like ‘know your customer’ (KYC) are valuable. The more you know your customers, the easier decisions concerning navigation of consent will become.” And as Pimental notes, with regard to the technology implementations, there are many third parties available to help.

Key Takeaways

  1. Think about a central source of truth (your databases and systems) to capture and understand what the consent status is at any given time.
  2. Leverage what you may have already done under GDPR and compare that to what information and obligations are required under the state laws and ensure you are capturing that consent from an operations perspective.
  3. With privacy policies, there is no “most common denominator.” It’s painstaking to update privacy policies line-by-line for each state, but necessary if you want to ensure clarity and transparency for your consumer.
  4. Consider a top-of-the-funnel perspective for consent. How many clicks does that represent? At what points does consent capture need to be added? Don’t just look to plug the holes.
  5. Build a program centered on know-your-customer (KYC), and consent navigation will become easier.
  6. For those doing business in the EU, consider the GDPR framework around legitimate interest and contractual requirements as an internal measure when baselining what is reasonable and proportional.
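To make the first takeaway concrete, here is a minimal sketch of what a central source of truth for consent status might look like: an append-only ledger where the latest recorded event per user and purpose determines the current status. All names (`ConsentLedger`, the purpose strings, the user IDs) are hypothetical illustrations, not a reference to any particular vendor's API; a production system would also need persistence, audit trails, and per-jurisdiction purpose definitions.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ConsentEvent:
    user_id: str
    purpose: str          # e.g. "marketing_email", "ad_targeting" (illustrative)
    granted: bool
    recorded_at: datetime

class ConsentLedger:
    """Append-only record of consent events; the latest event per
    (user, purpose) pair determines the current consent status."""

    def __init__(self) -> None:
        self._events: list[ConsentEvent] = []

    def record(self, user_id: str, purpose: str, granted: bool) -> None:
        # Never overwrite history: append a new event with a UTC timestamp.
        self._events.append(
            ConsentEvent(user_id, purpose, granted, datetime.now(timezone.utc))
        )

    def status(self, user_id: str, purpose: str) -> Optional[bool]:
        # Walk events newest-first; None means consent was never captured.
        for event in reversed(self._events):
            if event.user_id == user_id and event.purpose == purpose:
                return event.granted
        return None

ledger = ConsentLedger()
ledger.record("u123", "marketing_email", True)
ledger.record("u123", "marketing_email", False)  # user later opts out
print(ledger.status("u123", "marketing_email"))  # latest event wins: False
print(ledger.status("u123", "ad_targeting"))     # never captured: None
```

Keeping the full event history, rather than a single mutable flag, is what lets you answer “what was this user’s consent status at any given time” – the audit question the panelists describe.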

Listen to the session audio