
  • Privacy Law Update

Privacy Law Update: September 12, 2022

Stay up to date with this weekly release covering key developments on data privacy laws, technology, and other hot privacy topics!


Newsworthy Updates

FTC Privacy Rulemaking Forum Brings Industry, Advocate Views to the Forefront

The U.S. Federal Trade Commission said it would follow the letter of the law when it announced its Advance Notice of Proposed Rulemaking concerning commercial surveillance and lax data security in August, which meant a robust stakeholder consultation to come. That process began in earnest with an exchange of perspectives from relevant parties at the FTC’s virtual public forum Sept. 8.

A View From Brussels: The Latest on UK Data Protection Reform and Recommendations on the AI Act

Liz Truss has succeeded Boris Johnson as the U.K.’s prime minister. Truss previously served as Trade Secretary and Foreign Secretary in Johnson’s government. She has appointed Michelle Donelan as Secretary of State for Digital, Culture, Media and Sport. In this role, Donelan and her ministerial team will oversee the U.K. data protection reform initiated by DCMS a year ago. However, its fate is uncertain at the moment, as further discussions have been postponed.

ICO Publishes Guidance on Privacy Enhancing Technologies

The Information Commissioner’s Office (ICO) has published draft guidance on privacy-enhancing technologies (PETs) to help organizations unlock the potential of data by putting a data protection by design approach into practice. PETs are technologies that can help organizations share and use people’s data responsibly, lawfully, and securely, including by minimizing the amount of data used and by encrypting or anonymizing personal information. They are already used by financial organizations when investigating money laundering, for example, and by the healthcare sector to provide better health outcomes and services to the public.

CCPA/CPRA Grace Period for HR and B2B Ends Jan. 1

On Aug. 31, hopes were dashed when the California legislative session ended without enacting Assembly Bill 1102. The bill would have extended the grace periods for certain business-to-business and human resources personal information under the California Consumer Privacy Act as amended by the California Privacy Rights Act. The CCPA/CPRA will become fully operational on Jan. 1, 2023, for B2B and HR personal information, which will then be subject to the same rigorous California privacy requirements as “consumer” personal information.

Privacy Legislation

Future-proof your privacy program with WireWheel’s Trust Access and Consent Center and WireWheel’s Privacy Operations Manager.

  • Privacy

6 Tips to Draft and Incorporate a Privacy Policy for Your Business

If you collect customers’ personal data, then you need to protect that data. No matter the field you’re in or the size of your business, once customers hand their data over, it’s your responsibility. And how you deal with it affects the trust your customers will have in you.

 

[Image: chart on consumer trust in companies that hold their personal data (source: mckinsey.com)]

 

Whether you’re gathering basic info such as name and address or more sensitive details like banking or credit card information, you need to be able to disclose what you do with it and demonstrate that you have taken steps to protect it.

The first step to this is creating a privacy policy. That can take different forms according to your business type and the information you store, but at its heart it demonstrates to your customers that their data is safe in your hands.

Wondering where to start? Just as you might use a letter of intent template, think of these tips as a way to build a template for your privacy policy.

What is a privacy policy?

As the term suggests, a privacy policy is a written policy that lets your customers know how you collect their data, what that data is, what it is used for, who it will be shared with, what their rights are, and how you will store and protect it. It should always be included on your website and at the point of collection in apps, and it may be labeled as a privacy notice or simply appear in a section called ‘privacy’.

One thing to note is that there are now many regulations and laws that govern how you protect data, including laws such as HIPAA in the US or GDPR in the EU. Make sure you’re aware of any required compliance in any region you operate in, not just where your head office is based.

Do you really need a privacy policy?

[Image: chart showing concern over data privacy in online ad targeting (source: marketingcharts.com)]

 

In nearly every case, yes. If you collect any sort of personal data from your customers, then you need an easy-to-understand privacy policy, clearly displayed so that customers can find it. You also need to ensure that it is updated regularly and rigorously enforced.

The reality is that many people may not read your privacy policy in full. However, knowing you have one, and that there are privacy laws and regulations governing how you handle data, is usually enough to satisfy most people. That doesn’t mean you should ‘skimp’ on details when publishing such a policy; it should be clearly written and in language people can understand. Remember: if you say it in your privacy policy, you must do it in practice.

6 top tips for drafting a privacy policy

While some companies’ privacy policies may differ slightly and contain specialized clauses, there are general commonalities that most companies should work from. You wouldn’t draft a shareholder agreement without using something like a PandaDoc shareholders’ agreement template, so don’t start from scratch here either. Instead, make sure to incorporate the following:

1. Introduction

This is a section every privacy policy should have. You want to inform customers who you are as a company and why you need this privacy policy. It should be a fairly short section but should include the following info:

  • Your company name and contact information
  • Any laws and regulations (and the applicable regions) that your policy complies with
  • Glossary of any main terms used such as ‘personal data’
  • Who the policy applies to, who ‘we’ and ‘you’ refer to, and identification of any third parties included in your policy
  • When the policy was last updated

2. Data information

This is perhaps the most crucial part of your privacy policy. It should include:

  • What data you collect
  • How you collect it
  • How you will use it
  • Whether you share or sell that data
  • How you will store it (for example, do you use data integration software?) and protect it
  • When you may transfer that data to others
  • How to exercise your privacy rights
  • Data retention practices
  • Where applicable, how data is transferred outside of the EEA
  • Where applicable, how children’s data is used
  • How updates to the policy are communicated
  • Where applicable, state-specific language (e.g., CPRA)

Customers will focus on this part as it gives details on how you use their personal data. To avoid confusion, you should split this part into sections that deal with different aspects of how you collect, store, and use data.

Here is a sample privacy policy from Apple:

[Image: excerpt of Apple’s privacy policy]

2a. Data collection

This section lets the customer know what data you collect and where you collect it from; the latter can include web forms or the checkout process. Some examples of the data you might collect include:

  • Name, email address, and physical address
  • Phone number
  • Age and sex or gender
  • Nationality and race
  • Login information
  • Financial info such as credit card details
  • IP address and browser or device type

You also have to consider that some of the data you collect may come from third-party services such as Google Analytics. When that is the case, you should advise customers and direct them to those third parties’ own privacy policies as well.

2b. Data use

Your next section should inform people how you will use their data. This may be a requirement under certain laws, but it is something you should be telling people anyway. Of course, there are many different ways you might use a customer’s info, but some of the most common are:

  • Security and identity verification
  • To target advertising according to tastes and previous behavior
  • Sending relevant and personalized marketing
  • Customer service and/or tech support reasons
  • Delivery of products and/or services

2c. Data sharing

You may be sharing some customer data with third parties or partners. If this is the case, then you need to let the customer know who you might share with, how it will be shared, and why you are sharing it. This may have already been covered under the ‘third party’ section of your introduction. You should also advise here when you have to share info with government departments or similar.

2d. Data sales

The sale of customer data is now heavily restricted in many jurisdictions. Where it remains permitted – as in California, subject to consumer rights – you should advise people of this, tell them who it might be sold to and, most importantly, give them the choice of opting out of their data being sold. If you have no intention of selling their data, make that clear.

3. Data retention and deletion

People also want to know how long you plan on keeping their data in your system. So, you should advise them whether you have set time limits on data retention or whether there are legal limits on how long you can keep it. You should also include what happens at the end of the process: will their info be completely deleted, or will it be anonymized?

4. Children

Parents (rightfully!) worry whether data will be collected on their children. The legal definition of children can vary from area to area, so be guided by the relevant legislation. Most businesses will not collect information about children so you should make that clear and have a disclaimer included in your policy.

5. Personal rights

This is another factor that differs from region to region so be sure you are aware of the different laws and regulations that apply where you operate. Let your customers know what rights they have in the area they live in and how they can apply those rights if they want to see what info you hold on them.

If you want them to give up certain rights, for instance via an NDA, it’s worth checking whether you can find a non-disclosure agreement template available for free. That way, you have somewhere to start from.

6. Changes and complaints

Sometimes, changes are unavoidable. New data privacy regulations may be introduced, or existing ones may be updated. Indeed, you may decide to use customer data in a different way. It’s important to include how you will update people in that scenario. It’s also crucial that you highlight your complaints process and include all relevant contact information.
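
Taken together, the six sections above can double as a checklist. Purely as an illustration (the section names come from this article, not from any statute), a site audit could check a draft policy against the list programmatically:

    # Illustrative only: the privacy policy sections described in this article,
    # kept as data so a draft policy can be checked for missing sections.
    REQUIRED_SECTIONS = [
        "Introduction",
        "Data collection",
        "Data use",
        "Data sharing",
        "Data sales",
        "Data retention and deletion",
        "Children",
        "Personal rights",
        "Changes and complaints",
    ]

    def missing_sections(policy_headings):
        present = {h.strip().lower() for h in policy_headings}
        return [s for s in REQUIRED_SECTIONS if s.lower() not in present]

    # Example: a draft that has not yet covered rights or complaints.
    draft = ["Introduction", "Data collection", "Data use", "Data sharing",
             "Data sales", "Data retention and deletion", "Children"]
    print(missing_sections(draft))  # ['Personal rights', 'Changes and complaints']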

The takeaway


Every business has multiple things to consider, from inventory management development to automation of email marketing. However, no matter what type of business you operate, some form of privacy policy is essential.

The important thing to remember is that you may have to comply with many different laws and regulations according to the territories you operate in. This means that while you may have a general policy on privacy, some parts of that policy have to acknowledge those differences and inform your customers what they are.


Yauhen is the Director of Demand Generation at PandaDoc. He’s been a marketer for 10+ years, and for the last five years he’s been entirely focused on the electronic signature, proposal, and document management markets, with products like the PandaDoc sponsorship proposal template. Yauhen has experience speaking at niche conferences where he enjoys sharing his expertise with other curious marketers. In his spare time, he is an avid fisherman and takes nearly 20 fishing trips every year.

  • Company
  • Privacy Tech

PIAs and Reassessments in WireWheel

What is a Privacy Impact Assessment?

A Privacy Impact Assessment (PIA) is an assessment or questionnaire that collects information on how personally identifiable information (PII) is collected, stored, protected, shared and managed.

A PIA is usually designed in a survey format and, at the very minimum, should answer the following questions:

  • What information is collected, and how?
  • Why is the information collected?
  • What is the intended use of the information?
  • Who will have access to the information?
  • With whom will the information be shared?
  • What safeguards are used to protect the information?
  • For how long will the data be retained/stored?
  • How will the data be decommissioned and disposed of?
  • Have Rules of Behavior for administrators of the data been established?

The PIA should be completed and reviewed, and the records maintained for reference.

Why are Privacy Impact Assessments needed?

Privacy Impact Assessments are required under several privacy laws passed over the last 20+ years, and they have picked up momentum as privacy legislation has gained traction and the requirements have expanded.

  • The E-Government Act of 2002, Section 208, establishes the requirement for agencies to conduct PIAs for electronic information systems and collections.
  • The instrument for a PIA or data protection impact assessment (DPIA) was introduced with the General Data Protection Regulation (Art. 35 of the GDPR).
  • Starting in 2023, some U.S. state privacy laws, including laws in California, Colorado, Virginia, and Connecticut, will require PIAs for vendor assessments and for high-risk data processing activities.

The EU’s GDPR requires that a data protection impact assessment (DPIA) be conducted when processing could result in a high risk to the rights and freedoms of natural persons.

  • A DPIA is a type of risk assessment. It helps you identify and minimize risks relating to personal data processing activities. DPIAs are also sometimes known as PIAs (privacy impact assessments). We have had a few clients who conduct compact PIAs; if a high-risk system is identified, they then trigger a DPIA or high-risk assessment.
  • The EU GDPR (General Data Protection Regulation) and DPA (Data Protection Act) 2018 require you to carry out a DPIA before certain types of processing. This ensures that you can mitigate data protection risks.
  • If processing personal information is likely to result in a high risk to data subjects’ rights and freedoms, you should carry out a DPIA.
  • Examples include scoring/profiling; automated decisions that lead to legal consequences for those impacted; systematic monitoring; processing of special categories of personal data; data processed on a large scale; the merging or combining of data gathered by various processes; data about incapacitated persons or those with limited ability to act; use of newer technologies or biometric procedures; data transfers to countries outside the EU/EEA; and data processing that hinders those involved in exercising their rights. If several of these criteria are met, the risk to the data subjects is expected to be high and a data protection impact assessment is always required.

How to get started with your Privacy Impact Assessment

Many companies start out using spreadsheets as a way to collect the information required for a PIA. However, they find that it can be very difficult to track and manage these assessments without a tool.

Leveraging deep privacy expertise, WireWheel has developed a software solution to help companies manage assessments: the WireWheel Privacy Operations Manager (POM) platform. The tool helps companies easily design and conduct assessments.

Users create a template, which is a list of questions that need to be answered. The template is then used to kick off multiple assessments, as in the sketch below. Templates can be structured to include questions that probe whether the collection and use of personal data comply with data protection regulations. This information can be mapped to asset inventories.
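
As a minimal sketch of that template-then-assessment pattern (the classes and field names below are illustrative, not WireWheel’s actual data model or API):

    # Hypothetical sketch: a template is a reusable list of questions, and each
    # assessment kicked off from it gets its own copy of the questions to answer.
    from dataclasses import dataclass, field
    from typing import Optional

    @dataclass
    class Question:
        text: str
        answer: Optional[str] = None

    @dataclass
    class Assessment:
        system: str
        questions: list = field(default_factory=list)

    @dataclass
    class Template:
        name: str
        question_texts: list

        def start_assessment(self, system):
            return Assessment(system, [Question(t) for t in self.question_texts])

    pia = Template("Baseline PIA", [
        "What personal information is collected, and how?",
        "Why is the information collected, and what is its intended use?",
        "Who will have access, and with whom will it be shared?",
        "For how long will the data be retained, and how is it disposed of?",
    ])

    # One template, many assessments.
    crm_review = pia.start_assessment("CRM")
    hr_review = pia.start_assessment("HR portal")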

WireWheel offers standard templates that cover key regulations and requirements, and also helps clients build custom templates to suit their specific needs.

The WireWheel Privacy Operations platform enables users to trigger assessments manually, or vendors to self-initiate an assessment if required. Once an assessment is triggered, a user can assign the questions to vendors, suppliers, or system owners to answer. The responses are reviewed and approved by the assessment owner, and the platform ensures that the detailed assessment responses are recorded so a company can prove compliance if audited.


What happens after the PIA is complete?

Once the PIA is completed and documented, a company will typically set criteria to trigger another PIA or a reassessment.

Typically this happens when any of the following activities occur (see the sketch after this list):

  • Developing or procuring any new technologies or systems that handle or collect personal information
  • Developing system revisions, or when substantial changes are introduced to an existing data processing system
  • Issuing a new or updated rulemaking that affects personal information
  • When an existing data processing system is involved in a major data breach or recurring security incidents
  • When a predetermined schedule calls for it
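
A hedged sketch of how such trigger criteria might be encoded (the inputs and the three-year default are illustrative, not a compliance rule):

    # Illustrative reassessment trigger check based on the criteria listed above.
    from datetime import date, timedelta

    def needs_reassessment(last_completed, system_changed, new_rulemaking,
                           had_breach, schedule=timedelta(days=3 * 365)):
        overdue = date.today() - last_completed > schedule
        return overdue or system_changed or new_rulemaking or had_breach

    # A system last assessed in early 2020 with no other triggers is still due,
    # purely because the predetermined schedule has elapsed.
    print(needs_reassessment(date(2020, 1, 15), False, False, False))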

Under the EU’s GDPR, the assessment should be kept under review; regulators have suggested repeating it at least every three years.

How does WireWheel help with reassessments?

WireWheel maintains a record of all completed assessments and enables customers to determine the need for reassessment using product features like reporting and tag management, or assessment details like the “Last completed” date. Based on criteria like high-risk scores, data breach alerts, or the time since the last completed assessment, the privacy/legal team can identify the need for a reassessment and initiate it from the previously submitted assessment.

Reassessments can be triggered in the WireWheel platform by any team or individual with the appropriate permissions. Clients start by creating a copy of the completed assessment, so the previously submitted responses are automatically pre-populated. The reassessment uses the latest published version of the template used to create the original assessment and follows the same review workflow.

A completed assessment at WireWheel will include the responses submitted by the assignee, assignee(s) information, completion timestamp, and tags if any.

In WireWheel, a copy of the completed assessment can be created by navigating to “Create a Copy”.

The newly created reassessment provides the ability for the owner to assign all the questions or just the relevant questions to the assignee(s) for updates. The responses previously submitted by the assignee will be pre-populated and available for the assignee to review and edit.

Once the assignee updates and submits the reassessment, the owner reviews and approves the responses. The reassessment is then saved as a new record with the latest responses submitted by the assignee, assignee(s) information, new completion timestamp, and tags if any.

How do you compare one assessment to another?

WireWheel gives users the ability to compare responses across assessments using the built-in reporting feature. Users select the assessments they want included in the report, and reports can also be downloaded to the individual user’s system.

Summary

With more and more regulations requiring Privacy Impact Assessments, leveraging a tool like WireWheel’s Privacy Operations Manager can help companies ensure they are handling personal information properly.

  • Privacy Law Update

Privacy Law Update: September 6, 2022

Stay up to date with this weekly release covering key developments on data privacy laws, technology, and other hot privacy topics!


Newsworthy Updates

FTC Sets Final Agenda For Sept. 8 Rulemaking Forum

The Federal Trade Commission (FTC) released the final agenda for its September 8, 2022 forum seeking public comment on the harms stemming from commercial surveillance and lax data security practices and whether new rules are needed to protect people’s privacy and information. The FTC recently announced an Advance Notice of Proposed Rulemaking (ANPR) seeking public comment as it explores possible new rules cracking down on lax data security and commercial surveillance, which is the business of collecting, analyzing, and profiting from information about people.

Why FTC Rulemaking Pales In Comparison To Proposed American Data Privacy And Protection Act

In a piece for Lawfare, Brookings Institution Tisch Distinguished Visiting Fellow Cameron Kerry discusses how proposed U.S. Federal Trade Commission privacy rulemaking is motivation, not a substitute, for passing the proposed American Data Privacy and Protection Act (ADPPA). Kerry breaks down the “long, tortuous road” that is the FTC rulemaking process while also explaining FTC commissioners “are conscious that the ADPPA addresses many of the issues raised,” with all five commissioners happy to defer to the ADPPA if U.S. Congress can finalize it before rulemaking is complete.

Working Group Takes On NIST Privacy Framework Update

In the modern history of the privacy profession, the one constant has been that the rapid development of new technologies reflexively creates an industry in a perpetual state of innovation. To better anticipate the role privacy will play in commerce in the near-to-long term, the U.S. National Institute of Standards and Technology embarked on developing its “Privacy Framework” document.

Meta’s Facebook Agrees To Settle Cambridge Analytica Data Privacy Suit

Meta’s Facebook settled a long-running lawsuit in a U.S. court seeking damages for letting third parties, including Cambridge Analytica, access the private data of users.  The preliminary settlement was disclosed in a court filing late Friday. The terms were not disclosed.  Lawyers for the plaintiffs and Facebook asked the judge to put the lawsuit on hold for 60 days to allow the parties to “finalize a written settlement agreement” and present it for preliminary approval by the court.

Is Data Localization Coming To Europe?

Two years ago, the Court of Justice of the European Union invalidated Privacy Shield, the legal framework for EU-U.S. data flows. The consequences of that ruling reinforce the EU’s digital sovereignty agenda, which increasingly sees data localization as one of its core elements.  Since the “Schrems II” judgment by the CJEU, the U.S. presidential administration and European Commission have been working on replacing the trans-Atlantic agreement with a new one that could stand judicial review before Europe’s top court. In March 2022, U.S. President Joe Biden and Commission President Ursula von der Leyen announced an agreement in principle.

Privacy Legislation

  • CCPA & CPRA
  • Regulations

The California Attorney General’s First Enforcement of CCPA

On August 24, 2022, California Attorney General Rob Bonta announced his first CCPA enforcement action: a $1.2 million settlement with online retailer Sephora, Inc. Attorney General Bonta is sending a strong message that he intends to aggressively enforce the CCPA and the pending CPRA. If you are a business that shares information with advertising networks, you need to be sure you are fully compliant with California privacy laws.

Sephora allegedly violated the CCPA by failing to meet many of the key requirements including:

  • Informing consumers they sell their personal information
  • Properly honoring Global Privacy Control (GPC) signals for opt-out-of-sale requests
  • Disclosing their sell/share practices in their privacy policy
  • Curing the alleged violations within the 30-day period

This enforcement centered on Sephora’s sharing of personal information with third-party advertising networks. The failure to cure led to a broader investigation of Sephora’s privacy practices.

In addition to a monetary fine, the settlement also imposes injunctive obligations on Sephora including:

  • Implement and maintain, within 180 days of the settlement and for the following two years, a program to assess and monitor the effectiveness of processing opt-outs
  • Conduct annual reviews of its websites and mobile applications, within 180 days of the settlement and for the following two years, to determine which companies it sells or shares data with
  • Share the assessment and reviews in an annual report detailing the:
    • Testing done
    • Errors or technical problems impacting the processing of opt-out requests
    • Companies they sell/share personal information with
    • Whether the companies are considered service providers
    • Purpose for which information is made available
  • Update disclosures and privacy policy to make it clear that it sells data
  • Provide mechanisms for consumer opt-outs including GPC
  • Align service provider agreements to the CCPA’s requirements

Here are a few things to consider to make sure that you are compliant:

  • Sale: Based on this enforcement action, leveraging cookies and tracking technologies to share data with advertisers is likely a sale; therefore:
  • Opt-out: You must allow consumers to opt out of cookies. Make sure your website can accommodate “Do Not Sell My Personal Information” requests and that GPC or similar signals are honored when handling those requests (see the sketch after this list)
  • Manage Consent and Preferences: Properly collect consent and preference signals and ensure they are shared with third parties across your ecosystem
  • Privacy Rights: Provide easy access for consumers to exercise their privacy rights
  • Contract Service Providers: Make sure that you properly identify your service providers and update your contracts so they are compliant with the CCPA and CPRA
  • Right to Cure: Leverage this while it’s available, as the cure period sunsets on 1/1/23. Sephora may have avoided this outcome had it reacted earlier
  • Privacy Notices: Review your privacy policies on your websites, apps, and points of collection. Align them with CCPA and CPRA requirements. Specifically disclose any “sale” of personal information for advertising and analytics purposes, how to opt out, and any activities that may be considered a “financial incentive”
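
To make the opt-out point above concrete, here is a minimal sketch, assuming a Python/Flask web app, of honoring the Global Privacy Control signal (browsers with GPC enabled send the Sec-GPC: 1 request header) as a “Do Not Sell” request; the opt-out helper is a hypothetical stub:

    # Minimal sketch, not a compliance implementation.
    from flask import Flask, g, request

    app = Flask(__name__)

    def record_opt_out_of_sale():
        # Hypothetical stub: real code would suppress third-party ad/analytics
        # tags for this visitor and persist the opt-out of sale/sharing.
        pass

    @app.before_request
    def honor_gpc():
        # The GPC specification uses the "Sec-GPC: 1" request header.
        g.gpc = request.headers.get("Sec-GPC") == "1"
        if g.gpc:
            record_opt_out_of_sale()

    @app.route("/")
    def home():
        return "Opted out of sale" if g.gpc else "No GPC signal received"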

WireWheel’s software solutions help you comply with privacy regulations including managing privacy rights and consents and preferences.

  • Privacy
  • Regulations

Privacy Predictions for 2023

Yogi Berra once warned that “it’s difficult to make predictions, especially about the future.” Proving his point, Cooley partner Travis LeBlanc confesses that at last year’s Spokes Conference he called it right just 33% of the time.

Uncowed, the closing session of the 2022 Summer Spokes Technology Conference (held June 22-23) once again offered some near-term privacy predictions. But more than just crystal-ball gazing, the Privacy Predictions for 2023 roundtable provides deep insights into the challenges (political and technological) of advancing privacy around the world.

The roundtable, hosted by WireWheel founder and CEO Justin Antonipillai, included:

  • Travis LeBlanc, Cooley’s Cyber Data Privacy Practice Co-Leader, who is also a member of the Privacy and Civil Liberties Oversight Board (PCLOB), which oversees the privacy and civil liberties practices of the intelligence community
  • The widely read Gabriela Zanfir-Fortuna, Vice President for Global Privacy at the Future of Privacy Forum (FPF), and
  • Omer Tene, Partner in Goodwin’s Cyber Security Practice who is also a senior fellow at the FPF. Tene founded the Cyber Week held in Tel Aviv.

The next 18 months around the world

“I am absolutely watching India,” says Zanfir-Fortuna. “For example, there is a fair chance that we will finally see the personal data protection bill pass after three-plus years of debate. This would bring more than one billion people within the scope of personal data protection rights.

A very interesting number that I’ve seen recently from a Gartner report was that 75% of the global population in the next three to five years will be covered by some form of privacy rights.

—Gabriela Zanfir-Fortuna, FPF

“Southeast Asia is a region that’s also very active in this space. I would urge folks to also keep an eye on Indonesia. Australia has had a couple of public consultations on the privacy law and also the data security regime, so we might see some action there as well.

“And Canada just last week published a comprehensive bill that covers both federal privacy law and provisions related to AI and data generally, quite similar to the EU Data Act.

“Here in the U.S., I’m hoping to see the successor to the Privacy Shield resulting from the U.S.-EC negotiations.”

The ADPPA and Privacy Shield

Tene, with a bit of tongue in cheek, offers that we will see the EU and U.S. negotiate a successor to the new transatlantic data privacy framework after it too is struck down…by Schrems III.

LeBlanc, however, predicts we will see a Privacy Shield replacement – an adequacy decision from the EC with all relevant approvals – in the next 6 to 12 months.

I’ll also put on the calendar that there will be heavy debate in the U.S. regarding reauthorization of Section 702 of the Foreign Intelligence Surveillance Act at the end of next year, concerning the extent to which it should potentially be expanded, or possibly even ended. Section 702 is at the center of the CJEU discussions concerning cross-border data transfers.


“The American Data Privacy and Protection Act (ADPPA), which is a ‘three corners bill’ with the support of Republicans and Democrats in the House and of Senate Republicans, is waiting for the ‘fourth corner’ – Senate Democrats – to rally behind it,” notes Tene.

The bill was introduced formally (21 June 2022); it even has a number now: H.R. 8152. You can’t overstate how big a deal this is, opines Tene. It is broad, deep, and includes innovative concepts that we have yet to see anywhere in the world.

—Omer Tene, Goodwin

He further suggests that “CCPA/CPRA would basically be gone, except strangely (and ironically) for the provisions protecting employee data.”

Is Tene predicting it’s going to pass, then? “No, I’m predicting the Phoenix Suns will win the championship next time. I refuse to be bullied into making predictions.”

“I don’t want to be the naysayer here on the ADPPA, but earlier today (3 June 2022) Senator Cantwell made clear that she’s opposed to it, and Senate Leader Schumer has said there is no way that bill is going to be taken up in the Senate this congress,” advises LeBlanc.

At this point, there is a quickly closing window on the opportunity to actually consider any legislation – including the ADPPA – in the Senate because we are in an election year…and a third of the Senate (post August recess) begins to focus their attention on the November elections.

—Travis LeBlanc, Cooley

And “if the Senate does flip from Democrat to Republican, there’s going to be a mad rush to push through several confirmations and key priorities,” he continues. “There is a challenge, practically speaking, for floor time, even if it had the support of the Chair of the Commerce Committee, which it doesn’t.”

The FTC and the States

On the regulatory front, LeBlanc notes that “the Federal Trade Commission now has a fifth Commissioner (Alvaro Bedoya), giving Chair Khan a majority. We expect they will begin a process to promulgate rules around privacy and security, including, perhaps, updating COPPA.

“In addition to the FTC, I expect we’ll see some activity at the SEC, which has advanced two rulemakings related to cybersecurity. The one that’s getting the most attention is around the disclosure and governance controls associated with public companies in the U.S.

I do anticipate a lot of activity at the state level as well. Assuming no preemption, [there are] the CPRA regs recently voted on by the new California privacy agency, covering issues from dark patterns to contractual requirements, which are now in draft form and expected to be finalized later this year.

If you do business in California, I strongly encourage that you take a look at those and begin a process soon of coming into compliance with them.

—Travis LeBlanc, Cooley

Colorado Attorney General Phil Weiser (who spoke with Justin) is also looking at regulations concerning issues like dark patterns as well. So, I predict we will see something from Colorado in the next 18 months, says LeBlanc.

On the tech front, Antonipillai predicts significant investment in Web 3.0. He also predicts less investment in cryptocurrencies, but more in blockchain.

He further predicts that in the next 18 months we will see advances in how sensitive data, such as medical information, can be shared and controlled, with investments in this area driving critical innovation.

The really hot topic, however, is artificial intelligence (AI).

I predict there will be at least one major step forward in AI in the next 18 months that causes all of us to feel like some version of it is almost sentient. We’re going to see technology that’s driven by AI almost mimicking the level of human thought and making it harder to even think about it from a regulation perspective.

A lot of regulation is trying to address transparency and understanding the way neural networks work. I predict we’re going to have steps forward in AI that make it very hard to think about how you apply a law to it.

—Justin Antonipillai, WireWheel

“Legislators are trying to regulate the conduct of those that are building these systems,” says Zanfir-Fortuna. “For example, the EU tries to put in place some rules for providers of AI systems, but how much those rules will help, we don’t know. Perhaps the prediction for the next 12 to 18 months is that we all become a bit more literate in understanding the different shades of AI and machine learning.”

“There’s a lot of policy activity around AI in the U.S.,” says Tene. “State laws have provisions concerning automated decision making, and the ADPPA also has a very interesting regime around AI, including requirements for businesses to do ‘algorithm impact assessments.’”


Interestingly, notes Zanfir-Fortuna, Brazil could be the first jurisdiction to adopt a comprehensive framework around AI: “There’s a proposal going through the congress on AI law right now.” She also notes that Singapore, in a different approach, is looking to take advantage of existing regulations.

The challenges posed by Blockchain

“There is a fundamental tension, in my view, between blockchain and some fundamental rights in Europe, such as the right to be forgotten or the right not to have your data transferred,” avers Antonipillai.

I think there’s an even more fundamental tension, which is GDPR relies upon the assumption that a natural or legal person (a data subject) can enforce their rights which then relies on the assumption that they’re established in the EU. When you’re dealing with a distributed ledger or blockchain technologies you may not know.

—Travis LeBlanc, Cooley

Perhaps this is an example of how technologies like AI and blockchain are outpacing the regulatory systems that we set up, offers LeBlanc.

“It’s no surprise that regulations which were not adept at dealing with the Internet struggle with even newer technologies like the blockchain,” says Tene. “And the tension isn’t just with privacy law; it’s with other laws as well, such as copyright, or horrible material like child pornography, which, once on the ledger, can’t be deleted.

“There are some technological fixes to it, but I do agree with the premise that it’s difficult to stay on pace with technological development.”

But “the Groundhog Day for privacy professionals – the primary issue we deal with – is adtech digital marketing which is obviously under intense regulatory pressure all over the world. If the ADPPA passes, it has very strict limits on advertising technologies. And, of course, there is CPRA and Colorado.”

“If people can easily opt out in one place, others will be expected to do it, and that will significantly change the dynamics of the market.”

  • Marketing
  • Privacy Tech

How To Think About Buying Privacy Technology

Determining the best privacy technology for your organization can be overwhelming. As Alyce Director of Data Security and Governance Jeff Mayse notes, “it touches every system, and crosses all borders.” In other words, choosing, implementing, and maximizing the value of privacy technology for both external and internal stakeholders requires careful consideration, planning, thoughtful execution, and management.

Joining Mayse to examine How to Think About Buying Privacy Technology were his colleague Andy Dale, General Counsel and CPO of Alyce, and WireWheel’s Director of Product Marketing, Emily Schwartz. Held at the 2022 Summer Spokes Technology Conference (June 22-23), this wide-ranging discussion was moderated by Kevin Donahue, Sr. Director, Privacy Engineering at Ethos Privacy.

Identifying the privacy tech value for external stakeholders

Often, clients will say, ‘we have to do data subject rights (DSAR) so we’re going to look at some solutions.’ But there are a whole lot of elements to doing data subject rights, from ingestion, validating, and finding their data, to tracking things.

It’s not just one big black box; it is an entire workflow needed to satisfy their users.

—Kevin Donahue, Ethos Privacy
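
Donahue’s point is that DSAR handling is a pipeline, not a single feature. A generic sketch of those stages (the names are illustrative, not any product’s data model):

    # Illustrative DSAR workflow: ingestion, identity validation, locating the
    # requester's data, and fulfillment, with tracking at each transition.
    from enum import Enum, auto

    class DsarStage(Enum):
        RECEIVED = auto()      # request ingested from a web form or email
        VERIFIED = auto()      # requester's identity validated
        DATA_LOCATED = auto()  # personal data found across systems
        FULFILLED = auto()     # access/deletion completed and response sent

    def advance(stage):
        order = list(DsarStage)
        i = order.index(stage)
        if i + 1 >= len(order):
            raise ValueError("Request already fulfilled")
        # A real system would record a timestamp here to prove timeliness.
        print(f"{stage.name} -> {order[i + 1].name}")
        return order[i + 1]

    stage = DsarStage.RECEIVED
    while stage is not DsarStage.FULFILLED:
        stage = advance(stage)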

It’s important, suggests Dale, “to understand what it is we’re trying to do – not from a technical perspective and not from a GDPR or CCPA perspective – but the business problem or solution that we’re trying to get to. That helps all your privacy champions around the business get context and understanding.

“Law and regulation don’t really move the needle as much as folks like us might think they would. Risk surface area doesn’t do that either. What does move the needle is customer sentiment. I don’t think everybody starts there. There is a tendency to dive right into spreadsheets and data mapping.”

Identifying the privacy tech value for internal stakeholders

“It’s really hard,” he continues, “to work towards a successful outcome if everyone’s not on the same page from that 30,000-foot view. What are you trying to do and what are the impacts? This includes the internal stakeholders as well.”

“If you neglect any aspect here – external or internal– you’re at risk of pain. This is one of the most complicated systems that you will implement at a company,” warns Mayse.

It touches every system. It crosses all borders. It has no domains. And it can quickly spiral out of control.

You need to consider your internal users at least as carefully as your external users. And when we talk about implementation, it is easy for this to become a problem.

—Jeff Mayse, Alyce

Mayse insists that when you understand what that process is like for both internal and external stakeholders, and what it would look like with privacy technology versus without, it’s easier to sell internally.

“As time wears on, I think it’s going to become increasingly clear that there are internal stakeholder benefits, [and this] needs to be part of the equation,” adds Schwartz.

Buyer be aware

You can sell into legal. You can sell into a separate privacy function. You can sell into privacy engineering, into InfoSec, into the CIO, and you can sell into the IT department. There are too many vectors, too many personas to think through.

It’s a good idea to encourage the buyer, once they’ve engaged in that cycle of learning about the software, to bring a lot of relevant stakeholders to the table.

—Andy Dale, Alyce

That includes stakeholders like business intelligence and analysts who “need to pull data out of the product and manipulate it somewhere: it is truly everyone,” notes Schwartz.

“When you start talking to every team in the company, the magnitude of the problem can become overwhelming,” cautions Mayse. “So, it’s not only important to bring in internal stakeholders…and raise issues early, but also to constrain the problem. You have to eat the elephant one bite at a time.”

It is critical for buyers to have this awareness.

Accepting that sales call without knowing what it’s going to take to implement a solution and how to perform the initial step of the development lifecycle when you’re evaluating build v. buy – if you don’t have somebody functioning in a product management role it can be very difficult to get it accomplished.

—Jeff Mayse, Alyce

Interestingly, both Dale and Schwartz have found that when it comes to awareness, the learning curve is much steeper in larger organizations than it is in small and midsized businesses (SMBs).

At Alyce, a privacy technology consumer (and WireWheel customer), they think about what they need to do first, relates Dale. “What engineering work needs to be done here to make sure that we’re even in a position to buy privacy tech?

“The issue is always, what do we have to solve first? What is our workload to implement a solution? There’s this myth about technology in general: buy it, turn it on, and magically it just happens.”

Privacy tech success metrics

How do you define success?

Privacy people think of success as ‘I’ve deployed a tool to do something, because I want to do that thing.’ And it’s oftentimes somewhat divorced from other stakeholders, whether it’s engineering, marketing, infrastructure teams, or even the users themselves.

—Kevin Donahue, Ethos Privacy

“It comes down to why you bought it in the first place,” says Mayse. “What you wanted to achieve. You went in with a problem – maybe an intangible – but you have a problem that you’ve defined.”

“How do you track that problem now? Do you have a process for DSAR ticketing details and following through on the fulfillment through all systems and tracking time? Simply deploying a solution isn’t ‘the solution.’”

Pointedly, Mayse asks, “How did you measure the problem? Presumably you had to get dollars for this and justify the spend. How did you sell it internally (this is an $X/year problem, or a $Y/year risk surface). You can track against that.”

“One of the big ones for Alyce has been ‘time of effort.’ If it took you so many days/hours/minutes to fulfill [DSAR] requests, what does that look like after you buy the service? How many are you able to accomplish with how many people working on them?”

Sometimes it is a little bit of trial and error.

With experience you start to realize what is truly important. It may also be things like how the privacy technology integrates into your stakeholders’ workflows and processes. Is it efficient for your team? At the end of the day, a lot of this is about managing risk.

—Emily Schwartz, WireWheel

“Look at this through a risk lens,” offers Schwartz. “What are the building blocks that contribute to an acceptable risk profile for your organization and measure those.”

Project managing privacy tech: it’s all about communication

“There’s a big language challenge between legal and engineering,” opines Dale, “particularly in the privacy world, where we live in the grey. Engineers and product teams want the requirements. What are the milestones? What do we need to deliver, and the metrics behind it?”

“It’s going to be a little bit of iteration, trial, test, and learn as we go. But clearly discussing how this is a grey area can increase velocity in these projects.”

“You need somebody who can help translate the legal principles,” says Mayse, someone to draw lines in the grey legal sand and say ‘on this side of the line we think this is a good-faith effort that’s compliant with the law,’ while knowing that legal decision may indeed change. “This must be translated into something that an engineer can build, or a product manager can run.”

Having embedded privacy champions in a development team really helps. It becomes a bi-directional communication between privacy (or compliance or legal), and the engineering team. In this way, engineering can communicate what they need from privacy to work effectively.

—Emily Schwartz, WireWheel

Evaluating privacy technology

“One of the most beneficial things you can do is ask your peers what they’re using, what they’re doing, what’s working for them and what’s not,” proffers Schwartz. “If you’re working with an agency or a consulting group, ask what they’re implementing for their clients. It shows what’s working and what’s not.

“You can do all the research in the world and see all the demos in the world, but just word of mouth and referrals are very powerful.”

My superpower recommendation is – while maybe not in the first call – when it comes to the demo, and especially to the technical discussion, bring an engineer who will ask questions you didn’t think of asking.

Those questions will be necessary, because an engineer is going to immediately start thinking about how they would actually do this…You can have an engineer say, ‘we won’t build this, it’s not going to happen,’ and you want to know that.

—Jeff Mayse, Alyce

“There is a lot of tech out there, and a lot of it does the same stuff, so I would have a list of your top needs and priorities,” avers Dale. “But if you’re not attentive to it, and if you don’t have a person that is going to learn and administer it,” he cautions, “you’re going to get nothing out of it.

“It can be the best software in the world, but any system requires a lot of TLC to get the most out of it. So go in knowing that it’s going to be a system you need to spend time with. Otherwise, you just won’t adopt it and you won’t get value.

“You’ll turn off that software and you’ll try another one. And if you don’t change something, the same cycle will happen again.”

  • Privacy Law Update
  • Regulations

FTC Issues Rulemaking to Protect Consumers

On August 11, 2022, the Federal Trade Commission (FTC) voted 3-2 to issue an Advance Notice of Proposed Rulemaking (ANPR) on consumer privacy and data security. The rulemaking is titled “Trade Regulation Rule on Commercial Surveillance and Data Security”.

From day one of her appointment, Chairwoman Lina Khan has been direct about having a heightened focus on outlawing unfair and deceptive business practices. The ANPR is consistent with this anticipated posture. The FTC has made it clear that privacy and data security issues, particularly around children, AI, commercial surveillance, dark pattern practices, and lax data security practices, are central to the ANPR. The ANPR also contemplates civil penalties, signaling the FTC’s attempt to regain its enforcement power.

The ANPR covers data collected directly from consumers and data automatically collected by companies when consumers are on their websites and apps. The FTC is considering whether new rules should be directed at the collection and use of sensitive/protected categories or applied more broadly to all categories of personal information. Of particular interest is the ANPR’s definition of a consumer: it includes “businesses and workers, not just individuals who buy or exchange data for retail goods and services.” This has already raised concern among many in the industry and is sure to evoke many comments.

This move is seen as the FTC’s attempt to formalize a national privacy regulation, though it does not appear to be as comprehensive as proposed federal legislation. Further, it does not preempt the current state privacy laws.  To that point, all five FTC commissioners have opined that they would prefer Congress pass a federal privacy law rather than the agency draft rules.

The FTC is asking for comments on 95 questions that address several key issues:

  • Automated Decision Making:  Measuring and identifying algorithmic errors
  • Balance:  Balancing the costs and benefits of privacy and security regulations with their impact on impeding or enhancing competition
  • Consent:  Understanding the effectiveness of consumer consent and contemplating different choices and standards for different types of consumers
  • Discrimination:  Regulating algorithms to prevent discrimination
  • Enforcement:  Empowering the FTC with more enforcement power
  • Governance:  Establishing rules for the collection and use of biometric information, facial recognition, fingerprinting, targeted advertising, and data retention
  • Harm:  Understanding how “commercial surveillance practices or lax security measures” harm consumers, specifically children
  • Notice:  Establishing rules for transparency, consumer comprehension, and privacy impact assessments
  • Security: Requiring technical and physical data security measures and the certification of those practices

The rulemaking process the FTC will follow is guided by the 1975 Magnuson-Moss Warranty Act. The next steps are to:

  • Allow for public comment on the ANPR (comments may be submitted until October 10th)
  • Issue a Notice of Proposed Rulemaking
  • Host a public forum
  • Issue a final rule
  • Undergo judicial review

The FTC is clearly establishing its privacy beachhead. Will the ANPR move forward and become the national privacy framework, or will it act as the catalyst for Congress to pass a comprehensive federal privacy bill? We will continue to monitor this subject as it progresses and provide additional updates.

  • Privacy
  • Privacy Tech

Differential Privacy: Lessons for Enterprises from U.S. Government Agencies

Differential privacy is considered perhaps the only viable and formal way of publicly releasing information or allowing queries or analysis on data while protecting the privacy of the individuals represented in the data.

—Amol Deshpande, WireWheel

Enterprises trying to keep up with a rapidly changing regulatory landscape can benefit from looking at how their public-sector counterparts have leveraged technological advances to publish data while respecting the privacy of individuals. While most organizations are not looking to publish data publicly, differential privacy can also enable internal data sharing between departments that is otherwise restricted by regulation, where accuracy or fitness for use of personal data is just as important. It is eminently translatable into commercial settings.

To discuss how differential privacy is being used, a group of experts met at the 2022 Summer Spokes Technology Conference (held June 22-23) to present their experiences in both academic and commercial settings.

The panel Differential Privacy: Lessons for Enterprises from U.S. Government Agencies was hosted by WireWheel Chief Scientist and Co-Founder Amol Deshpande. He was joined by the Co-Founders of Tumult Labs, CEO Gerome Miklau and Chief Scientist Ashwin Machanavajjhala, and Urban Institute Principal Research Associate and Statistical Methods Group Lead, Claire McKay Bowen.

Balancing insights with protection

Differential privacy technology seeks to solve the problem of balancing insight into groups of people against protecting facts about the individuals. For example, providing insight into a city’s taxi drop-off patterns while protecting sensitive information such as an individual’s location data.

Shared group insight: 59th Street median rush-hour weekday drop-off frequency: 145.

Sensitive individual record: Customer (x456) traveled from LGA to 59th Street, arriving June 1 at 8:30am.

The ability to share and reuse insights derived from data, while at the same time protecting individuals, is notoriously hard.

And many of the methods that have been used in the past to share data while protecting individuals’ privacy have fallen prey to a range of increasingly sophisticated attacks.

—Gerome Miklau, Tumult Labs

“It’s well understood that re-identification attacks can be very successful and lead to privacy vulnerabilities,” notes Miklau. “Aggregate statistics…are subject to reconstruction attack, even data used to train machine learning models are subject to membership inference attack. Differential privacy is a response to all of these attacks and provides a framework for reliable data sharing.”

How does differential privacy technology work?

Differential Privacy
A standard for computations on data that limits the personal information that could be revealed by the output of the computation

Miklau explains the application of differential privacy (DP) means that if one seeks to perform computations on sensitive data, the computation must be rewritten to satisfy the DP standard.

Once we do the computation in this standard-satisfying way, we get output that has a very formal and rigorous guarantee that [this output can’t] be reverse engineered back to individual records or facts, so we can reuse and share that output.

The privacy guarantee stands up in the face of powerful adversaries.

—Gerome Miklau, Tumult Labs

Importantly, it means that every attribute of the individual gets equal protection, so choices don’t have to be made between sensitive and non-sensitive data, notes Miklau. “Differential privacy as a technology for protecting privacy resists both current attacks that we know about and future attacks. It’s also ahead of regulation.”

When running a DP computation, there’s a [tunable] “privacy loss parameter” [the epsilon (ϵ)] used to set policy about how much information goes out when we publish, explains Miklau, “allowing enterprises to do what we call privacy accounting”: accounting for the tradeoff between data privacy and data accuracy.
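
To make the tunable epsilon concrete, here is a minimal sketch of a differentially private count using the Laplace mechanism (the data and epsilon values are made up; production systems like those discussed here involve far more machinery):

    # A count query has sensitivity 1: adding or removing one person changes
    # the true answer by at most 1, so Laplace noise with scale 1/epsilon
    # satisfies epsilon-differential privacy for this release.
    import numpy as np

    rng = np.random.default_rng()

    def dp_count(records, epsilon):
        sensitivity = 1.0
        return len(records) + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

    dropoffs = ["trip"] * 145  # e.g., rush-hour drop-offs near 59th Street
    print(dp_count(dropoffs, epsilon=0.5))   # smaller epsilon: more noise, more privacy
    print(dp_count(dropoffs, epsilon=10.0))  # larger epsilon: closer to the true 145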

[Slide: Differential Privacy: Managing Cumulative Privacy Loss]

“Major U.S. government agencies are excited about differential privacy for compliant data sharing,” says Miklau. The value drivers include:

  • Faster, safer movement of data, enabling faster decision making
  • The ability to perform institution-wide privacy accounting, and
  • A principled negotiation of privacy. There’s always a tradeoff between the strength of the privacy guarantee and the accuracy of outputs.

Importantly, as Miklau points out, the central use case is the reuse and repurposing of data. This is valuable internally within a business enterprise as well: “When you have sensitive data…by computing a differentially private summary of it, you can then move that around the enterprise much more easily and enable the use of data” for new insight and purpose.
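
A minimal sketch of that privacy accounting idea under basic sequential composition, where each differentially private release spends part of a fixed epsilon budget (a simplification; real accountants use tighter composition results):

    # Each DP release consumes some epsilon; basic sequential composition says
    # the total privacy loss is at most the sum of the individual epsilons.
    class PrivacyAccountant:
        def __init__(self, total_epsilon):
            self.total_epsilon = total_epsilon
            self.spent = 0.0

        def spend(self, epsilon):
            if self.spent + epsilon > self.total_epsilon:
                raise RuntimeError("Privacy budget exhausted; refuse the release")
            self.spent += epsilon

    accountant = PrivacyAccountant(total_epsilon=1.0)
    accountant.spend(0.25)  # e.g., publish one set of noisy counts
    accountant.spend(0.25)  # a second release; half the budget remains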

IRS Case Study

The motivation for the collaboration between Urban Institute and the IRS is to advance evidence-based policymaking.

However, full access to these data are only available to select government agencies and the very limited number of researchers working in collaboration with analysts in those agencies.

— Claire McKay Bowen, Urban Institute

To expand access, the Urban Institute entered into a collaboration with the IRS Statistics of Income Division, where it is “developing synthetic data and differential privacy tools to allow researchers access to sensitive data, while providing robust privacy protections,” says Bowen.

There are two ways in which researchers have accessed data, explains Bowen: 1) from public statistics and data, and 2) the confidential data directly.

[Diagram: Data Flow with Statistical Disclosure Control]

The above diagram describes the flow between the data user, a privacy expert (such as Bowen), and the curator (in this case also the data steward), the IRS. This scenario is not ideal. As Bowen explains, direct access is very difficult to get, as it is constrained by clearance requirements.

Synthetic Data Technique for Data Analysis

Utilizing public statistics and data “isn’t a great option either. Given the growing threats to public-use microdata (i.e., individual record data) and concern for protecting privacy, government agencies have progressively restricted and distorted the public statistics and data being released,” making them much less useful.

To solve this challenge, Urban Institute has devised two approaches.

  1. Improving the quality of the public data that is released, using synthetic data generation techniques, and
  2. Creating a new tier of access between the public data set and direct access to the data.

The synthetic data generation technique, says Bowen, has allowed organizations like the American Enterprise Institute and the Urban-Brookings Tax Policy Center to conduct valuable microsimulation modeling to assess, for example, how much Medicare for All would cost the average taxpayer.

As written about here, synthetic data is not actual data taken from real-world events or individuals’ attributes; rather, it is data that has been generated by a computer to match the key statistical properties of the real sample data. Importantly, it is not actual data that has been pseudonymized or anonymized. It is artificial data that does not map back to any actual person.
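As a rough illustration of that idea, the sketch below fits a simple parametric model (a multivariate normal over two hypothetical columns, income and age) to a sample and draws artificial records matching its key statistics. Real synthetic-data programs for tax records use far richer models; this only shows the principle:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in "real" data: hypothetical income and age columns.
real = rng.normal(loc=[50_000, 45], scale=[15_000, 12], size=(1_000, 2))

# Fit the model: capture the sample's means and covariance.
mean = real.mean(axis=0)
cov = np.cov(real, rowvar=False)

# Sample synthetic records: same statistical shape, no real person behind any row.
synthetic = rng.multivariate_normal(mean, cov, size=1_000)

print("real means:     ", real.mean(axis=0).round(1))
print("synthetic means:", synthetic.mean(axis=0).round(1))
```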

While synthetic data is a great solution for this application, “it’s not perfect, because you don’t know if there’s some other valuable analyses you didn’t quite account for when you built the model,” explains Bowen, and this is where the “new tier of access that we’re developing called a validation server” comes in.

In this approach, i) the researcher submits their analyses, ii) the analyses run against the actual data, iii) the generated output passes through the differential privacy mechanism (the “noise”), and iv) the privacy-protected results are returned to the researcher.

“The privacy-utility tradeoff here is that you’re not looking at the original data but you’re still getting a result that (hopefully) is very similar to the original,” opines Bowen.
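A minimal sketch of that four-step flow, with a stand-in dataset and hypothetical names (not the Urban Institute’s actual validation server), might look like this:

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in confidential microdata: 10,000 hypothetical incomes.
CONFIDENTIAL = rng.normal(55_000, 20_000, size=10_000)

def validation_server(query, epsilon, sensitivity):
    """Run the researcher's query server-side, then add Laplace noise
    calibrated to the query's sensitivity before releasing the result."""
    true_answer = query(CONFIDENTIAL)       # query touches real data only here
    return true_answer + rng.laplace(0.0, sensitivity / epsilon)

# i) researcher submits a query; ii) it runs on the real data;
# iii) noise is added; iv) only the noised answer is released.
clipped_mean = lambda d: np.clip(d, 0, 200_000).mean()
# One record can move a [0, 200000]-clipped mean by at most this much:
sensitivity = 200_000 / len(CONFIDENTIAL)
print(round(validation_server(clipped_mean, epsilon=1.0, sensitivity=sensitivity), 2))
```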

Bowen points to commercial use cases: for example, LinkedIn looking for ways to communicate its market data to departments that didn’t have access to the data, or Uber trying to share its ride-share data internally using aggregation techniques to preserve privacy.

U.S. Department of Education case study

“The College Scorecard website is an initiative led by the U.S. Department of Education (DoE) to empower the public to evaluate post-secondary education options more empirically,” explains Machanavajjhala.

The Department of Education has access to data about college costs, records describing students, and degrees attained, “but they don’t have the most valuable and interesting aspects of the website: the student economic outcomes metrics,” details Machanavajjhala. That information is with the IRS, and consequently, “the DoE must go to the IRS and ask for this information every year.”

The IRS cannot simply hand over the data to other agencies as it is bound by law to protect all information provided on tax returns – even the fact that somebody filed a return.

So, every year, the IRS has to deal with this problem of what should be released to the DoE, and how to transform the data to protect privacy.

—Ashwin Machanavajjhala, Tumult Labs

This challenge, notes Machanavajjhala, is becoming increasingly prevalent since the passage of the Evidence-Based Policy Making Act in 2018 and the consequent intensified requests for data sharing.

“The data custodian [in this case the IRS] is faced with a number of challenges,” says Machanavajjhala. “In past years, data protection was based on simple ad hoc distortion of the statistics and suppression. But the rules used were rendering most of the data useless (with upwards of 70% to 90% of the data redacted).

“Furthermore, the disclosure process was manual, so it was extremely onerous and becoming harder every year as analysts requested ever more data and detailed statistics.”

Solving the data sharing problem

To solve this challenge, Tumult Labs designed a three-step approach:

“Differential privacy is not a single algorithm,” cautions Machanavajjhala. “It’s an entire methodology. So, the right algorithm for your problem may require a significant amount of tuning to ensure the data that is output actually meets the needs of your application.”

Putting this in the context of the IRS-DoE data sharing challenge, Machanavajjhala details that the DoE was requesting 10 million statistics involving 5 million students. Each of these statistics was a count, a median, or a quantile over different overlapping groups of students: a complex and very fine-grained request.

Data Sharing Problem Solution

As “any privacy method is going to distort the data in some way…to ensure privacy of individuals but preserve statistics, it is important to know what characteristics analysts care about.” This requires establishing “fitness for use requirements,” which, in addition to anything specifically requested, will likely include:

  • A prioritized list of output statistics
  • The least privacy loss with as much useful data as possible (the privacy ϵ “negotiation”)
  • Error measures (“privacy accounting”) such as relative error and the number of records suppressed
  • Inconsistency between statistics, and
  • Ranking inversions

Tumult Labs developed software tools to enable this “negotiation” and enumerate the tradeoffs between privacy and accuracy when tuning the differential privacy loss parameter (ϵ), as sketched below.
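The toy example below illustrates the shape of that negotiation under assumed parameters: sweep ϵ and measure the expected relative error of a sensitivity-1 count query. It is a sketch of the tradeoff being enumerated, not Tumult Labs’ tooling:

```python
import numpy as np

rng = np.random.default_rng(2)
TRUE_COUNT = 12_345  # stand-in statistic, e.g., students in one group

def mean_relative_error(epsilon, trials=10_000):
    # Laplace noise with scale 1/epsilon protects a sensitivity-1 count.
    noise = rng.laplace(0.0, 1.0 / epsilon, size=trials)
    return np.mean(np.abs(noise)) / TRUE_COUNT

# Tighter privacy (smaller epsilon) buys strictly worse accuracy.
for eps in (0.01, 0.1, 1.0):
    print(f"epsilon={eps:<5} mean relative error={mean_relative_error(eps):.2e}")
```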

Importantly, notes Machanavajjhala, deployment should be community vetted, support peer review, enable safe privacy operations, and ensure there is an approval process to catalog and manage data releases, and to audit and monitor them.

Ultimately, the Tumult differential privacy platform provided the IRS with a rigorous, automated, and quantifiable differential privacy guarantee and simplified decision making, all while enabling the release of more student income statistics than ever before, with comparable accuracy, to power the College Scorecard website.

Automated, quantifiable privacy guarantees with simplified decision making are important attributes for the privacy office and the business in commercial settings. But, as Bowen notes, “most differential privacy technology is still found in highly technical papers, so getting an expert on your team to filter through and figure out what is applicable and what’s not” is essential.

And as it is new technology, she advises “thinking about what training, education materials, and other support you need to get people up to speed.”

 

  • Marketing
  • Privacy

Implementing Consent Management for Agencies

Digital agencies often serve as the connective tissue between organizations and technology. And as data privacy regulations continue to expand and evolve, agencies will need to stay on top of these regulations to continue to responsibly, reliably, and effectively serve their clients.

Adweek reporter Trishla Ostwal’s sources tell her that “agencies are commonly being asked to address privacy in RFPs and clients are conducting data audits on potential agency partners. Clearly suggesting that privacy has become a deciding factor.” Her recent Adweek article notes that “media agencies are expanding their competencies and shifting away from leaning solely on legal teams” (Ostwal and Morley, 2022). [1]

To explore this transformation and its impact, Ostwal was joined by John Doyle, President and Founder of website solutions company Digital Polygon, and Arielle Garcia, CPO of media agency UM, at the 2022 Summer Spokes Technology Conference (held June 22-23). The discussion, Implementing Consent: A Plan for Agencies, was moderated by WireWheel VP of Marketing, Judy Gordon.

The industry perspective

Agencies are expanding their competencies and shifting away from leaning solely on legal teams. This traces back to GDPR, after which agencies invested (requiring investments in the millions) in building privacy departments.

—Trishla Ostwal, Adweek

“Some agencies have moved towards hiring data officers to keep up with all the privacy regulation changes,” notes Ostwal. “They have also gone on to build privacy task forces that span departments. Some are collaborating with organizations like the IAB, and others are taking an in-house approach and acquiring data companies.”

It comes down to client expectations, and “simultaneously, agencies are substituting for cookie deprecation with solutions such as data clean rooms. However, the industry is still in a nascent stage,” as it looks to develop that ‘one solution fits all’ approach.

The media company perspective

“That’s very much aligned to what we’re seeing on our side,” concurs Garcia.

“We have a privacy task force that spans all of our specialty units, and we collaborate with industry groups. The reason that we did all this (several years ago now) is because we knew that we needed to be there to support our clients.”

The reality of where agencies sit – because there’s so much ambiguity in the law and in the different positions that everyone can take on the client side, the publisher side, the ad tech side, et al. – is this unique position of needing to play matchmaker: understanding the requirements, risk appetite, or interpretation that a brand is taking.

—Arielle Garcia, UM

“This has required us to really get into the weeds of understanding partner practices, and it is a core reason we’ve started proactively having client privacy discussions,” she says.

“We created something called a client privacy primer dialogue that is now evolving into regular check-ins and being incorporated into quarterly business reviews because there’s so much change in this space.”

The development agency perspective

“There are three key buckets that we’re seeing,” says Doyle:

  1. Understanding where your data lives and how it’s used by different groups within your organization. It’s not just cookies. It’s not just web. It’s not just consent…it’s also how you use that data, how you pass it to subprocesses, and where it’s stored.
  2. The facilitation of rights requests (such as DSARs) must comply with all the new laws, and your website and applications must facilitate implementation of these.
  3. Managing, tracking, and integrating consent, and ensuring consent is distributed to downstream systems. It is critical to have a central source for that consent as a lot of enterprise ecosystems are distributed and there can be a lot of duplicate data.

Additionally, different departments (say, sales and marketing) might not talk to each other about how they’re using information differently. This requires deeper data mapping and understanding of your ecosystem so that you can map out how consent is collected, as sketched below.
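As a rough sketch of the “central source for consent” idea Doyle describes, the toy example below keeps one authoritative record per user and purpose and fans each decision out to subscribed downstream systems. All names and fields are hypothetical, not any vendor’s schema:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Callable

@dataclass
class ConsentRecord:
    user_id: str
    purpose: str   # e.g., "email_marketing", "analytics"
    granted: bool
    channel: str   # e.g., "web", "mobile_app"
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

class ConsentStore:
    """Single source of truth; downstream systems subscribe to changes."""
    def __init__(self):
        self._records: dict[tuple[str, str], ConsentRecord] = {}
        self._subscribers: list[Callable[[ConsentRecord], None]] = []

    def subscribe(self, callback):
        self._subscribers.append(callback)

    def record(self, rec: ConsentRecord):
        # Keep the latest decision per (user, purpose), then fan out so
        # duplicate copies across the ecosystem stay in sync.
        self._records[(rec.user_id, rec.purpose)] = rec
        for notify in self._subscribers:
            notify(rec)

store = ConsentStore()
store.subscribe(lambda r: print(f"CRM sync: {r.user_id} {r.purpose}={r.granted}"))
store.record(ConsentRecord("u123", "email_marketing", granted=False, channel="web"))
```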

It goes far beyond just cookie consent and whether or not I’m setting third-party pixels.

—John Doyle, Digital Polygon

“Like everyone else, we’re trying to stay up to date with the law, the client’s interpretation of those laws, and how we translate that into implementation.”

The questions behind the question

“It’s definitely a mixed bag,” opines Ostwal.

“To say that it’s only the demise of third-party cookies that’s driving conversation would be inaccurate.”

It also has a lot to do with the regulatory proceedings, and the agency folks are asking “what are the differences?” While there is a 65% resemblance amongst the state laws, the 35% difference can cost them a lot to comply with in each state. California alone is an investment in the millions….

And if you don’t adhere to all the provisions, you risk a penalty, so it’s a lot of money at the end of the day.

—Trishla Ostwal, Adweek

“I think [everyone] just wants to get things right. But to get things right at such an early point” is a real challenge.

“Privacy has become a catch all to encompass everything,” opines Garcia. “Regulatory developments, third-party cookie deprecation – just ID deprecation – and its limited availability, scalability, and addressability in the future. We are infusing that privacy related thinking in everything we’re doing.”

As we evaluate partners and their identity solutions, we’re asking the question behind the question. Everyone says that their ID’s consented, that they honor rights and preferences.

It means nothing until you understand the details of what that actually looks like. That’s what we and the brands that we serve need in order to make an informed decision on what a durable and responsible solution looks like.

—Arielle Garcia, UM

“It is early to get it right, but you’re going to be closer to right if you’re erring on the side that aligns with regulatory trends (which are just a codification of consumer expectations of choice),” asserts Garcia.

“There are these really interesting operational challenges that we’re navigating,” she says. “And the important part of all of this is that we’re steering towards a place that aligns to consumer expectation. Otherwise, we’ll find ourselves in this whack-a-mole environment until we get there.”

Who’s in on these conversations?

Garcia wants everyone in on the conversation: marketing, technology, media, privacy, and legal.

In my experience everyone’s very focused on their first-party data assets and they’re not necessarily as well versed in what’s happening downstream. This is quite challenging and it’s a big emphasis for regulators.

What we’re really trying to do is break down those silos where they exist. The good news is, I’m increasingly seeing more of that happening organically.

—Arielle Garcia, UM

Doyle sees the legal team reaching out to marketing and asking, “Are you complying with these new regulations? What data do you collect? Which third-party systems do you use? How is that data used?” and making sure there’s an understanding of what’s being done and communicating it to their audiences.

Unsurprisingly, organizations impacted by GDPR brought more people to the table earlier, while those based in the U.S. “are not worrying about it yet (because legal hasn’t yet come and yelled at them). More likely than not it’s going to turn into the same type of firefighting situation that we saw with GDPR,” submits Doyle.

Avoiding whack-a-mole

“The first step is getting a team together that’s going to own privacy at your organization,” avers Doyle. And it should be cross-functional so you can gain a full understanding of the impact.

Organizations also need to understand the cost of implementing piecemeal vs. globally vs. facing fines if they decide not to implement.

Depending on what your organization does and how much revenue you generate from these different means – and across different States – that answer will be different.

—John Doyle, Digital Polygon

“I’m always going to fight for a user-first mindset. Trust is an economy that’s going to continue to grow,” asserts Doyle. “Sometimes that conversation is harder to sell to marketing executives who have to hit numbers, and getting everyone in the same room helps build that consensus.”

Garcia suggests that as marketing works to figure out their first-party data path forward, there is an opportunity to mitigate friction and align workstreams to support tech decisions that can serve both means and ends.

A preference center can enable greater consumer insight and facilitate zero-party data collection: there are some really unique opportunities that start to surface.

When you have all of the right folks at the table and are putting a consumer lens on all of this from a consumer experience and trust perspective…it makes this more of an investment than a compliance cost.

—Arielle Garcia, UM

Honing the value of trust

There is an early group of actors that look to make privacy a differentiator, says Garcia, “like the Apple approach. I think we are increasingly going to see more of this as the realization settles in that everyone wants to optimize their first-party data collection and utilization.”

Ostwal avers that “for agencies, brands, publishers, there’s just one question: How do we hone consumers’ trust?”

They’re also trying out new solutions like a customer data platform or data clean room, or sometimes combining the two as we’ve seen in recent partnerships. But there’s still a lot of questions:

How do we maximize zero-party data? First-party data? And what’s the difference between the two?

—Trishla Ostwal, Adweek

As WireWheel’s Gordon notes, “this may be easier for some brands than others. For example, brands that have a direct relationship with consumers versus those that are B2B.”

The traditional macro-level advice is if you have the opportunity to collect first-party data, get the infrastructure in place to have a single view of the consumer and the ability to respect their preferences and orchestrate that downstream.

If you’re a consumer packaging group, for example, you’ll probably be more reliant on smart partnerships. But there are still ways to enhance the connection with consumers authentically, and data does become a byproduct.

For a lot of brands, investments are focused on making the switch from third-party to first-party data. The key opportunity is to make privacy, consent, and preference the foundation. Then build your first-party data on top of that so you already have the leverage in place to control how data are used with different applications.

—John Doyle, Digital Polygon

“For companies that aren’t using a ton of first-party data, regulations will increasingly impose obligations with respect to rights requests. So, even if you don’t think you’re using first-party data, it’s worth doing that data mapping exercise to understand the processing done on your behalf by your partners,” cautions Garcia.

“If you’re not using a lot of first-party data you’re unlikely to have as robust an infrastructure and so it’s never too early to start thinking about what investments may be needed.”

[1] Privacy Orgs Are Media Agencies’ Next Big Investment (paywall)

  • Privacy Law Update

Privacy Law Update: August 15, 2022

Stay up to date with this weekly release covering key developments on data privacy laws, technology, and other hot privacy topics!


Newsworthy Updates

Federal Data Privacy Legislation: Differences With State Laws Raise Preemption Issues

For over two years now businesses have been dealing with the complexities of compliance with the California Consumer Privacy Act (CCPA); the nation’s first comprehensive consumer privacy law. Compliance became more complex with the enactment of comprehensive consumer privacy laws in Virginia, Colorado, Utah and Connecticut, plus the new California Privacy Rights Act (CPRA), a/k/a CCPA 2.0.  As a result, industry has been screaming for one, consistent federal standard. Congress may finally be answering the call with the introduction of the American Data Privacy Protection Act, H.R. 8152, (ADPPA). The ADPPA in its current form would preempt most, but not all, state privacy and data protection laws.

Republican FTC Commissioner Noah Phillips to Step Down

Noah Phillips, one of two Republican commissioners at the Federal Trade Commission, is set to leave the agency in the fall, he told POLITICO.  Phillips’ departure comes at an extraordinarily high-profile moment at the agency, one marked by a heightened skepticism toward corporate consolidation and tension between the Republicans and Democrats on the commission under Chair Lina Khan, a progressive antitrust hawk who has targeted the tech giants and corporate concentration across the economy.

FTC Deepens ‘Dark Patterns’ Investigation

Business Insider reports the U.S. Federal Trade Commission is moving forward with its investigation into alleged use of “dark patterns” by Amazon in its Prime services promotions. The agency sent subpoena letters to current and former Amazon employees as it seeks details on the potential deceptive and manipulative practices the company used to amass and maintain Prime memberships.

Final Decision on Meta’s EU-US Data Transfers Delayed

Objections to the Irish Data Protection Commission’s order to halt Meta’s EU-U.S. data transfers will delay a final decision, Politico reports. A DPC spokesperson said fellow data protection authorities raised concerns during the mandated four-week consultation under Article 60 of the EU General Data Protection Regulation and it may take months to resolve the discrepancies. If issues go unresolved, the Article 65 dispute resolution mechanism will be triggered.

Privacy Legislation

Federal Privacy

FTC Privacy Rulemaking: On Thursday, the Federal Trade Commission issued an Advance Notice of Proposed Rulemaking (“ANPRM”) on “Commercial Surveillance and Data Security” by a 3-2 vote. The ANPRM triggers a 60-day public comment period that will constitute a public record that the Agency will consider in determining whether to proceed with rulemaking. On September 8th, the Commission will host a virtual public forum on the proposal.

ADPPA: With legislative attention focused on the Inflation Reduction Act and August recess upon us, we have not seen any major public indications of progress on the American Data Privacy and Protection Act (ADPPA) from key policymakers. The most recent update is an August 4th report from Axios in which a spokesperson said that E&C Chair Pallone “is continuing to build broad bipartisan support and incorporate feedback from members, and is committed to seeing comprehensive national privacy protections signed into law.”

State Privacy

Colorado: The Colorado AG’s comment period for Colorado Privacy Act pre-rulemaking considerations closed on August 5. The AG is posting comments that it received here.

Montana: Sen. Kenneth Bogner (R) has submitted a bill draft request (LC0067) to state legislative services with the short title “Generally revise laws related to privacy and facial recognition technology.” The request is for the 2023 legislative session and it is unclear what scope such legislation may take.

New Jersey: S.332, a narrow privacy bill that would require websites to post privacy notices and honor opt-outs of sales (“monetary consideration”), was amended by Senate Majority Leader Ruiz (D) to clarify that the bill does not create a private right of action. The legislation was introduced in January by Senators Singleton (D) and Codey (D) and was reported from committee on a 3-2 vote in June.

Oregon: The Attorney General’s office submitted a draft comprehensive privacy bill (largely informed by the CO and CT privacy laws and a multi-stakeholder workgroup) to state legislative counsel. The AG’s office intends to move the bill in the 2023 session.

Future proof your privacy program with WireWheel’s Trust Access and Consent Center and WireWheel’s Privacy Operations Manager.

Request Demo
  • Marketing
  • Privacy

Customer Loyalty, Privacy, and Data Governance

Customer loyalty programs are the backbone of many companies, but they come with a host of data privacy traps, particularly with regard to the new state regulations which have the collection and use of data to effectuate these programs squarely in their crosshairs.

For a discussion of loyalty program privacy risks and opportunities, Blueprint Data Strategy Director, Mark Milone, moderated a panel at the 2022 Summer Spokes Technology Conference (held June 22-23).

The Customer Loyalty, Privacy, and Data Governance panelists included Dave Navetta, Cooley Partner and Cyber Data Privacy Group Vice Chair; Bob Seiner, founder of KIK Consulting and Educational Services; and Erin Raese, Global SVP of Revenue at loyalty technology firm Annex Cloud.

Customer loyalty is a hot topic

A Forrester stat: 89% of organizations are investing in personalization, and loyalty is just a really great way to collect the data that you need, build that relationship with your customer, and deliver personalization.

—Erin Raese, Annex Cloud

Customer loyalty is a much hotter topic now because organizations are looking to deliver more personalized experiences. Consumers want businesses to know who they are, put their needs first, and make their lives easier.

“I had a conversation with a grocer last week,” relates Raese, “who had several requirements: ‘Can you create a discount at point-of-sale? Can you create a discount by product? Can you do discounts by this…and by that…?’”

“Sure, but why? What about the customer experience? What kind of customer experience do you want to deliver?”

You probably have email addresses which is great – that allows you to give them the discounts they were looking for. But what if you knew that Mary was a vegetarian and a gourmet cook?

What kind of experience could you deliver to Mary?

—Erin Raese, Annex Cloud

“And what if you knew that Mary had a husband, who was on a keto diet and a daughter, who had peanut allergies,” continued Raese. “What kind of experience could you deliver then?”

“If you think about it, the grocer could serve up recipes that fit everybody’s dietary needs. Mary could come to their website. They could curate all the ingredients for all those different recipes, and Mary could go click, click, click…and put them in her shopping cart.”

Bumping up against privacy-related issues

“Here we start to bump into privacy laws which are very much in flux right now,” cautions Navetta.

The fundamental question here is – with the regulations as they are today and cookies starting to become less of a viable means to gather useful information– many marketers are starting to think of loyalty programs as another rich field for collecting data.

But, perhaps also, some of the goals of these programs are not loyalty at all but harvesting a lot of personal information for bigger picture revenue goals.

—Dave Navetta, Cooley

“We have to make sure we’re addressing that and balancing out the requirements around privacy laws.”

“A primary role of a data governance program is to manage that balancing act,” says Seiner. “Personalization is all about that customer data. Important considerations include ‘do customers know what data you’re collecting about them and how you’re using that data?’ Data governance can play a role in all of those things.”

The role of trust

It starts in trust, and then it’s really the company’s obligation to ensure that they respect that trust and are good stewards of the person’s data… It’s everything. Everybody walks into that experience looking for trust.

—Erin Raese, Annex Cloud

“The basis of loyalty tends to be a two-way dialogue. A two-way value exchange,” proffers Raese.

“There are terms or conditions, and they should be laid out so the customer knows – or should know – what they’re getting themselves into when they join. That they are going to give data and in return getting personalized experiences, recognition, or rewards in exchange.”

But “when you start to use data outside the parameters of those expectations: to collect data…to sell to a third-party…that starts to erode trust,” submits Navetta. “For the privacy conscious, understanding how the data is going to be used and who it is going to be transferred to are important. But it is also reflected in what regulators and legislators are ultimately requiring.”

Is it just for loyalty or some other purpose?

You’re not going to be loyal to somebody unless you trust them. This requires that customers have confidence in knowing how you’re going to handle the data. Collecting only that data you’re going to use is good, but oftentimes other data is collected along the way.

—Bob Seiner, KIK Consulting

“You need to be able to explain what you’re going to collect, how you’re going to use it…and you have to have a strategy for communicating that back to the customer,” avers Seiner.

But, as Milone points out, “We don’t know all of the uses of the data generated at the point of collection. It could easily become a new use that wasn’t contemplated when it was collected from the customer.”

Where companies don’t have proper governance, we see that after the program has been running for some time, someone in marketing realizes how much data they have, how rich it is, and conceives of new ways to use the data.

And that’s when you get into legal trouble.

—Dave Navetta, Cooley

Selling, sharing, aggregating, and de-identifying data

“I think we’re going to start seeing companies gathering first-party data and zero-party data and wanting to supplement it with other data,” opines Navetta.

And that constitutes a sale of data under certain laws even though no money has been exchanged. This goes to transfers and under the CCPA, for example, you have to provide an opt-out.

In addition, the laws are starting to require purpose and use limitations as well, which goes to reasonable expectations of the consumer….and this is where transparency comes into play.

—Dave Navetta, Cooley

That said, “one way to get more flexibility is to normalize the data,” asserts Navetta. To aggregate or de-identify it so it’s no longer ‘personally identifiable,’ and consequently, no longer subject to these privacy laws.

“Do you lose some of the value? This is what you’re always struggling with: the richness in the personalization tends to go away once you strip out the identifying elements.”
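As a rough illustration of the normalization Navetta describes, the sketch below strips direct identifiers and releases only group-level aggregates, suppressing groups below a threshold. The threshold and fields are hypothetical, and real de-identification standards are considerably more involved:

```python
from collections import defaultdict

records = [
    {"name": "Mary", "zip": "22201", "segment": "vegetarian", "spend": 212.0},
    {"name": "Anil", "zip": "22201", "segment": "vegetarian", "spend": 180.0},
    {"name": "Joan", "zip": "22207", "segment": "keto",       "spend": 95.0},
]

K_THRESHOLD = 2  # suppress any group smaller than this

groups = defaultdict(list)
for r in records:
    # Drop direct identifiers (name, zip); keep only the grouping attribute.
    groups[r["segment"]].append(r["spend"])

for segment, spends in groups.items():
    if len(spends) < K_THRESHOLD:
        print(f"{segment}: suppressed (group too small)")
    else:
        print(f"{segment}: n={len(spends)}, avg spend={sum(spends)/len(spends):.2f}")
```

The tradeoff Navetta names is visible even here: the aggregate is safe to share, but the per-person richness that powered the personalization is gone.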

Can there be too much transparency?

Hopefully you are using transparency to engender trust, but at what point does transparency become too transparent? You can articulate every conceivable use of the data…and inundate your customer with terms and conditions.

How do you advise a loyalty program to balance transparency with the information the customer actually needs to make a decision about joining a loyalty program?

—Mark Milone, Blueprint

“Loyalty programs, and the use of data around them, are becoming a much bigger issue than they were,” states Navetta, particularly due to cookie deprecation.

“The tendency is to be overly broad because, in the U.S. especially, if you’re notifying customers of data uses, you can use the data as stated without much trouble. Now, this new generation of regulations is starting to put more pressure around use limitations and may require an opt-in if you’re going to use information beyond expected uses.”

I think we’re still going to see broad and maybe overly complex notices to cover all bases, but over time – and as regulators start to clamp down – more precise notices that satisfy legal requirements but also engender trust.

—Dave Navetta, Cooley

What customer data, exactly?

“We look at it in three buckets,” says Raese:

  1. The information that you give when you sign up for a program
  2. Tracking your behavior when you’re making purchases or other types of interactions. A lot of the programs today will incent you to interact with the brand (e.g., hashtags in social media, review writing, and reward redemption), and
  3. Progressive profiling. The attempt to get additional information through, for example, surveys about what customers enjoy.

In this context, it’s not necessarily the sensitivity of the data. It’s the big picture of the data. If you collect a lot of data, you start to learn a lot about people from a privacy perspective and that causes issues.

Regulators and legislators look at the aggregation of [PII], and the inferences and insights that companies can get as a result. There are potential privacy violations that arise in those instances.

—Dave Navetta, Cooley

“What we’re starting to see is that the laws around loyalty programs are making it more difficult for companies to achieve what they want to achieve without having to jump through some compliance hoops.”

Balancing the value and risk of a loyalty program

“If you were going to stand up a cross-functional team to help deliver value from a customer loyalty perspective but mitigate the risks, who do you think should be on that team?” asks Milone.

All the stakeholders in that data.

You don’t want to have a cast of thousands, but you want to make certain…the right people are involved at the right time for the right reason with the right data to make the right decision.

—Bob Seiner, KIK Consulting

Who has authority and accountability? The organization, not single individuals, explains Seiner. “A lot of organizations still use the term data owner because it’s been built into the language, but more organizations are starting to refer to these people as stewards of the data.”

“We’re seeing that for most of the organizations that are being successful with this, it is coming from the top and it’s throughout the organization,” seconds Raese.

Data governance is the execution and enforcement of authority over the management of data and data related assets. But how are you going to get to the point where you’re executing and enforcing authority over the data?

Start to involve stewardship, which is definitionally the formalization of accountability.

— Bob Seiner, KIK Consulting

Loyalty program best practices

  • If you don’t really need the information, don’t collect it
  • Respect the data and respect your customers
  • Be aware that loyalty programs are on the radar of regulators right now and they are looking to make examples
  • Be aware of the new privacy laws coming online surrounding this issue
  • Understand the roadblocks you need to overcome
  • Legal and audit departments are your friends! Work with them
  • Partner with data governance to be sure you’re doing all the things you need to
  • Be purposeful and intentional with data

Listen to the session audio

  • Privacy

Accountable Executives = Accountable Privacy Programs

A Master Class in Establishing an Effective Privacy Program

DOJ Guidance on the evaluation of corporate compliance programs asks three critical questions:  1) is the program well designed, 2) is it earnestly applied, and 3) does it work in practice? Privacy authorities are increasingly adopting a very similar approach to data privacy program governance, implementation, and results.

As Information Accountability Foundation (IAF) president Barb Lawler noted at the 2022 Summer Spokes Technology Conference (held June 22-23), “We know regulators – from California to Australia, and points in between – are increasingly not just interested in, but requiring organizations to prove that their comprehensive privacy programs are operationally effective, aligned with governance strategies, and accountable.”

This means corporate controls binding on even the most senior executives, needed investment across the organization, and a requirement for real-time performance data.

To explore the core elements and architecture of a demonstrable and accountable privacy program, Lawler and IAF Chief Strategist Martin Abrams hosted Scott Taylor, Chief Privacy Officer of biopharmaceutical giant Merck & Co., for a presentation on Accountable Executives = Accountable Privacy Programs, which focuses on an in-depth case study of Merck’s program. It is a masterclass in establishing privacy program accountability, effectiveness, and demonstrability.

The Privacy Accountability Timeline

“A lot of folks think accountability came into fashion within the last five or six years,” says Lawler, “and that really is not the case at all.”

Timeline for Accountability as a Privacy Governing Principle

“Looking back we can see that accountability as a governing principle for privacy and data protection dates to 1980 and the OECD guidelines,” notes Lawler. “And its first representation in national legislation was in Canada under PIPEDA in 2000 as a core principle organizations must follow. Then the APEC Privacy Guidelines of 2003, and so on, through the GDPR, where accountability is interwoven throughout.”

IAF’s Abrams and Merck’s Taylor worked together on “the accountability project” (at the time Taylor was CPO of Hewlett-Packard), which was followed by Abrams’ work on the Essential Elements of Accountability (2009-2012): “A multi-stakeholder effort that brought together business, regulators, policymakers, academics, and advocates, defining what it meant to be ‘accountable’ and ‘responsible,’ and those elements necessary to actually demonstrate accountability,” says Lawler.

Elements of Demonstrable Privacy Accountability

“At the end of 2008, accountability had very little definition and that was particularly important for cross-border data transfers,” reminds IAF’s Abrams.

Taylor and I were actually meeting with a group of data protection authorities in Europe, led by the Irish Commissioner, and we said, ‘what if we had a global dialogue which really put some definition to what it means to be accountable when one is transferring data?’

The five essential elements came out of this process.

—Martin Abrams, IAF

What are those elements of accountability that demonstrate accountability? Lawler delineates:

  1. Organizational commitment (at the highest level) to fair processing, demonstrable accountability, and the adoption of internal policies consistent with external criteria and established fair processing principles.
  2. Mechanisms to put fair processing policies into effect, including risk-based adverse impact assessments, tools, training, and education.
    “Not check-the-box kinds of activities,” avers Lawler, but integrated, and supported by,
  3. Internal review processes that assess higher risk FIPAs and the overall fair processing program.
  4. Individual and organizational demonstrability and the mechanisms for individual participation “that are framed, defined, standardized, and can literally be shown.” “Think about some of the documentation requirements we’ve seen more recently in GDPR,” notes Lawler. And finally,
  5. The means for remediation and external enforcement.

These are the metrics an organization can use to describe for the regulator why they should think of them as a responsible and answerable organization, says Abrams.

Merck case study: the strategic framework

You could argue that the ‘what’ is really the same for all of us, but how we implement is very contextual to different companies and industries.

So, this is just one example. It’s not right or wrong. It’s not better or worse. It’s just one example of how we’ve interpreted [these principles] and tried to build them into an internal program at Merck.

—Scott Taylor, Merck & Co.

“This strategic privacy framework is a reflection of what we were hearing from the regulators at the time, in terms of their high-level expectations of an accountable organization,” relates Taylor.

Any good program that’s accountable is going to have some type of oversight. The expectation is that all parts of the company that impact personally identifiable information will come together in shared decision-making that looks at both risks and opportunities.

You’re always balancing tensions between risks and benefits.

—Scott Taylor, Merck & Co.

Strategic Privacy Framework

Below that oversight layer are the three pillars that make up the traditional privacy program:

  1. Commitment: The policies and programs need to align with external expectations (e.g., regulatory and consumer expectations) and be translated so that management understands them fully and can commit to transparency and accountability. “But I’ve always said that the ‘commitment pillar’ is nothing more than words if there isn’t something to put it into effect.”
  2. Implementation: These are the many different mechanisms that ensure the policies and commitments put in place are understandable to employees, and that their effectiveness is measurable from both a compliance and a business standpoint. “But implementation is a bit of a waste if you don’t have a way to validate that it’s actually turned out the way you expected.”
  3. Validation: More than just data indicating the commitment was correctly translated into action, it provides some of the best information in terms of elucidating any gaps you might have so you can continuously improve the program.

These three mechanisms, supported by the overarching governance, opines Taylor, form the foundation of demonstrability – to both internal and external stakeholders – of the organization’s privacy program commitment and accountability.

“As simple as it may seem, everything anchors back to it,” says Taylor.

Merck case study: implementation

Accountability starts with accountable people…and for things to be truly accountable, people have to be measured on their success in upholding their piece of accountability.

—Scott Taylor, Merck & Co.

Accountability at Merck begins with the Corporate Compliance Objectives, relates Taylor. The set of objectives senior executives are measured against is very specific in terms of what and how. It is done on an annual basis, and it impacts compensation.

Importantly, cautions Taylor, “If you’re going to have a high-level objective that could impact people’s compensation, then it needs to be structured very well.”

The “binding mechanism” is the Merck Privacy Function Deployment Standard: the standard operating procedures (SOPs) that set out ten (10) elements for which senior executives are accountable. These SOPs detail what, how, and crucially, the tools and resources – including the assignment of Privacy Stewards – to support these efforts.

Supporting, and ensuring privacy is effective across, all the businesses, Merck has 209 Privacy Stewards around the world, each of whom undergoes self-assessments against the 10 elements. The stewards spend anywhere from 25% to 100% of their time supporting privacy “in a very specific way.” Taylor details that Merck “provides their privacy stewards with very specific implementation standards and processes.”

Privacy Program Flowchart

The privacy implementation network at Merck (at a high level) comprises:

  • The Governance Body (comprising senior representatives appointed by the highest levels at Merck)
  • The Central Privacy Office, which oversees all aspects of the framework strategy and end-to-end management, supported and held accountable by
  • Critical Partners such as Procurement, Legal, Internal Audit, and Privacy Stewards

All act in concert in a shared accountability, and the Central Privacy Office maintains a continuous bi-directional dialogue with each. This forms the people strategy, says Taylor. And everyone “operates off the same song sheet. No matter who you go to, you’re going to get the same answer.”

Finally, a program such as this cannot be done manually. It requires well-developed standards, policies, and procedures, the control sets, and crucially, the technology to support it.

It is complex, admits Taylor, and to manage that complexity Merck utilizes assistive technology “in everything we do.” So, “pretty much every requirement has a tool and an underlying process to support it. We’re trying to take the manual out of the process and facilitate through workflows and technologies.”

As Taylor emphasizes, accountability is not “words on paper or pretty slides.” Rather it is a carefully architected program requiring the commitment of people, well-designed process, robust toolsets, and the binding mechanisms that transform intent into earnest application that works in practice. As Taylor notes, it is “complex,” but the equation is simple:  Accountable People (+ the processes, tools, and binding mechanisms to support them) = Accountable Privacy. The results, of which there are many, include:

  • Effective governance and oversight
  • Greater visibility into processes
  • Enhanced analytics
  • Data-based decision making
  • Solid metrics and KPIs
  • Enhanced risk identification and mitigation capabilities
  • Improved turn-around times (e.g., PIA, DSARs, ROPA)
  • Improved third-party due diligence
  • Better role-based privacy training
  • Ability to implement timely and continuous improvement, and
  • Transformation from reactive to proactive data privacy operations

The Merck case study truly is a masterclass in establishing privacy program accountability and demonstrability and what a mature program can be. It should be required viewing by all privacy professionals. Don’t wait. Access it here now.

Listen to the session audio

  • Marketing
  • Regulations

Evolution of Consent and Preference Management

The U.S. is really moving away from just that little cookie banner at the bottom to trying to think through all of the different choices you have to effectuate consent.

It’s raising very complex user experience questions.

—Justin Antonipillai, WireWheel

The granularity of evolving consent requirements, differences in definition and requirements across state laws, the added complexities of managing consent across multiple channels, and other factors have certainly placed a heavy burden on the adtech industry, publishers, and brands.

Now, increasing attention is being paid regarding the burden consent and preference management is placing on consumers and the deleterious impact to the user experience.

To discuss the evolution of consent and preference management, how we got here, and where it is going, WireWheel Founder and CEO Justin Antonipillai moderated a discussion at the 2022 Summer Spokes Technology Conference (held June 22-23).

Joining Justin to discuss Consent and Preference Management Across the Globe were Dona Fraser, BBB National Programs Senior VP, Privacy Initiatives, and Ruth Boardman, Co-Head of the Privacy Practice at Bird & Bird. Boardman is currently on the board of directors of the IAPP and a member of the UK government’s Expert Council advising it on data transfers.

The evolution of request for consent

The pre-GDPR Cookie Banner was “an overlay and just an invitation to click OK. Very unobtrusive,” begins Boardman:

The Drum's Privacy Policy

“But they are changing. They are getting bigger and giving more choice. This banner [illustrated below] has a choice of accepting all cookies or accepting only essential cookies.”

Cookie Consent Management

“This next example is a good illustration of an approach which came in with GDPR, but which is increasingly being challenged:

“The idea here is that you have a brief overlay on the homepage. Then, if you click through, you bring up the more detailed information where there’s a list of the particular purposes and third parties.

“The choice is to ‘accept’ or ‘manage cookies.’ To say yes to everything or to go into more options (including saying no).” And as Boardman observes, the use of color and the complicated process to exercise more control nudges the user towards accepting everything (what could be called a dark pattern).

While quite common when GDPR became applicable in 2018, it is increasingly being challenged.

One requirement of the GDPR is that it should be as easy to withhold or withdraw consent [Article 7] as it is to give consent. Pressure from privacy activists and from data protection authorities is that this kind of user interface – requiring multiple steps to exercise choice – is arguably unfair, because you’re playing on the subconscious to nudge into accepting.

—Ruth Boardman, Bird & Bird

How these changes played out can be seen in the before and after illustrated below. You can see that Google has moved off ‘agree or customize’ to ‘accept or reject all.’ Notably, the options “are mutually positioned, the same color, and the same size,” notes Boardman. You also have ‘more options’ to exercise more sophisticated control.

The drivers of consent evolution

The evolution of consent management has been a combination of a number of factors:

  1. Most importantly, it’s the law. “It’s a combination of legislation and the ePrivacy Directive,” which says that when using cookies or cookie-equivalent technologies – in fact, whenever information is stored or retrieved – you need consent unless it is for essential purposes. There is also a requirement for consent, actually dating back to 2011, reminds Boardman, if you’re doing (in broad terms) cross-site targeting. However, what consent means was altered with the GDPR, and “that’s what’s driving this evolution.”

  2. This legislation has been coupled with regulatory guidance from supervisory authorities including the ICO (UK), CNIL (France), DSK (Germany), AEPD (Spain), and others. All “requiring much more transparency and much more granular user control.”
  3. There have also been a series of cases – some going to the CJEU (e.g., Planet 49) – as well as a series of complaints by the “lobby group” noyb, founded by Max Schrems, that have been really influential.
  4. Industry guidelines, in particular the IAB transparency and consent framework (TCF) developed by the adtech industry designed to allow adtech participants to prove they meet GDPR obligations by demonstrating consent.

Why cookie banners look the way they do

The evolving appearance of cookie overlays is a function of the detailed GDPR consent requirements, says Boardman. Namely:

  • Consent must be specific and informed. “The individual needs to know the particular purposes for which they are giving consent at a detailed level,” such as distinguishing between cookies for analytics or targeted advertising.

Boardman notes that “the TCF goes even further and breaks it down into consent for targeting to display content versus targeting to customize ads versus consent in order to carry out measurements or attribution purposes, for example.”

  • The identity of every party relying on consent must be specified. This is why there are multiple screens to reference your partners and linking to a list that typically includes hundreds of parties.
  • The “consent has to be demonstrable and unambiguous” and requires “clear affirmative action.” This is a key driver of the move away from the simple banner reading “by continuing to use this site…”, which infers consent but does not provide demonstrable proof. And lastly,
  • Consent has to be freely given and revokable without detriment specific to different processing operations; service cannot be dependent on consent; it must be as easy to withdraw as it is to give consent; and it must be separate from other terms.

“Revokable without detriment impacts the ability to have cookie and pay walls,” says Boardman. “There are currently cases pending and on their way to the Court of Justice looking at, if you try to have a paywall, how much you can charge per month per user before this starts to be a detriment.”

Tough on adtech, tough on consumers too

This has an implication for user experience that can be equally burdensome. There is a burden by not having granular choice, but having granular choice also is a burden to the data subject because the consumer has to look at a lot of information.

Breaking down individual choices and processing in that granular way means that consumers must interact multiple times before they get to what they want to do.

—Justin Antonipillai, WireWheel

“It does impose a burden on the user. But my experience has been that when organizations try to raise that argument… and ask for consent in a lighter touch global way…it doesn’t get a very sympathetic hearing,” opines Boardman. “The response is ‘maybe you shouldn’t do as much intrusive processing’…and to push the challenge back to industry.”

The proposals recently published in the UK pick up on this, says Boardman, and “as a first step, propose that you don’t need to ask for consent for analytics cookies, but this is coupled with the requirement that consent won’t be taken out unless and until various well-developed technologies allow users to have that degree of control.”

“The difficulty with the current approach is that it has clearly been designed to meet the obligations to prove consent in a way which is very granular,” and it is clearly designed for this purpose and not the user. “That’s the challenge.”

“There seem to be a fair number of assumptions that consumers understand all of this,” opines Fraser. “So, we’re putting all these choices in front of them presuming they know what any and all of this actually means.”

For me, if it’s a choice about having advertising targeted to me that may actually be a distraction…it’s why I’m on the site in the first place. People process things very differently.

Even for those of us who understand, it is overwhelming sometimes, to the point of ‘did I just opt out, or what did I just opt in to?’ And more importantly, how do I know my choices are even being honored?

—Dona Fraser, BBB National Programs

WireWheel Consents and Preferences Flowchart

The consent and preference infrastructure

Antonipillai proffers that you have to think about technology that allows you to bring in consent and preference signals from multiple channels: not just web or mobile app, but connected TVs, cars, and IoT devices.

This means having a way that you can look at a single universal consent and preference solution.

And you must not only capture those consent signals but prove it and have the record keeping behind it. One benefit from the consumer experience perspective is that by unifying the signal, you gain the ability to move beyond one channel, avoid having to capture it over and over again, and begin alleviating the burden on consumers.

But it takes more than just a cookie tool, it takes a central platform to actually look at the choices across your channels and brands.
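A toy sketch of that unification step, assuming a most-recent-signal-wins rule (an assumption for illustration, not WireWheel’s actual logic), might look like this:

```python
from datetime import datetime

# (user, purpose, granted, channel, when) -- hypothetical signals from
# different channels for the same user and purpose.
signals = [
    ("u123", "targeted_ads", True,  "web",          datetime(2022, 6, 1)),
    ("u123", "targeted_ads", False, "mobile_app",   datetime(2022, 6, 20)),
    ("u123", "targeted_ads", True,  "connected_tv", datetime(2022, 5, 15)),
]

unified: dict[tuple[str, str], tuple[bool, str, datetime]] = {}
for user, purpose, granted, channel, when in signals:
    key = (user, purpose)
    # The most recent signal across all channels becomes authoritative,
    # and the full signal list above doubles as the audit record.
    if key not in unified or when > unified[key][2]:
        unified[key] = (granted, channel, when)

for (user, purpose), (granted, channel, when) in unified.items():
    print(f"{user}/{purpose}: granted={granted} "
          f"(latest signal via {channel} on {when:%Y-%m-%d})")
```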

If the notion is that consent can lead to better customer data information, isn’t that what companies want so they can build that relationship? Build consumer trust?

But, having that first-party user data – and being able to use it to the best of your ability to build that relationship – also means knowing you have a greater responsibility with that data.

—Dona Fraser, BBB National Programs

It’s still about trust

Fraser notes that most of the companies BBB National Programs deals with are international companies that are trying to create a streamlined process, not just for their users, but for their internal backend systems as well.

If they’re trying to create one website, one mobile app, that’s doing it all everywhere, knowing that they have to comply with a myriad of laws, it’s a huge challenge and a burden. But that said, your organization’s commitment to privacy and data ethics is the larger question.

If your organization is not first committed to dealing with this on a day-to-day ethics level with transparency, the consent management process isn’t going to work. It’s not going to have the veracity that users need in order to share their data willingly.

The challenge that we are still going to see is explaining to consumers why they’re opting in.

— Dona Fraser, BBB National Programs

“The fact that you just want to browse a website and are faced with these questions and procedures can be an overwhelming experience,” continues Fraser, and “technology may offer a way for us to streamline this, but state laws are going to force our hand.” “The problem is the cost of doing business,” she says.

“BBB National Programs tends to work with small to medium sized companies, and they don’t necessarily have the resources for dealing with this. They struggle to go beyond just checking the compliance box and look to manage customer relations in another way, but I don’t think companies can separate that anymore.”

Listen to the session audio

  • Privacy Law Update

Privacy Law Update: August 8, 2022

Stay up to date with this weekly release covering key developments on data privacy laws, technology, and other hot privacy topics!


Newsworthy Updates

CJEU Rules on Interpretation of EU GDPR Special Categories of Data

The Court of Justice of the European Union rendered a decision clarifying how indirect disclosure of sexual orientation data is protected under Article 9 of the EU General Data Protection Regulation. The court explained such data disclosure falls under the special categories of personal data in Article 9 after consulting Article 4(15) provisions for “data concerning health.” TechCrunch reports on how the decision could have wider implications across a variety of online platforms.

How CPOs Can Protect Medical Data Privacy in a Post-Dobbs America

When the US Supreme Court overturned the landmark 1973 Roe v. Wade decision last month with their decision on Dobbs v. Jackson Women’s Health Organization, it immediately raised the stakes on medical data privacy for individuals and their employers. It also increased the importance of protecting medical data privacy for a wide range of healthcare-related businesses, including: insurers, healthcare providers, and the makers of fitness trackers and wellness apps – and especially fertility tracking apps.

Meta Repeats Why It May Be Forced to Pull Facebook From EU

Meta Platforms Inc. reiterated its warning that it may have no choice but to pull its popular Facebook and Instagram services from the European Union if a new transatlantic data transfer pact doesn’t materialize.   Meta could face an imminent data flow ban from Ireland’s data protection watchdog, which oversees a number of Silicon Valley tech giants based in the country, in a decision that risks impeding transatlantic data flows. The Irish Data Protection Commission could issue its decision on a possible ban of EU-US data transfers under so-called standard contractual clauses in the next three months, Meta said in a regulatory filing.

India Nixes Privacy Legislation

India’s government on Wednesday withdrew a data protection and privacy bill which was first proposed in 2019 and had alarmed big technology companies such as Facebook and Google, announcing it was working on a new comprehensive law.

Privacy Legislation

Future proof your privacy program with WireWheel’s Trust Access and Consent Center and WireWheel’s Privacy Operations Manager.

Request Demo