
Privacy Law Update: October 3, 2022

Stay up to date with this weekly release covering key developments on data privacy laws, technology, and other hot privacy topics!


Newsworthy Updates

White House Executive Order On Trans-Atlantic Data Privacy Framework Imminent

U.S. President Joe Biden is expected to publish an executive order concerning a new agreement on EU-U.S. data flows as early as Oct. 3, Politico reports. According to individuals involved in negotiations, the order will cover new legal protections over personal data access and use by U.S. national security entities. Principles for necessity and proportionality in relation to government surveillance activities are included in the order. Once the order is published, the European Commission will begin a ratification process that could take as long as six months to complete.

Why Closer Collaboration Between CPOs and CISOs Benefits Everyone

If we’re to be more proactive in identifying and preventing privacy and security risks, CPOs and CISOs must work together now more than ever. Security teams can’t protect personally identifiable information (PII) like names, Social Security numbers, home addresses, phone numbers, and personal email addresses if they don’t understand what the information is and where it resides; and privacy teams can’t exist in a company without the security controls in place to protect PII.

California Passes Stringent Kids’ Privacy Rules

Continuing its push as the nation’s first-mover on privacy, California has passed a bill that will require potentially significant new privacy commitments from online services that are “likely to be accessed” by children under 18. Covered companies have until July 2024, when the law takes effect, to assess their practices and come into compliance. In addition, implementing regulations due in January 2024 will give specifics on compliance.

CPPA Board Chair Doubles Down On Proposed American Data Privacy And Protection Act Opposition

In an op-ed for The San Francisco Chronicle, California Privacy Protection Agency Board Chair Jennifer Urban reiterated the agency’s position on how the proposed American Data Privacy and Protection Act would “undermine” Californians’ privacy rights and businesses’ “ability to confidently invest in more privacy-protective practices.” Urban said companies “may be understandably confused about how to invest if Congress overturns this existing guidance” under the California Consumer Privacy Act. She also noted how federal preemption would discontinue states’ ability to “experiment more nimbly” with legislation and react to emerging trends.

Data Privacy Can Give Businesses A Competitive Advantage

Data privacy isn’t just about compliance – it’s turning into a marketing and operational advantage for many businesses. Complying with the General Data Protection Regulation (GDPR) is a challenge, but companies that stay compliant are forging more trusting customer relationships, which they fully expect will deepen loyalty and drive up the bottom line. Strong data privacy opens up the opportunity for strong advantage over the competition, such as improved customer loyalty and more efficient operations. The negative headlines around GDPR — such as Amazon‘s fine earlier this year, the largest of its kind issued to date — can encourage businesses to see compliance as a burden. The truth is, it can be an opportunity to win and retain customers if you can turn respect for consent and protection of privacy into competitive differentiators.

Privacy Legislation

California: On Friday, September 23, the California Privacy Protection Agency held a board meeting to discuss various administrative and rulemaking topics. As expected, there was no announcement on delaying either the CPRA’s enforcement or effective dates; however, Board Member Le suggested that (1) the Agency could request from the legislature the ability to provide more direct guidance to businesses (without running afoul of restrictions on ‘underground rulemaking’), or (2) the Agency could promulgate a regulation expressly recognizing that a delay in finalizing the regulations is a “factor that the Agency may consider” when deciding whether to initiate an enforcement action or offer an opportunity to cure. [Note that the California legislature is currently out of session]

There was also significant discussion of the rulemaking process, particularly the procedural complications and hurdles that will be raised in the interaction of both the California APA and the Bagley-Keene Open Meeting Act. Executive Director Soltani urged the board to give a strong signal on the timeframe for meetings to advance the draft regulations, mentioning October and November (suggesting to us that the Agency may still hope to finalize initial draft regulations by end of year). Soltani further stated that staff is “burning the candle at both ends” working on the rules and that “there will likely be quite a number of changes [to the draft regs] in response to comments.”

The CPPA has also posted the public comments that it received in response to its initial draft implementing regulations. There are 102 total comments spanning well over 1,000 pages.

Michigan: On Tuesday, September 27, Senator Bayer (D) and 8 Democratic co-sponsors introduced SB 1182, the “Personal Data Privacy Act.” While this comprehensive privacy bill generally follows the ‘Virginia model’, it includes a data broker registry and provides for a private right of action that, similar to ADPPA, would require prior written notice to the party alleged to be in violation. While the bill is unlikely to move this late in the session of a Republican-controlled chamber, we are interested to see whether it represents a new trend of state privacy proposals incorporating elements from ADPPA.

New York State: On Friday, September 23, Senator Gounardes (D) introduced S9563, “The New York Child Data Privacy and Protection Act”. While the Act contains some similarities to the recently enacted California Age-Appropriate Design Code, it would go much further in numerous respects, including:

  • Not requiring age estimation, but instead applying to all “online products” targeted towards (accessible to and used by) child users.
  • Requiring that an expansive risk assessment for any new online product (including services, features, or platforms) be submitted to, and approved by, the state AG before the product can be made available to consumers.
  • Empowering a new AG Bureau to ban autoplay videos, in-app purchases, push notifications, prompts, or other features for particular products that it chooses.
  • Requiring online products to prioritize civil and criminal subpoenas and criminal warrants when a child user has been a victim of a crime.
  • Creating a private right of action.

Future-proof your privacy program with WireWheel’s Trust Access and Consent Center and WireWheel’s Privacy Operations Manager.


CCPA, CPRA, and CPPA: What You Need to Know

In the widely attended webinar CCPA, CPRA, and CPPA: What You Need to Know, WireWheel Chief Privacy Officer Rick Buck offers practical insights into the CPRA and how its (still evolving) requirements will impact your business, from consent management, opt-outs, GPC, and PIAs to employee data, notice requirements, and website real estate.

Rulemaking and Regulations

Draft regulations for the CPRA were issued in July 2022 and public hearings concluded August 25, but commentary and debate remain open, and as such the regulations are not yet final.

Notably, the CPPA has not yet addressed in full detail the rules on cybersecurity auditing, risk assessments, or issues concerning automated decision-making technologies and uses.

The CPRA amendment to the CCPA enhances Californians’ privacy rights and creates the California Privacy Protection Agency (CPPA), which will be responsible for enforcement; contemplates consent in specific use cases; and requires more transparency both in notices and at the point of collection.

Perhaps one of the most impactful provisions of the CPRA on businesses is the requirement to perform “regular” privacy impact assessments (PIAs):

The CPRA directs the Agency to issue regulations requiring businesses “whose processing of consumers’ personal information presents significant risk to consumers’ privacy or security” to 1) perform annual cybersecurity audits; and 2) submit to the Agency regular risk assessments regarding their processing of personal information.

California Privacy Protection Agency (Call for comments, 2021)

As tempting as it may be, framing existing state laws, including California, as “GDPR light” would be a misstatement. But it can be helpful to use GDPR as a basis of comparison when thinking about the CPRA regulatory framework.

For example, borrowing from the GDPR, California has addressed the concepts of data minimization and data appropriateness: collection and use of data that is “reasonably necessary and proportionate” to the processing activities for which it is collected.

These concepts appear in the regulations as:

  • Consent in specific use case requirements
  • Notice and what proper notice entails
  • PIA requirements for risky data processing activities by your company and your service providers, with those third parties contractually obligated to comply with the law.

Who needs to comply?

The CPRA applies to any business that processes the personal information of consumers in California, and, beginning January 1, 2023, “consumers” will include employees and B2B contacts as well.

However, personal information has a pretty broad definition. It contains all the usual suspects as well as less obvious categories:

  • Name
  • Home address
  • Email address
  • Date of birth
  • Passport or Social Security number
  • Biometric data
  • Geolocation and other location data
  • Records of products purchased
  • Internet browsing history
  • Digital fingerprints, and
  • Inferences from other data that can be used to create preference and characteristic profiles

Moreover, the inclusion of “inferences” significantly complicates the definition of personal data itself.

Personal information is really complicated, because if it can be used in any way to be linked up with other data to re-identify an individual, it is considered personal information.

—Rick Buck, WireWheel CPO

Enforcement

The CCPA is enforced by the California Office of the Attorney General while the CPRA will be enforced by the new California Privacy Protection Agency (CPPA) with full investigative, enforcement, and rulemaking authority.

One interesting difference between the CPRA and the CCPA concerns the 30-day cure period. Under the CCPA, if you are approached by the regulator for a violation, you have 30 days to remedy that violation to avoid penalty. The CPRA removes this latitude effective January 1, 2025.

Also of particular note: counter to widespread expectation, the exemption for employee/HR data was not extended. Consequently, beginning January 1, 2023, employee/HR data will be considered consumer data and fall within the scope of the CPRA.

It is anticipated that California will enforce in a vigorous and highly visible manner.

Consumer Rights

Under the CCPA, rights included access, consent, equality, deletion, and portability. The CPRA enumerates new rights, including the right to correct; to opt out of automated decision making; to access information about automated decision making; and to restrict use of sensitive PI.

Rights Under CCPA

  • Access
  • Consent
  • Equality
  • Deletion
  • Portability

New Rights Under CPRA

  • Correction
  • Opt-out of automated decision making
  • Access to info about automated decision making1
  • Restrict use of sensitive PI

Consent under CPRA

While GDPR has a very stringent definition of opt-in – it needs to be freely given, specific, informed, and unambiguous – the CPRA provides a looser interpretation. Another difference: the GDPR requires opt-in for all collection of data, while the CPRA only requires opt-in for specific types of data.

The GDPR and CPRA also differ in their treatment of implied consent versus express consent: the GDPR does not recognize implied consent. In Europe, for example, a pre-checked box would constitute only implied consent, and is therefore invalid.

However, the CPRA is in no way an opt-in law, though it does contemplate opt-in for specific use cases. Note that users have the ability to opt out of collection and use of even previously opted-in data (and vice versa).

*If you are selling or sharing the personal information of minors, you need consent of their parent or guardian.

What CPRA regulations say about Opt-Out

Right now, under the CCPA, in addition to the ‘do not sell or share my personal information’ link, the new regulations require that a link to ‘opt out of sensitive information’ be presented. Much like the ‘do not sell or share my personal information’ link, it needs to be conspicuous, and it needs to point to a web page that gives complete information on how consumers can make their choices.

Opt-out signals must have a frictionless option, and one example of a frictionless option is the use and acceptance of Global Privacy Control (GPC).
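To illustrate, here is a minimal TypeScript sketch of how a site might detect and honor the GPC signal, which supporting browsers expose as the navigator.globalPrivacyControl property and send as the Sec-GPC: 1 request header; the opt-out function is a hypothetical placeholder for whatever mechanism your site actually uses:

```typescript
// Hypothetical function representing whatever mechanism your site
// uses to record an opt-out of sale/sharing for this consumer.
declare function optOutOfSaleAndSharing(reason: string): void;

// Client side: check the GPC property exposed by supporting browsers.
function honorGpcSignal(): void {
  // Cast because globalPrivacyControl is not yet in all TS DOM typings.
  const gpc = (navigator as Navigator & { globalPrivacyControl?: boolean })
    .globalPrivacyControl;
  if (gpc === true) {
    // Treat the signal as a valid opt-out request with no further
    // action required from the consumer ("frictionless").
    optOutOfSaleAndSharing("GPC signal received");
  }
}

// Server side: the same signal arrives as the `Sec-GPC` request header
// with the value "1" (header names shown lowercased).
function isGpcRequest(headers: Record<string, string | undefined>): boolean {
  return headers["sec-gpc"] === "1";
}
```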

The regulation goes on to say, ‘look, we understand that we’re asking you to put both a do not sell or share my personal information button and an opt out of sensitive information button there.

We understand that there are real estate issues on your web pages. To make that easier, you can combine the links. And the link must say “Your Privacy Choices” or “Your California Privacy Choices” to effectuate both the do not sell and the opt out of sensitive information.

—Rick Buck, WireWheel CPO

 

The link must also direct consumers to a web page with detailed information that they can very easily understand.

If my organization has implemented GDPR for EU operations, then the rights to opt out and restrict should already be accounted for within the processing, right?

Where the GDPR has fully comprehensive opt-out rights, in that specific instance, if you are fully complying with your subject access rights under GDPR, you’re likely to be compliant under CPRA. But I would be very reluctant to tell you that complying with GDPR means that you are complying with CPRA.

Don’t rest on your GDPR compliance in any other jurisdictions where you need to be compliant.

—Rick Buck, WireWheel CPO

CPRA Privacy Policy and Notice Requirements

The CPRA states your privacy notice needs to tell California consumers how they can exercise their rights. If you sell personal information, then there must be an information link.

You need to list the categories of PI collected or sold in the past twelve months. You need to disclose the types and uses of sensitive PI, their sources, categories, and the purpose for each. Furthermore, your policy needs to have an effective date and be updated annually.

You must also provide notice concerning PI being used for purposes beyond the use originally disclosed or for purposes not authorized under CPRA.

The privacy notice needs to provide information on all new CPRA rights – including explanation of how opt-out signals are processed – and do so in easy to comprehend language.

Companies should really take a hard look – with the privacy, marketing, and legal teams – at what they’re doing with data. Is that data considered sensitive? (Remembering that sensitive data is defined differently across the states.) Are your data processing activities considered sensitive? If so, are those data processing activities considered the sale of data?

—Rick Buck, WireWheel CPO

If you can definitively say with a very high degree of confidence that you are not selling or sharing data and not processing sensitive data, then you should very clearly disclose that.

CPRA on Notice at Collection

As noted, the categories and purpose of the PI, and whether it’s sold or shared, must be disclosed under the CPRA. The regulation goes on to be a bit more specific, stating that if data is shared or sold, then notice is required at or before collection. Importantly, you must also disclose how long each category of data is retained.

Furthermore, if you allow third parties to collect consumer data, you must list all those third parties who must also disclose on their homepage.

Privacy Impact Assessments under CPRA

What we know now is that PIAs are required for any data processing that creates significant privacy or security risk to a consumer, such as:

  • Sensitive data
  • Marketing to minors
  • Targeted advertising, and
  • Selling/sharing PI

PIAs will now also be required for third parties. Not only are you accountable for complying with the law, but any of your vendors that process data on your behalf – even if they don’t monetize the data – are also responsible for upholding CPRA compliance and need to cooperate with you in this regard.

Consequently, you not only need appropriate contractual language, you also need to perform proper third-party due diligence.

Perhaps most importantly, under the CPRA, service providers and contractors are prohibited from combining any PI they receive from businesses with PI from other sources, or their own. They can’t aggregate that data or monetize it.

Again, I would not say that CPRA is GDPR light, but I like that the CPRA, and other states that have passed privacy laws, are at least starting to align with the way Europeans are thinking about data:

  1. The requirement that PIAs are done for high-risk processing and include third parties
  2. The concept of data minimization (only using data that is reasonably necessary and proportionate)
  3. Use case limitations
  4. How long data should be kept, and that
  5. Consent and security are important to the privacy story now

But remember, being GDPR compliant doesn’t mean you’re CPRA compliant. But it does mean you’re generally pointed in the right direction. And that’s a good thing.

1 Automated decision-making is when humans are eliminated from the decision-making process. AI and machine learning (ML) technologies and techniques are used, for example, to model pricing and/or the content that consumers might get in an ad. On the employee side, it might be used to sort candidates applying for a job or to identify employees who qualify for promotions. See Privacy Law Impacts to AI and High-Risk Processing for additional insights.


Privacy Law Update: September 19, 2022

Stay up to date with this weekly release covering key developments on data privacy laws, technology, and other hot privacy topics!


Newsworthy Updates

U.S. Chamber Testimony at FTC Data Privacy Open Forum

Washington, D.C. – Jordan Crenshaw, Vice President, U.S. Chamber Technology Engagement Center, testified before the Federal Trade Commission’s (FTC) data privacy rulemaking open forum.

FTC Privacy Rulemaking Forum Brings Industry, Advocate Views To The Forefront

The U.S. Federal Trade Commission said it would follow the letter of the law when it announced its Advance Notice of Proposed Rulemaking concerning commercial surveillance and lax data security in August, which meant a robust stakeholder consultation to come. That process began in earnest with an exchange of perspectives from relevant parties at the FTC’s virtual public forum on Sept. 8.

How Does Data Governance Affect Data Security And Privacy?

While it’s important to implement processes and procedures that safeguard data security and privacy, you can also focus on more strategic data governance goals.

Privacy Legislation

Future-proof your privacy program with WireWheel’s Trust Access and Consent Center and WireWheel’s Privacy Operations Manager.


California Fails to Extend Employee & B2B Data Exemptions

The California Legislature adjourned its 2022 legislative session on August 31, 2022, failing to pass legislation that would have extended exemptions under the CCPA applicable to personal information in employee, human resources, and business-to-business contexts.

The exemptions expire under the CPRA effective January 1, 2023. They apply to the personal information of job applicants, employees, owners, directors, officers, and independent contractors in the context of an individual’s employment or application for employment, and to personal information reflecting written and verbal communications where a consumer is acting in a business-to-business commercial transaction. They also apply to personal information collected by a business for emergency contact information and personal information necessary for a business to retain and administer employee benefits, provided the information is used only for those purposes.

Covered businesses will need to take a close look at their privacy programs to ensure they comply with CPRA, particularly as it relates to the removal of these exemptions.

  • Know where your data is: Map and inventory data across all systems, assets and processing activities that collect and process employee and business-to-business personal information.
  • Update your Data Subject Access Request (DSAR) portal: Additional functionality and workflows will need to be created to process workforce subject rights requests.
  • Be transparent: CPRA takes an explicit position on what needs to be included in your privacy notice, including the categories of personal information a business collects, the purposes for which information is used, and what privacy rights consumers have. Remember, under CPRA your workforce members are considered ‘consumers’.
  • Understand if you sell/share or process sensitive personal information: If you do, disclose it in your privacy notice and provide all consumers, including employees and workforce members, a clear and conspicuous way to opt out or limit the use of that information.
  • Update service provider and contractor agreements: CPRA requires that data processing agreements are in place with all service providers, contractors, and other third parties that process covered employee or B2B personal information.

Processing employee access requests will likely present new challenges.  Personal information about other employees may be exposed, requests may be coming from disgruntled employees, the information requested might be related to litigation, and data will need to be redacted.  All of this may cause an undue burden on businesses.

WireWheel offers a complete solution to help manage the requirements of CPRA, including fulfilling employee DSARs, with an integration with Microsoft Priva and connectors to more than 500 systems, including HR systems such as Workday and Oracle. Contact us to learn more.


Understanding Vendor Assessments

What is a Vendor Assessment?

Organizations use a Vendor Assessment to evaluate how a third-party vendor manages the personally identifiable information (PII) the company shares with that organization. The assessment is used to understand whether or not vendors are implementing and maintaining appropriate security and privacy controls.

A Vendor Assessment is typically used as part of an overall program that establishes guidelines to ensure that an organization’s vendors comply with that organization’s required information security policies and procedures. Organizations will seek a security and privacy review of active and potential vendors, and vendors must demonstrate that they have practices in place to securely manage data.

What are the benefits of doing Vendor Assessments?

Third-party vendors are a high-risk area for privacy breaches, posing potential risks around improper usage, sharing, and protection of confidential personal information. Organizations provide third-party providers with personal information in a variety of ways. Some do so intentionally by specifically giving the information to the third party. Others do so in effect by placing the vendor in a situation where they have access to the information. Organizations may also share information unintentionally with a vendor.

Vendor Assessments are used to understand a vendor’s business continuity plans, regulatory compliance, and data security safeguards. After collecting and analyzing survey responses, an organization can determine the level of risk involved in giving vendors or third parties access to that organization’s internal systems and data. Collecting this information from vendors enables an organization to ensure that its vendors are consistently compliant with required security policies and procedures.

How does WireWheel support Vendor Assessments?

Using WireWheel’s Privacy Operations Manager, organizations can conduct multiple vendor assessments. The platform maintains a record of all the assessments completed for the organization. Vendor Assessments can be triggered manually or through automation.

For organizations that have several vendors, automating Vendor Assessments can save time and effort, resulting in better resourcing and reduced costs. One way this can be achieved is by automatically initiating a follow-up assessment based on a response submitted during an internal privacy impact assessment: when someone confirms the involvement of third-party vendors, a vendor assessment can be triggered.

WireWheel also provides organizations the ability to initiate a follow-up assessment within a template based on the vendor data collected in an internal privacy impact assessment. The privacy impact assessment captures the vendor information, and WireWheel uses that data to automatically trigger vendor assessments, including multiple assessments at the same time.

The follow-up vendor assessments are automatically triggered and assigned to the vendor(s) whose details are provided in the privacy impact assessment.
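To make the pattern concrete, here is a hypothetical TypeScript sketch of the kind of trigger logic described above; the types and function names are illustrative assumptions, not WireWheel’s actual API:

```typescript
// Hypothetical data model for a completed PIA response.
interface PiaResponse {
  assessmentId: string;
  involvesThirdPartyVendors: boolean;
  vendors: { name: string; contactEmail: string }[];
}

interface VendorAssessment {
  vendorName: string;
  assignedTo: string;
  templateId: string;
  status: "triggered";
}

// When a PIA confirms third-party involvement, create one follow-up
// vendor assessment per vendor named in the PIA.
function triggerVendorAssessments(
  pia: PiaResponse,
  templateId: string
): VendorAssessment[] {
  if (!pia.involvesThirdPartyVendors) return [];
  return pia.vendors.map((vendor) => ({
    vendorName: vendor.name,
    assignedTo: vendor.contactEmail,
    templateId,
    status: "triggered",
  }));
}
```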

Common Types of Vendor Assessments

WireWheel has designed templates for standard vendor assessments based on requirements from privacy laws including the EU’s GDPR and California’s CCPA/CPRA:

Third Party Assessment – This assessment is designed to conduct a security review of active and potential vendors. Vendors must demonstrate that they have practices in place to securely manage data. This enables the organization to ensure that its vendors are consistently compliant with required security policies and procedures.

Data Seller Vendor Assessment – This assessment is meant to ensure compliance by data sellers who offer their own collected first party data for purchase or aggregate first party data from other companies. This enables an organization to make sure that the data they are selling is going to be used properly by the people they are selling the data to. The questions in this reseller assessment are based on leading industry frameworks and legislation including NIST, ISO27001, IAB TCF, GDPR, California’s CCPA and CPRA, Colorado’s CPA, Virginia’s CDPA and Utah’s UCPA. The focus of this review is to ensure the legal, ethical, and secure use of data for marketing purposes.

High Risk Assessment – This assessment is designed to ensure that sensitive personal information is properly stored and protected. California’s CPRA, Colorado’s CPA, and Virginia’s CDPA require assessments for data processing activities that “present a heightened risk of harm to a consumer”. These include processing personal data for targeted advertising, selling/sharing, profiling, and the use of sensitive data.

Summary

With more and more regulations requiring Vendor Assessments, leveraging a tool like WireWheel’s Privacy Operations Manager can help companies ensure that they are handling personal information properly.


Understanding Privacy Program Reporting

Privacy programs rely on reports to demonstrate compliance and to monitor the status of their program. The WireWheel platform offers several reports for programs to use to monitor Data Subject Access Requests and privacy operations.

Data Subject Access Rights (DSAR) Reporting

The California Consumer Privacy Act (CCPA) requires businesses subject to the regulation to post their consumer request metrics. These reporting obligations, outlined in Section 999.317(g) of the CCPA, apply to any business that is subject to the CCPA and that knows or reasonably should know that it, alone or in combination, buys, receives for the business’s commercial purposes, sells, or shares for commercial purposes the personal information of 10,000,000 or more California residents in a calendar year.

Businesses subject to the reporting obligations must disclose by July 1 of every calendar year, either in the Privacy Policy or elsewhere online with a link in their Privacy Policy, the following metrics for the previous calendar year:

  • The number of requests to know that the business received, complied with in whole or in part, and denied;
  • The number of requests to delete that the business received, complied with in whole or in part, and denied;
  • The number of requests to opt-out that the business received, complied with in whole or in part, and denied;
  • The median or mean number of days within which the business substantively responded to requests to know, requests to delete, and requests to opt-out.
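As a rough illustration of how these metrics might be computed, here is a TypeScript sketch; the record shape is a hypothetical one, not WireWheel’s export format:

```typescript
type RequestType = "know" | "delete" | "opt-out";
type Outcome = "complied" | "denied";

// Hypothetical record of a single consumer request.
interface DsarRecord {
  type: RequestType;
  outcome: Outcome;
  receivedAt: Date;
  respondedAt: Date;
}

interface Metrics {
  received: number;
  complied: number;
  denied: number;
  medianResponseDays: number;
}

function median(values: number[]): number {
  const sorted = [...values].sort((a, b) => a - b);
  const mid = Math.floor(sorted.length / 2);
  return sorted.length % 2 === 0
    ? (sorted[mid - 1] + sorted[mid]) / 2
    : sorted[mid];
}

// Compute the CCPA-style metrics for one request type.
function metricsFor(records: DsarRecord[], type: RequestType): Metrics {
  const ofType = records.filter((r) => r.type === type);
  // Milliseconds per day = 86,400,000.
  const days = ofType.map(
    (r) => (r.respondedAt.getTime() - r.receivedAt.getTime()) / 86_400_000
  );
  return {
    received: ofType.length,
    complied: ofType.filter((r) => r.outcome === "complied").length,
    denied: ofType.filter((r) => r.outcome === "denied").length,
    medianResponseDays: ofType.length ? median(days) : 0,
  };
}
```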

WireWheel – DSAR Reporting

With WireWheel’s Trust Access and Consent Center (TAC), customers can export DSAR metrics including Request Type, Request Status and Due Date.

Users with appropriate permissions can generate reports as follows:

  • A summary of DSAR metrics for the last year in a simple CSV format.
  • Abandoned and Failed DSAR Report – A CSV summary of all Data Subject Access Requests that were begun, but not fully submitted.
  • DSAR Summary Report – A CSV summary of all Data Subject Access Requests that were begun, regardless of whether or not they were fully submitted.

WireWheel – Sample DSAR Report

Privacy Impact Assessment Reporting

Record Of Processing Activities (ROPA)

WireWheel’s Privacy Operations Manager (POM) enables companies to create a Record of Processing Activity or ROPA. A Record of Processing Activities (ROPA) is a record of an organization’s processing activities involving personal data. Some businesses may think of “processing” as being limited to active events, but a ROPA must also cover data that sits on a server or a shelf. A ROPA includes the following information for each processing activity:

  • Names and contact details of the data controller, data processor, data controller’s representative, joint controller, and data protection officer (DPO), if applicable
  • Purposes of the processing of personal data
  • Categories of data subjects and categories of personal data being processed
  • Categories of recipients to whom the personal data has been or will be disclosed
  • Third parties in other countries or international organizations who receive the personal data
  • Retention schedule for each category of personal data
  • General description of technical and organizational security measures related to each processing activity

A completed ROPA lists each processing activity involving personal data and provides detailed information about each of the items listed above.
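As an illustration only, one ROPA entry might be modeled as a data structure like the following; the field names simply mirror the list above and are not a prescribed schema:

```typescript
// One entry in a Record of Processing Activities (illustrative model).
interface RopaEntry {
  processingActivity: string;       // e.g. "Payroll processing"
  controllerContact: string;        // data controller name and contact details
  processorContact?: string;        // data processor, if applicable
  dpoContact?: string;              // data protection officer, if applicable
  purposes: string[];               // purposes of the processing
  dataSubjectCategories: string[];  // e.g. "employees", "customers"
  personalDataCategories: string[]; // e.g. "contact details", "bank details"
  recipientCategories: string[];    // who the data has been or will be disclosed to
  thirdCountryTransfers: string[];  // recipients outside the jurisdiction, if any
  retentionSchedule: string;        // per category of personal data
  securityMeasures: string;         // technical and organizational measures
}
```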

Why is Record Of Processing Activities (ROPA) Required?

In 2018, companies were first introduced to the concept of a ROPA by the General Data Protection Regulation (GDPR). Article 30, on records of processing activities, requires companies to keep a detailed record of all activities related to the processing of personal data, also known as a Record of Processing Activities (ROPA).

Currently, U.S. data privacy laws have not directly adopted a provision comparable to Article 30. However, laws like CCPA require an entity to retain records on consumer requests. The Federal Information Security Management Act (FISMA) contains a data retention requirement that directs government agencies to archive records on categories of data and certain processing activities.

Benefits of ROPAs include:

  • Providing organizations with a close look at their data processes from an enterprise-wide perspective
  • Identifying redundancies by detailing cases of the same types of data being saved and updated in different locations at different times, which can make it impossible to identify which records are the most current, complete, and accurate.
  • Helping organizations identify where the category of the data is located and how it’s being processed thus enabling the organization to respond to data subject requests promptly and accurately.
  • Encouraging strategic thinking about data retention schedules: implementing time limits allows the organization to control “data swell” and better leverage its data as a strategic asset.

Through the process of data discovery, some organizations realize they have been collecting certain categories of personal data that serve no specific purpose; the ROPA can serve to validate that data being acquired actually has business value, thus streamlining data collection.

WireWheel – ROPA

WireWheel provides customers the ability to customize the ROPA and include only the required information from assessments. ROPAs are mapped to a template which is essentially the blueprint of the assessments. The ROPA is designed to give organizations a single source for answers to key questions about the personal data in the organization: what, who, why, where, when, and how.

Other Reports

WireWheel’s Privacy Operations Manager also gives users with the relevant permissions the ability to export reports. These reports are used to understand system processes, changes over a period of time, the efficiency of the assessments, and so on.

The WireWheel platform includes:

  • Assessment Summary Report – A summary of the selected assessments in a simple CSV format.
  • Assessments by Business Process – Download a CSV (Excel compatible) of a tabular listing of all your Assessments grouped by Business Process.
  • User History Over Time – Download a CSV file (Excel compatible) of the number of users by month, segmented by user types.
  • Users by Assessment – Download a CSV (Excel compatible) of all your Assessments segmented by users assigned to those Assessments.

WireWheel – Sample POM Report

Summary

WireWheel’s Trust Access and Consent Center and Privacy Operations Manager give clients the reports they need to measure and track their program.


Privacy Law Update: September 12, 2022

Stay up to date with this weekly release covering key developments on data privacy laws, technology, and other hot privacy topics!


Newsworthy Updates

FTC Privacy Rulemaking Forum Brings Industry, Advocate Views to the Forefront

The U.S. Federal Trade Commission said it would follow the letter of the law when it announced its Advance Notice of Proposed Rulemaking concerning commercial surveillance and lax data security in August, which meant a robust stakeholder consultation to come. That process began in earnest with an exchange of perspectives from relevant parties at the FTC’s virtual public forum Sept. 8.

A View From Brussels: The Latest on UK Data Protection Reform and Recommendations on the AI Act

Liz Truss has succeeded Boris Johnson as the U.K.’s next prime minister. Truss previously served as Trade and as Foreign Affairs Minister in Johnson’s government. Truss has appointed Michelle Donelan as Secretary of State for Digital, Culture, Media and Sport. In this role, Donelan and her ministerial team will oversee U.K. data protection reform initiated by DCMS a year ago. However, its fate is uncertain at the moment as further discussions were postponed.

ICO Publishes Guidance on Privacy Enhancing Technologies

The Information Commissioner’s Office (ICO) has published draft guidance on privacy-enhancing technologies (PETs) to help organizations unlock the potential of data by putting a data protection by design approach into practice. PETs are technologies that can help organizations share and use people’s data responsibly, lawfully, and securely, including by minimizing the amount of data used and by encrypting or anonymising personal information. They are already used by financial organizations when investigating money laundering, for example, and by the healthcare sector to provide better health outcomes and services to the public.

CCPA/CPRA Grace Period for HR and B2B Ends Jan. 1

On Aug. 31, hopes were dashed when the California legislative session ended without enacting Assembly Bill 1102. The bill would have extended grace periods for certain business-to-business and human resources personal information under the California Consumer Privacy Act as amended by the California Privacy Rights Act. CCPA/CPRA will become fully operational on Jan. 1, 2023, for B2B and HR personal information, which will be subject to the same rigorous California privacy regulations as “consumer” personal information.

Privacy Legislation

Future-proof your privacy program with WireWheel’s Trust Access and Consent Center and WireWheel’s Privacy Operations Manager.


6 Tips to Draft And Incorporate a Privacy Policy For Your Business

If you collect customer personal data, then you need to protect that data. No matter the field you’re in or the size of your business, once customers hand their data over, it’s your responsibility. And how you deal with it affects the trust your customers will have in you.

 

Chart on consumer trust for companies that have their personal data
Image sourced from mckinsey.com

 

Whether you’re gathering basic info such as name and address or more sensitive details like banking or credit card information, you need to be able to disclose what you do with it and demonstrate that you have taken steps to protect it.

The first step to this is creating a privacy policy. That can take different forms according to your business type and the information you store, but at its heart it demonstrates to your customers that their data is safe in your hands.

Wondering where to start? Just as you might use a letter of intent template, think of these tips as a way to build a template for your privacy policy.

What is a privacy policy?

As the term suggests, a privacy policy is a written policy that lets your customers know how you collect their data, what that data is, what it is used for, who it will be shared with, what their rights are, and how you will store and protect it. It should always be included on your website and at the point of collection in apps, and may be labeled as a privacy notice or appear in a section called simply ‘privacy’.

One thing to note is that there are now many regulations and laws that govern how you protect data, including laws such as HIPAA in the US or GDPR in the EU. Make sure you’re aware of any required compliance in any region you operate in, not just where your head office is based.

Do you really need a privacy policy?

a chart showing concern over data privacy in online ad targeting
Image sourced from marketingcharts.com

 

In nearly every case, yes. If you collect any sort of personal data from your customers, then you need an easy-to-understand privacy policy, clearly displayed so that customers can find it. Ensure that it is updated regularly and rigorously enforced.

The reality is that many people may not read your privacy policy in full. However, knowing you have one and knowing there are privacy laws and regulations governing how you handle data is usually enough to satisfy most people. That doesn’t mean you should ‘skimp’ on details when publishing such a policy; it should be clearly written and in language people can understand. Remember, if you say it in your privacy policy you must do it in practice.

6 top tips to drafting a privacy policy

While some companies’ privacy policies may differ slightly and contain specialized clauses, there are general commonalities that most companies should work from. You wouldn’t draft a shareholder agreement without using something like a PandaDoc shareholders’ agreement template, so don’t start from scratch here either. Instead, make sure to incorporate the following:

1. Introduction

This is a section every privacy policy should have. You want to inform customers who you are as a company and why you need this privacy policy. It should be a fairly short section but should include the following info:

  • Your company name and contact information
  • Any laws and regulations (and the applicable regions) that your policy complies with
  • Glossary of any main terms used such as ‘personal data’
  • Who the policy applies to, who ‘we’ and ‘you’ refers to, and identification of any third parties that are included in your policy.
  • When the policy was last updated

2. Data information

This is perhaps the most crucial part of your privacy policy. It should include:

  • What data you collect
  • How you collect it
  • How you will use it
  • If you share or sell that data
  • How you will store it (for example, do you use data integration software?) and protect it
  • When you may transfer that data to others.
  • How to exercise your privacy rights
  • Data retention practices
  • Where applicable, how data is transferred outside of the EEA
  • Where applicable how children’s data is used
  • How updates to the policy are communicated
  • Where applicable, state-specific language (e.g., CPRA)

Customers will focus on this part as it gives details on how you use their personal data. To avoid confusion, you should split this part into sections that deal with different aspects of how you collect, store, and use data.

Here is a sample privacy policy from Apple:

Example of Apple's privacy policy

2a. Data collection

This section lets the customer know what data you collect and where you collect it from, such as webforms or the checkout process. Some examples of the data you might collect include:

  • Name, email address, and physical address
  • Phone number
  • Age and sex or gender.
  • Nationality and race
  • Login information
  • Financial info such as credit card details
  • IP address and browser or device type

What you also have to consider is that some of the data you collect may come from third-party services such as Google Analytics. When that is the case, you should advise customers and direct them to that third party’s own privacy policies as well.

2b. Data use

Your next section should inform people as to how you will use their data. This may be a requirement under certain laws but it is something you should be telling people anyway. Of course, there are many different ways you might use a customer’s info but some of the most common are:

  • Security and identity verification
  • To target advertising according to tastes and previous behavior
  • Sending relevant and personalized marketing
  • Customer service and/or tech support reasons
  • Delivery of products and/or services

2c. Data sharing

You may be sharing some customer data with third parties or partners. If this is the case, then you need to let the customer know who you might share with, how it will be shared, and why you are sharing it. This may have already been covered under the ‘third party’ section of your introduction. You should also advise here when you have to share info with government departments or similar.

2d. Data sales

Thankfully, this is mostly a thing of the past and in most locations, you can’t sell on any customer data. However, some areas – such as California – still allow the selling of such info so you should advise people of this, who it might be sold to and, most importantly, give them the choice of opting out of their data being sold. If you have no intention of selling their data, make that clear.

3. Data retention and deletion

People also want to know how long you plan on keeping their data in your system. So, you should advise them if you have set time limits on data retention or whether there are legal limits on how long you can keep it. You should also include what happens at the end of the process: will their info be completely deleted, or will it be anonymized?

4.  Children

Parents (rightfully!) worry whether data will be collected on their children. The legal definition of children can vary from area to area, so be guided by the relevant legislation. Most businesses will not collect information about children so you should make that clear and have a disclaimer included in your policy.

5. Personal rights

This is another factor that differs from region to region so be sure you are aware of the different laws and regulations that apply where you operate. Let your customers know what rights they have in the area they live in and how they can apply those rights if they want to see what info you hold on them.

If you want them to give up certain rights, for instance, via an NDA, it’s worth looking if you can find a non-disclosure agreement template available for free. That way, you have somewhere to start from.

6. Changes and complaints

Sometimes, changes are unavoidable. New data privacy regulations may be introduced, or existing ones may be updated. Indeed, you may decide to use customer data in a different way. It’s important to include how you will update people in that scenario. It’s also crucial that you highlight your complaints process and include all relevant contact information.

The takeaway


Every business has multiple things to consider, from inventory management development to automation of email marketing. However, no matter what type of business you operate, some form of privacy policy is essential.

The important thing to remember is that you may have to comply with many different laws and regulations according to the territories you operate in. This means that while you may have a general policy on privacy, some parts of that policy have to acknowledge those differences and inform your customers what they are.


Yauhen is the Director of Demand Generation at PandaDoc. He’s been a marketer for 10+ years, and for the last five years, he’s been entirely focused on the electronic signature, proposal, and document management markets, with products like the PandaDoc sponsorship proposal template. Yauhen has experience speaking at niche conferences where he enjoys sharing his expertise with other curious marketers. And in his spare time, he is an avid fisherman and takes nearly 20 fishing trips every year.


PIAs and Reassessments in WireWheel

What is a Privacy Impact Assessment?

A Privacy Impact Assessment (PIA) is an assessment or questionnaire that collects information on how personally identifiable information (PII) is collected, stored, protected, shared and managed.

A PIA is usually designed in a survey format and, at the very minimum, should answer the following questions:

  • What information is collected, and how?
  • Why is the information collected?
  • What is the intended use of the information?
  • Who will have access to the information?
  • With whom will the information be shared?
  • What safeguards are used to protect the information?
  • For how long will the data be retained/stored?
  • How will the data be decommissioned and disposed of?
  • Have Rules of Behavior for administrators of the data been established?

The PIA should be completed, reviewed, and the records should be maintained for reference.

Why are Privacy Impact Assessments needed?

Privacy Impact Assessments are required under several privacy laws passed over the last 20+ years. PIAs are seeing an increase in momentum as privacy legislation has gained traction and the requirements have expanded.

  • The E-Government Act of 2002, Section 208, establishes the requirement for agencies to conduct PIAs for electronic information systems and collections.
  • The instrument for a PIA or data protection impact assessment (DPIA) was introduced with the General Data Protection Regulation (Art. 35 of the GDPR).
  • Starting in 2023, some U.S. state privacy laws, including laws in California, Colorado, Virginia, and Connecticut, will require PIAs for vendor assessments and for high-risk data processing activities.

The EU’s GDPR requires that a data protection impact assessment (DPIA) be conducted when the processing could result in a high risk to the rights and freedoms of natural persons.

  • A DPIA is a type of risk assessment. It helps you identify and minimize risks relating to personal data processing activities. DPIAs are also sometimes known as PIAs (privacy impact assessments). We have had a few clients who conduct compact PIAs and if a high-risk system is identified then they trigger a DPIA or High-Risk Assessment.
  • The EU GDPR (General Data Protection Regulation) and DPA (Data Protection Act) 2018 require you to carry out a DPIA before certain types of processing. This ensures that you can mitigate data protection risks.
  • If processing personal information is likely to result in a high risk to data subjects’ rights and freedoms, you should carry out a DPIA.
  • Examples include scoring/profiling; automated decisions that lead to legal consequences for those impacted; systematic monitoring; processing of special categories of personal data; data processed on a large scale; the merging or combining of data gathered by various processes; data about incapacitated persons or those with limited ability to act; use of newer technologies or biometric procedures; data transfers to countries outside the EU/EEA; and data processing that hinders those involved in exercising their rights. If several of these criteria are met, the risk to the data subjects is expected to be high and a data protection impact assessment is always required.

How to get started with your Privacy Impact Assessment

Many companies start out using spreadsheets as a way to collect the information required for a PIA. However, they find that it can be very difficult to track and manage these assessments without a tool.

Leveraging deep privacy expertise, WireWheel has developed a software solution to help companies manage assessments: the WireWheel Privacy Operations Manager (POM) platform. The tool helps companies easily design and conduct assessments.

Users create a template, which is a list of questions that need to be answered. The template is then used to kick off multiple assessments. Templates can be structured to include questions to understand whether or not the collection and use of personal data are in compliance with data protection regulations. This information can be mapped to asset inventories.

WireWheel has standard templates that cover key regulations and requirements and also helps to build custom templates to suit a client’s specific requirements.

The WireWheel Privacy Operations platform enables users to manually trigger assessments or for a vendor to self-initiate an assessment if required. Once an assessment is triggered, a user can assign the questions to vendors or suppliers or system owners to answer. The responses are reviewed and approved by the assessment owner and the platform ensures that the detailed assessment responses are recorded so a company can prove compliance if audited.
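As a hypothetical sketch of the template-and-assessment model described above (the types and workflow states here are illustrative assumptions, not WireWheel’s actual schema):

```typescript
// A template is a reusable list of questions.
interface Template {
  id: string;
  name: string;
  questions: { id: string; text: string; mapsToAsset?: string }[];
}

type AssessmentStatus = "triggered" | "in-review" | "approved";

// An assessment instantiates a template and records answers.
interface Assessment {
  templateId: string;
  assignee: string;             // vendor, supplier, or system owner
  answers: Map<string, string>; // question id -> response
  status: AssessmentStatus;
}

// Kick off multiple assessments from one template.
function createAssessments(template: Template, assignees: string[]): Assessment[] {
  return assignees.map((assignee) => ({
    templateId: template.id,
    assignee,
    answers: new Map(),
    status: "triggered",
  }));
}
```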


What happens after the PIA is complete?

Once the PIA is completed and documented, a company will typically set criteria to trigger another PIA or a reassessment.

Typically this happens when any of the following activities occur:

  • Developing, or procuring any new technologies or systems that handle or collect personal information
  • Developing system revisions; when substantial changes are introduced to an existing data processing system
  • Issuing a new or updated rulemaking that affects personal information.
  • When an existing data processing system is involved in a major data breach or recurring security incidents
  • On a predetermined schedule

According to guidance on the EU’s GDPR, the reassessment process should be repeated at least every three years.
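As a minimal sketch, assuming the three-year review interval described above and a simple assessment record (both illustrative assumptions), a schedule- and event-based reassessment check might look like this:

```typescript
// Approximate three years in milliseconds (ignoring leap days).
const THREE_YEARS_MS = 3 * 365 * 24 * 60 * 60 * 1000;

interface CompletedAssessment {
  id: string;
  lastCompleted: Date;
  highRisk: boolean;
  breachSinceCompletion: boolean;
}

// A reassessment is due if the review interval has elapsed or a
// risk-relevant event has occurred since the last completion.
function reassessmentDue(a: CompletedAssessment, now: Date = new Date()): boolean {
  const aged = now.getTime() - a.lastCompleted.getTime() >= THREE_YEARS_MS;
  return aged || a.highRisk || a.breachSinceCompletion;
}
```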

How does WireWheel help with reassessments?

WireWheel maintains a record of all completed assessments and enables customers to determine the need for reassessments using product features like reporting, tag management, or assessment details like “Last completed” date and so on. Based on certain criteria like high-risk scores, data breach alerts, or the last completed assessment, the privacy/legal team can identify the need for a reassessment and initiate it using the previously submitted assessment.

Reassessments can be triggered in the WireWheel platform by any team or individual with the appropriate permissions. Clients start with the creation of a copy of the completed assessment so that the responses submitted previously will be automatically pre-populated. The reassessment will use the latest, published version of the same template that was used to create the original assessment and use the review workflow that the original assessment used.

A completed assessment at WireWheel will include the responses submitted by the assignee, assignee(s) information, completion timestamp, and tags if any.

In WireWheel, a copy of the completed assessment can be created by navigating to “Create a Copy”.

The newly created reassessment provides the ability for the owner to assign all the questions or just the relevant questions to the assignee(s) for updates. The responses previously submitted by the assignee will be pre-populated and available for the assignee to review and edit.

Once the assignee updates and submits the reassessment, the owner reviews and approves the responses. The reassessment is then saved as a new record with the latest responses submitted by the assignee, assignee(s) information, new completion timestamp, and tags if any.

How do you compare one assessment to another?

WireWheel provides users the capability to compare the responses in an assessment using the default reporting feature. The platform allows the users to select the relevant assessments that they want to be included in the report and reports can be downloaded to the individual user’s system as well.

Summary

With more and more regulations requiring Privacy Impact Assessments, leveraging a tool like WireWheel’s Privacy Operations Manager can help companies ensure that they are handling personal information properly.


Privacy Law Update: September 6, 2022

Stay up to date with this weekly release covering key developments on data privacy laws, technology, and other hot privacy topics!


Newsworthy Updates

FTC Sets Final Agenda For Sept. 8 Rulemaking Forum

The Federal Trade Commission (FTC) released the final agenda for its September 8, 2022 forum seeking public comment on the harms stemming from commercial surveillance and lax data security practices and whether new rules are needed to protect people’s privacy and information. The FTC recently announced an Advance Notice of Proposed Rulemaking (ANPR) seeking public comment as it explores possible new rules cracking down on lax data security and commercial surveillance, which is the business of collecting, analyzing, and profiting from information about people.

Why FTC Rulemaking Pales In Comparison To Proposed American Data Privacy And Protection Act

In a piece for Lawfare, Brookings Institution Tisch Distinguished Visiting Fellow Cameron Kerry discusses how proposed U.S. Federal Trade Commission privacy rulemaking is motivation, not a substitute, for passing the proposed American Data Privacy and Protection Act (ADPPA). Kerry breaks down the “long, tortuous road” that is the FTC rulemaking process while also explaining FTC commissioners “are conscious that the ADPPA addresses many of the issues raised,” with all five commissioners happy to defer to the ADPPA if U.S. Congress can finalize it before rulemaking is complete.

Working Group Takes On NIST Privacy Framework Update

In the modern history of the privacy profession, the one constant dynamic has been that the rapid development of new technologies reflexively creates an industry in a perpetual state of innovation. To better anticipate what role privacy will play in commerce in the near-to-long term, the U.S. National Institute of Standards and Technology embarked on developing the new “Privacy Framework” document.

Meta’s Facebook Agrees To Settle Cambridge Analytica Data Privacy Suit

Meta’s Facebook settled a long-running lawsuit in a U.S. court seeking damages for letting third parties, including Cambridge Analytica, access the private data of users.  The preliminary settlement was disclosed in a court filing late Friday. The terms were not disclosed.  Lawyers for the plaintiffs and Facebook asked the judge to put the lawsuit on hold for 60 days to allow the parties to “finalize a written settlement agreement” and present it for preliminary approval by the court.

Is Data Localization Coming To Europe?

Two years ago, the Court of Justice of the European Union invalidated Privacy Shield, the legal framework for EU-U.S. data flows. The consequences of that ruling reinforce the EU’s digital sovereignty agenda, which increasingly sees data localization as one of its core elements.  Since the “Schrems II” judgment by the CJEU, the U.S. presidential administration and European Commission have been working on replacing the trans-Atlantic agreement with a new one that could stand judicial review before Europe’s top court. In March 2022, U.S. President Joe Biden and Commission President Ursula von der Leyen announced an agreement in principle.

Privacy Legislation

Future-proof your privacy program with WireWheel’s Trust Access and Consent Center and WireWheel’s Privacy Operations Manager.

  • CCPA & CPRA
  • Regulations

The California Attorney General’s First Enforcement of CCPA

On August 24, 2022, California Attorney General Rob Bonta made his first announcement of CCPA enforcement, issuing a $1.2 million settlement with online retailer Sephora, Inc. Attorney General Bonta is sending a strong message that he intends to aggressively enforce the CCPA and the pending CPRA. If you are a business that shares information with advertising networks, you need to be sure you are fully compliant under California privacy laws.

Sephora allegedly violated the CCPA by failing to meet several key requirements, including:

  • Informing consumers that it sells their personal information
  • Properly honoring Global Privacy Control (GPC) signals for opt-out of sale requests
  • Disclosing its sell/share practices in its privacy policy
  • Curing the alleged violations within the 30-day period

This enforcement centered on Sephora’s sharing of personal information with third-party advertising networks. The failure to cure led to a broader investigation of Sephora’s privacy practices.

In addition to a monetary fine, the settlement also imposes injunctive obligations on Sephora including:

  • Implement and maintain a program to assess and monitor the effectiveness of processing opt-outs within 180 days of the settlement and for the following two years
  • Conduct annual reviews of its websites and mobile applications to determine which companies it sells/shares data with, within 180 days of the settlement and for the following two years
  • Share the assessment and reviews in an annual report detailing the:
    • Testing done
    • Errors or technical problems impacting the processing of opt-out requests
    • Companies it sells/shares personal information with
    • Whether those companies are considered service providers
    • Purpose for which information is made available
  • Update disclosures and the privacy policy to make clear that it sells data
  • Provide mechanisms for consumer opt-outs, including GPC
  • Align service provider agreements with the CCPA’s requirements

Here are a few things to consider to make sure that you are compliant:

  • Sale: Based on this enforcement action, leveraging cookies and tracking technologies and sharing with advertisers is likely a sale, and therefore:
  • Opt-out: You must allow consumers to opt out of cookies. Make sure your website can accommodate “Do Not Sell My Personal Information” requests and that GPC or similar technologies are enabled to handle those requests (see the sketch after this list)
  • Manage Consent and Preferences: Properly collect consent and preference signals and ensure they are shared with third parties across your ecosystem
  • Privacy Rights: Provide easy access for consumers to exercise their privacy rights
  • Contract Service Providers: Make sure that you properly identify your service providers and update your contracts so they are compliant with the CCPA and CPRA
  • Right to Cure: Leverage this while it’s available, as the cure period sunsets 1/1/23. Sephora may have avoided this outcome had it reacted earlier.
  • Privacy Notices: Review your privacy policies on your websites, apps, and points of collection. Align them with CCPA and CPRA requirements. Specifically disclose “sale” of personal information for advertising and analytics purposes, how to opt out, and any activities that may be considered a “financial incentive”
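
A minimal server-side sketch of honoring the GPC signal follows. It assumes a Flask app, and the endpoint and response shape are illustrative, but the Sec-GPC request header (and its browser-side counterpart, navigator.globalPrivacyControl) is the signal defined by the Global Privacy Control specification.

```python
from flask import Flask, request

app = Flask(__name__)

@app.route("/page")
def page():
    # Browsers with GPC enabled send the "Sec-GPC: 1" request header
    # (navigator.globalPrivacyControl exposes the same preference to
    # client-side scripts).
    opted_out = request.headers.get("Sec-GPC") == "1"
    if opted_out:
        # Treat the signal as a valid "Do Not Sell My Personal
        # Information" request: suppress third-party ad-network sharing.
        return {"ad_sharing": "disabled", "reason": "GPC opt-out honored"}
    return {"ad_sharing": "enabled"}
```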

WireWheel’s software solutions help you comply with privacy regulations including managing privacy rights and consents and preferences.

  • Privacy
  • Regulations

Privacy Predictions for 2023

Yogi Berra once warned that “it’s difficult to make predictions, especially about the future.” Proving his point, Cooley partner Travis Leblanc confesses that at last year’s Spokes Conference he called it right just 33% of the time.

Uncowed, the closing session of the 2022 Summer Spokes Technology Conference (held June 22-23) once again offered some near-term privacy predictions. But more than just crystal ball gazing, the Privacy Predictions for 2023 roundtable provides deep insights into the challenges (political and technological) in advancing privacy around the world.

The roundtable, hosted by WireWheel founder and CEO Justin Antonipillai, included:

  • Travis Leblanc, Cooley’s Cyber Data Privacy Practice Co-Leader who is also a member of the Privacy and Civil Liberties Oversight Board (PCLOB) that oversees the privacy and civil liberties practices of the intelligence community
  • The widely read Gabriela Zanfir-Fortuna, Vice President for Global Privacy at the Future of Privacy Forum (FPF), and
  • Omer Tene, Partner in Goodwin’s Cyber Security Practice who is also a senior fellow at the FPF. Tene founded the Cyber Week held in Tel Aviv.

The next 18 months around the world

“I am absolutely watching India,” says Zanfir-Fortuna. “For example, there is a fair chance that we will finally see the personal data protection bill pass after three-plus years of debate. This would bring more than one billion people within the scope of personal data protection rights.

A very interesting number that I’ve seen recently from a Gartner report was that 75% of the global population in the next three to five years will be covered by some form of privacy rights.

—Gabriela Zanfir-Fortuna, FPF

“Southeast Asia is a region that’s also very active in this space. I would urge folks to also keep an eye on Indonesia. Australia has had a couple of public consultations on the privacy law and also the data security regime, so we might see some action there as well.

“And Canada just last week published a comprehensive bill that covers both federal privacy law and provisions related to AI and data generally, quite similar to the EU Data Act.

“Here in the U.S., I’m hoping to see the successor to the Privacy Shield resulting from the U.S.-EC negotiations.”

The ADPPA and Privacy Shield

Tene, with a bit of tongue in cheek, offers that we will see the EU and U.S. negotiate a successor to the new transatlantic data privacy framework after it too is struck down…by Schrems III.

Leblanc, however, predicts we will see a Privacy Shield replacement – and an adequacy decision from the EC with all relevant approvals – in the next 6 to 12 months.

I’ll also put on the calendar that there will be heavy debate in the U.S. at the end of next year regarding reauthorization of Section 702 of the Foreign Intelligence Surveillance Act, concerning the extent to which it should be expanded, or possibly even ended. Section 702 is at the center of the CJEU discussions concerning cross-border data transfers.


“The American Data Privacy and Protection Act (ADPPA), which is a ‘three corners bill,’ having the support of Republicans and Democrats in the House and of Senate Republicans, is waiting for the ‘fourth corner’ – Senate Democrats – to rally behind it,” notes Tene.

The bill was formally introduced (21 June 2022) and even has a number now: H.R. 8152…you can’t overstate how big a deal this is, opines Tene. “It is broad, deep, and includes innovative concepts that we have yet to see anywhere in the world.”

—Omer Tene, Goodwin

He further suggests that “CCPA/CPRA would basically be gone, except strangely (and ironically) for the provisions protecting employee data.”

Is Tene predicting it’s going to pass, then? “No, I’m predicting the Phoenix Suns will win the championship next time. I refuse to be bullied into making predictions.”

“I don’t want to be the naysayer here on the ADPPA, but earlier today (3 June 2022) Senator Cantwell made clear that she’s opposed to it, and Senate Leader Schumer has said there is no way that bill is going to be taken up in the Senate this congress,” advises Leblanc.

At this point, there is a quickly closing window on the opportunity to actually consider any legislation – including the ADPPA – in the Senate because we are in an election year…and a third of the Senate (post August recess) begins to focus their attention on the November elections.

—Travis Leblanc, Cooley

And “if the Senate does flip from Democrat to Republican, there’s going to be a mad rush to push through several confirmations and key priorities,” he continues. “There is a challenge, practically speaking, for floor time, even if it had the support of the Chair of the Commerce Committee, which it doesn’t.”

The FTC and the States

On the regulatory front, Leblanc notes that “the Federal Trade Commission now has a fifth Commissioner (Alvaro Bedoya), giving Chair Khan a majority. We expect they will begin a process to promulgate rules around privacy and security, including, perhaps, updating COPPA.

“In addition to the FTC, I expect we’ll see some activity at the SEC, which has advanced two rulemakings related to cybersecurity. The one that’s getting the most attention is around the disclosure and governance controls associated with public companies in the U.S.

I anticipate a lot of activity at the state level as well. Assuming no preemption, [there are] the CPRA regs that were recently voted on by the new California DPA, covering issues from dark patterns to contractual requirements, which are now in draft form and expected to be finalized later this year.

If you do business in California, I strongly encourage that you take a look at those and begin a process soon of coming into compliance with them.

—Travis Leblanc, Cooley

Colorado Attorney General Phil Weiser (who spoke with Justin) is also looking at regulations concerning issues like dark patterns. So, I predict we will see something from Colorado in the next 18 months, says Leblanc.

On the tech front, Antonipillai predicts significant investment in Web 3.0. He also predicts less investment in cryptocurrencies, but more in blockchain.

He further predicts that in the next 18 months we will see advances in how sensitive data, such as medical information, can be shared and controlled, with investments in this area driving critical innovation.

The really hot topic, however, is artificial intelligence (AI).

I predict there will be at least one major step forward in AI in the next 18 months that causes all of us to feel like some version of it is almost sentient. We’re going to see technology that’s driven by AI almost mimicking the level of human thought and making it harder to even think about it from a regulation perspective.

A lot of regulation is trying to address transparency and understanding the way neural networks work. I predict we’re going to have steps forward in AI that make it very hard to think about how you apply a law to it.

—Justin Antonipillai, WireWheel

“Legislators are trying to regulate the conduct of those that are building these systems,” says Zanfir-Fortuna. “For example, the EU is trying to put in place some rules for providers of AI systems, but how much those rules will help, we don’t know. Perhaps the prediction for the next 12 to 18 months is that we all become a bit more literate in understanding the different shades of AI and machine learning.”

“There’s a lot of policy activity around AI in the U.S.,” says Tene. “State laws have provisions concerning automated decision making, and the ADPPA also has a very interesting regime around AI, including requirements for businesses to do ‘algorithm impact assessments.’”


Interestingly, notes Zanfir-Fortuna, “Brazil could be the first jurisdiction to adopt a comprehensive framework around AI.” There is a proposal on an AI law going through the congress right now. She also notes that Singapore, in a different approach, is looking to take advantage of existing regulations.

The challenges posed by Blockchain

“There is a fundamental tension, in my view, between blockchain and some fundamental rights in Europe, such as the right to be forgotten or the right not to have your data transferred,” avers Antonipillai.

I think there’s an even more fundamental tension, which is that GDPR relies upon the assumption that a natural or legal person (a data subject) can enforce their rights, which in turn relies on the assumption that they’re established in the EU. When you’re dealing with distributed ledger or blockchain technologies, you may not know.

—Travis Leblanc, Cooley

Perhaps this is an example of how technologies like AI and blockchain are outpacing the regulatory systems that we set up, offers Leblanc.

“It’s no surprise that regulations which were not adept at dealing with the Internet struggle with even newer technologies like the blockchain,” says Tene. “And the tension isn’t just with privacy law, it’s with other laws as well, such as copyright, or horrible stuff like child pornography, which once on the ledger can’t be deleted.

“There are some technological fixes to it, but I do agree with the premise that it’s difficult to stay on pace with technological development.”

But “the Groundhog Day for privacy professionals – the primary issue we deal with – is adtech digital marketing which is obviously under intense regulatory pressure all over the world. If the ADPPA passes, it has very strict limits on advertising technologies. And, of course, there is CPRA and Colorado.”

“If people can easily opt out in one place, others will be expected to do it, and that will significantly change the dynamics of the market.”

  • Marketing
  • Privacy Tech

How To Think About Buying Privacy Technology

Determining the best privacy technology for your organization can be overwhelming. As Alyce Director of Data Security and Governance Jeff Mayse notes, “it touches every system, and crosses all borders.” In other words, choosing, implementing, and maximizing the value of privacy technology for both external and internal stakeholders requires careful consideration, planning, thoughtful execution, and management.

Joining Mayse to examine How to Think About Buying Privacy Technology were his colleague Andy Dale, General Counsel and CPO of Alyce, and WireWheel’s Director of Product Marketing, Emily Schwartz. Held at the 2022 Summer Spokes Technology Conference (June 22-23), this wide-ranging discussion was moderated by Kevin Donahue, Sr. Director, Privacy Engineering at Ethos Privacy.

Identifying the privacy tech value for external stakeholders

Often, clients will say, ‘we have to do data subject rights (DSARs), so we’re going to look at some solutions.’ But there are a whole lot of elements to doing data subject rights, from ingestion, validating, and finding their data, to tracking things.

It’s not just one big black box, it is an entire workflow needed to satisfy their users.

—Kevin Donahue, Ethos Privacy

It’s important, suggests Dale, “to understand what it is we’re trying to do – not from a technical perspective and not from a GDPR or CCPA perspective – but the business problem or solution that we’re trying to get to. That helps all your privacy champions around the business get context and understanding.

“Law and regulation don’t really move the needle as much as folks like us might think. Risk surface area doesn’t do that either. What does move the needle is customer sentiment. I don’t think everybody starts there. There is a tendency to dive right into spreadsheets and data mapping.”

Identifying the privacy tech value for internal stakeholders

“It’s really hard” he continues, “to work towards a successful outcome if everyone’s not on the same page from that 30,000-foot view. What are you trying to do and what are the impacts? This includes the internal stakeholders as well.”

“If you neglect any aspect here – external or internal – you’re at risk of pain. This is one of the most complicated systems that you will implement at a company,” warns Mayse.

It touches every system. It crosses all borders. It has no domains. And it can quickly spiral out of control.

You need to consider your internal users, at least as carefully as your external users. And when we talk about implementation, it is easy for this to become a problem.

—Jeff Mayse, Alyce

Mayse insists that when you understand what that process is like for both internal and external stakeholders, and what it would look like with privacy technology versus without, it’s easier to sell internally.

“As time wears on, I think it’s going to become increasingly clear that there are internal stakeholder benefits [and this] needs to be part of the equation,” adds Schwartz.

Buyer be aware

You can sell into legal. You can sell into a separate privacy function. You can sell into privacy engineering, into InfoSec, into the CIO, and you can sell into the IT department. There are too many vectors, too many personas to think through.

It’s a good idea to encourage the buyer, once they’ve engaged in that cycle of learning about the software, to bring a lot of relevant stakeholders to the table.

—Andy Dale, Alyce

That includes stakeholders like business intelligence and analysts who “need to pull data out of the product and manipulate it somewhere: it is truly everyone,” notes Schwartz.

“When you start talking to every team in the company, the magnitude of the problem can become overwhelming,” cautions Mayse. “So, it’s not only important to bring in internal stakeholders…and raise issues early, but also to constrain the problem. You have to eat the elephant one bite at a time.”

It is critical for buyers to have this awareness.

Accepting that sales call without knowing what it’s going to take to implement a solution, and how to perform the initial step of the development lifecycle when you’re evaluating build v. buy – if you don’t have somebody functioning in a product management role, it can be very difficult to get it accomplished.

—Jeff Mayse, Alyce

Interestingly, both Dale and Schwartz have found that when it comes to awareness, the learning curve is much steeper in larger organizations than it is in small and midsized businesses (SMBs).

At Alyce, a privacy technology consumer (and WireWheel customer), they think about what they need to do first, relates Dale. “What engineering work needs to be done here to make sure that we’re even in a position to buy privacy tech?

“The issue is always, what do we have to solve first? What is our workload to implement a solution? There’s this myth about technology in general: buy it, turn it on, and magically it just happens.”

Privacy tech success metrics

How do you define success?

Privacy people think of success as ‘I’ve deployed a tool to do something, because I want to do that thing.’ And it’s oftentimes somewhat divorced from other stakeholders, whether it’s engineering, marketing, infrastructure teams, or even the users themselves.

—Kevin Donahue, Ethos Privacy

“It comes down to why you bought it in the first place,” says Mayse. “What you wanted to achieve. You went in with a problem – maybe an intangible – but you have a problem that you’ve defined.”

“How do you track that problem now? Do you have a process for DSAR ticketing details and following through on the fulfillment through all systems and tracking time? Simply deploying a solution isn’t ‘the solution.’”

Pointedly, Mayse asks, “How did you measure the problem? Presumably you had to get dollars for this and justify the spend. How did you sell it internally (this is an $X/year problem, or a $Y/year risk surface). You can track against that.”

“One of the big ones for Alyce has been ‘time of effort.’ If it took you so many days/hours/minutes to fulfill [DSAR] requests, what does that look like after you buy the service? How many are you able to accomplish with how many people working on them?”

Sometimes it is a little bit of trial and error.

With experience you start to realize what is truly important. It may also be things like how the privacy technology integrates into your stakeholders’ workflows and processes. Is it efficient for your team? At the end of the day, a lot of this is about managing risk.

—Emily Schwartz, WireWheel

“Look at this through a risk lens,” offers Schwartz. “What are the building blocks that contribute to an acceptable risk profile for your organization and measure those.”

Project managing privacy tech: it’s all about communication

“There’s a big language challenge between legal and engineering,” opines Dale, “particularly in the privacy world where we live in the grey. Engineers and product teams want the requirements. What are the milestones? What do we need to deliver, and the metrics behind it?”

“It’s going to be a little bit of iteration, trial, test, and learn as we go. But clearly discussing how this is a grey area can increase velocity in these projects.”

“You need somebody who can help translate the legal principles,” says Mayse. To draw lines in the grey legal sand and say ‘on this side of the line we think this is a good faith effort that’s compliant with the law,’ while knowing that legal decision may indeed change. “This must be translated into something that an engineer can build, or a product manager can run.”

Having embedded privacy champions in a development team really helps. It becomes a bi-directional communication between privacy (or compliance or legal), and the engineering team. In this way, engineering can communicate what they need from privacy to work effectively.

—Emily Schwartz, WireWheel

Evaluating privacy technology

“One of the most beneficial things you can do is ask your peers what they’re using, what they’re doing, what’s working for them and what’s not,” proffers Schwartz. “If you’re working with an agency or a consulting group, ask what they’re implementing for their clients. It shows what’s working and what’s not.

“You can do all the research in the world and see all the demos in the world, but just word of mouth and referrals are very powerful.”

My superpower recommendation is – while maybe not in the first call – when it comes to the demo, and especially to the technical discussion, bring an engineer who will ask questions you didn’t think of asking.

Those questions will be necessary, because an engineer is going to immediately start thinking about how they would actually do this…You can have an engineer say, ‘we won’t build this, it’s not going to happen,’ and you want to know that.

—Jeff Mayse, Alyce

“There is a lot of the tech… a lot of it does the same stuff, so I would have a list of your top needs and priorities,” avers Dale. “But if you’re not attentive to it, and if you don’t have a person that is going to learn and administrate it,” he cautions, “you’re going to get nothing out of it.

“It can be the best software in the world, but any system requires a lot of TLC to get the most out of it. So go in knowing that it’s going to be a system you need to spend time with. Otherwise, you just won’t adopt it and you won’t get value.

“You’ll turn off that software and you’ll try another one. And if you don’t change something, the same cycle will happen again.”

  • Privacy Law Update
  • Regulations

FTC Issues Rulemaking to Protect Consumers

On August 11, 2022, the Federal Trade Commission (FTC) voted 3-2 to file an Advance Notice of Proposed Rulemaking (ANPR) addressing consumers’ privacy and data security. The rulemaking is called the “Trade Regulation Rule on Commercial Surveillance and Data Security.”

From day one of her appointment, Chairwoman Lina Khan has been direct about having a heightened focus on outlawing unfair and deceptive business practices. The ANPR is consistent with this anticipated posture. The FTC has made it clear that privacy and data security issues, particularly around children, AI, commercial surveillance, dark pattern practices, and lax data security practices, are central to the ANPR. The ANPR also contemplates civil penalties, signaling the FTC’s attempt to regain its enforcement power.

The ANPR covers data collected directly from consumers and data automatically collected by companies when consumers are on their websites and apps. The FTC is considering whether new rules should be directed at the collection and use of sensitive/protected categories or applied more broadly to all categories of personal information. Of particular interest is the ANPR’s definition of a consumer: it includes “businesses and workers, not just individuals who buy or exchange data for retail goods and services.” This has already raised concerns among many in the industry and is sure to evoke many comments.

This move is seen as the FTC’s attempt to formalize a national privacy regulation, though it does not appear to be as comprehensive as the proposed federal legislation. Further, it does not preempt the current state privacy laws. To that point, all five FTC commissioners have opined that they would prefer Congress pass a federal privacy law rather than have the agency draft rules.

The FTC is asking for comments on 95 questions that address several key issues:

  • Automated Decision Making:  Measuring and identifying algorithmic errors
  • Balance:  Balancing the costs and benefits of privacy and security regulations with their impact on impeding or enhancing competition
  • Consent:  Understanding the effectiveness of consumer consent and contemplating different choices and standards for different types of consumers
  • Discrimination:  Regulating algorithms to prevent discrimination
  • Enforcement:  Empowering the FTC with more enforcement power
  • Governance:  Establishing rules for the collection and use of biometric information, facial recognition, fingerprinting, targeted advertising, and data retention
  • Harm:  Understanding how “commercial surveillance practices or lax security measures” harm consumers, specifically children
  • Notice:  Establishing rules for transparency, consumer comprehension, and privacy impact assessments
  • Security: Requiring technical and physical data security measures and the certification of those practices

The rulemaking process the FTC will follow is guided by the Magnuson-Moss Warranty Act of 1975. The next steps are to:

  • Allow for public comment on the ANPR
  • Issue a Notice of Proposed Rulemaking.  Comments may be submitted until October 10th
  • Host a public forum
  • Issue a final rule
  • Undergo a judicial review

The FTC is clearly establishing its privacy beachhead.  Will the ANPR move forward and become the national privacy framework or will it act as the catalyst for Congress to pass a comprehensive federal privacy bill?  We will continue to monitor this subject as it progresses and provide additional updates.

  • Privacy
  • Privacy Tech

Differential Privacy: Lessons for Enterprises from U.S. Government Agencies

Differential privacy is considered perhaps the only viable and formal way of publicly releasing information or allowing queries or analysis on data while protecting the privacy of the individuals represented in the data.

—Amol Deshpande, WireWheel

Enterprises trying to keep up with a rapidly changing regulatory landscape can benefit from looking at how their public-sector counterparts have leveraged technological advances to publish data while respecting the privacy of individuals. While most organizations are not looking to publish data publicly, differential privacy can also enable internal data sharing between departments that is otherwise proscribed by regulation, where accuracy or fitness for use of personal data is just as important. It is eminently translatable into commercial settings.

To discuss how differential privacy is being used, a group of experts met at the 2022 Summer Spokes Technology Conference (held June 22-23) to present their experiences in both academic and commercial settings.

The panel Differential Privacy: Lessons for Enterprises from U.S. Government Agencies was hosted by WireWheel Chief Scientist and Co-Founder Amol Deshpande. He was joined by the Co-Founders of Tumult Labs, CEO Gerome Miklau and Chief Scientist Ashwin Machanavajjhala, and Urban Institute Principal Research Associate and Statistical Methods Group Lead, Claire McKay Bowen.

Balancing insights with protection

Differential privacy technology seeks to solve the problem of balancing insight into groups of people against protecting facts about the individuals. For example, providing insight into a city’s taxi drop-off policies while protecting sensitive information such as an individual’s location data.

Shared group insight: 59th Street median rush-hour weekday drop-off frequency: 145

Sensitive individual record: Customer (x456) traveled from LGA to 59th Street, arriving June 1 at 8:30am.

The ability to share and reuse insights derived from data, while at the same time protecting individuals is notoriously hard.

And many of the methods that have been used in the past to share data while protecting individuals’ privacy have fallen prey to a range of increasingly sophisticated attacks.

—Gerome Miklau, Tumult Labs

“It’s well understood that re-identification attacks can be very successful and lead to privacy vulnerabilities,” notes Miklau. “Aggregate statistics…are subject to reconstruction attacks; even data used to train machine learning models are subject to membership inference attacks. Differential privacy is a response to all of these attacks and provides a framework for reliable data sharing.”

How does differential privacy technology work?

Differential Privacy
A standard for computations on data that limits the personal information that could be revealed by the output of the computation

Miklau explains that the application of differential privacy (DP) means that if one seeks to perform computations on sensitive data, the computation must be rewritten to satisfy the DP standard.

Once we do the computation in this standard-satisfying way, we get output that has a very formal and rigorous guarantee that [this output can’t] be reverse engineered back to individual records or facts, so we can reuse and share that output.

The privacy guarantee stands up in the face of powerful adversaries.

—Gerome Miklau, Tumult Labs

Importantly, it means that every attribute of the individual gets equal protection, so choices don’t have to be made between sensitive and non-sensitive data, notes Miklau. “Differential privacy as a technology for protecting privacy resists both current attacks that we know about and future attacks. It’s also ahead of regulation.”

When running a DP computation there’s a [tunable] “privacy loss parameter” [the epsilon (ϵ)]…to set policy about how much information goes out when we publish, explains Miklau, allowing enterprises to do what he calls “privacy accounting”: accounting for the tradeoff between data privacy and data accuracy.
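
To make the tunable privacy loss parameter concrete, here is a minimal sketch of the classic Laplace mechanism for a counting query (an illustrative textbook example, not Tumult Labs’ or WireWheel’s implementation). Smaller ϵ means more noise and stronger privacy:

```python
import numpy as np

def dp_count(true_count: int, epsilon: float) -> float:
    """Release a count with differential privacy via the Laplace mechanism.

    A counting query has sensitivity 1 (adding or removing one person's
    record changes the count by at most 1), so the noise is drawn from
    Laplace(scale = 1 / epsilon).
    """
    return true_count + np.random.laplace(loc=0.0, scale=1.0 / epsilon)

# Smaller epsilon -> stronger privacy guarantee, noisier output.
for eps in (0.1, 1.0, 10.0):
    print(eps, round(dp_count(145, eps), 1))  # e.g., the drop-off count above
```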

Differential Privacy: Managing Cumulative Privacy Loss

“Major U.S. government agencies are excited about differential privacy for compliant data sharing,” says Miklau. “The value drivers include:

  • Faster, safer movement of data, enabling faster decision making
  • The ability to perform institution-wide privacy accounting, and
  • A principled negotiation of privacy. There’s always a tradeoff between the strength of the privacy guarantee and the accuracy of outputs.

Importantly, as Miklau points out, the central use case is the reuse and repurposing of data. This is valuable internally within a business enterprise as well. “When you have sensitive data…by computing a differentially private summary of it, you can then move that around the enterprise much more easily and enable the use of data” for new insight and purpose.
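
As a rough sketch of what institution-wide privacy accounting can look like, the toy ledger below applies basic sequential composition, under which the total privacy loss of several DP releases is at most the sum of their individual epsilons (production accountants use tighter bounds; the class and names here are hypothetical):

```python
class PrivacyAccountant:
    """Track cumulative privacy loss across data releases."""

    def __init__(self, budget: float):
        self.budget = budget  # total epsilon the institution will tolerate
        self.spent = 0.0

    def charge(self, epsilon: float, release: str) -> None:
        # Basic sequential composition: epsilons add up across releases.
        if self.spent + epsilon > self.budget:
            raise RuntimeError(f"Privacy budget exceeded; cannot publish {release!r}")
        self.spent += epsilon
        print(f"Published {release!r} at eps={epsilon}; total spent: {self.spent}")

accountant = PrivacyAccountant(budget=1.0)
accountant.charge(0.3, "quarterly drop-off counts")
accountant.charge(0.5, "median ride duration")
```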

IRS Case Study

The motivation for the collaboration between Urban Institute and the IRS is to advance evidence-based policymaking.

However, full access to these data is only available to select government agencies and the very limited number of researchers working in collaboration with analysts in those agencies.

— Claire McKay Bowen, Urban Institute

To expand access, the Urban Institute entered into a collaboration with the IRS Statistics of Income Division, where it is “developing synthetic data and differential privacy tools to allow researchers access to sensitive data, while providing robust privacy protections,” says Bowen.

There are two ways in which researchers have accessed data, explains Bowen: 1) from public statistics and data, and 2) from the confidential data directly.

Data Flow with Statistical Disclosure Control

The above diagram describes the flow from the data user to a privacy expert (such as Bowen) and the curator (in this case also the data steward), the IRS. This scenario is not ideal. As Bowen explains, direct access is very difficult to obtain because it is constrained by clearance requirements.

Synthetic Data Technique for Data Analysis

Utilizing public statistics and data “isn’t a great option either. Given the growing threats to public-use microdata (i.e., individual record data) and concern for protecting privacy, government agencies have progressively restricted and distorted the public statistics and data being released,” making them much less useful.

To solve this challenge, Urban Institute has devised two approaches.

  1. Improving the quality of the public data that is released using synthetic data generation techniques, and
  2. Creating a new tier between the public data set and the direct access to data.

The synthetic data generation technique, says Bowen, has allowed organizations like the American Enterprise Institute and the Urban-Brookings Tax Policy Center to conduct valuable microsimulation modeling to assess, for example, how much Medicare for All is going to cost the average taxpayer.

Synthetic data is not actual data taken from real-world events or individuals’ attributes; rather, it is data that has been generated by a computer to match the key statistical properties of the real sample data. Importantly, it is not actual data that has been pseudonymized or anonymized. It is artificial data that does not map back to any actual person.
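
As a toy illustration of that idea (not the Urban Institute’s actual generation pipeline), one simple approach fits a statistical model to the confidential sample and then draws entirely artificial records from it:

```python
import numpy as np

rng = np.random.default_rng(seed=42)

# Stand-in "confidential" data: incomes for 10,000 filers (never released).
real_incomes = rng.lognormal(mean=10.8, sigma=0.7, size=10_000)

# Fit a simple model capturing key statistical properties of the sample...
mu, sigma = np.log(real_incomes).mean(), np.log(real_incomes).std()

# ...then sample artificial records that match those properties but do not
# map back to any actual person.
synthetic_incomes = rng.lognormal(mean=mu, sigma=sigma, size=10_000)

print(round(np.median(real_incomes)), round(np.median(synthetic_incomes)))
```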

While synthetic data is a great solution for this application, “it’s not perfect, because you don’t know if there’s some other valuable analysis you didn’t quite account for when you built the model,” explains Bowen, and this is where the “new tier of access that we’re developing, called a validation server,” comes in.

In this approach, the researcher i) submits their analyses, ii) the analyses access the actual data, iii) the output generated passes through the differential privacy mechanism (the “noise”), and iv) the privacy-protected analyses are returned.
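
Schematically, the loop looks something like the sketch below (hypothetical names, with the Laplace mechanism standing in for whatever noise mechanism the real validation server applies):

```python
import numpy as np

def validation_server(analysis, confidential_data, epsilon, sensitivity):
    # i) the researcher submits their analysis (here, a function), and
    # ii) the analysis runs against the actual confidential data;
    true_result = analysis(confidential_data)
    # iii) the output passes through a differential privacy mechanism
    #      (the "noise")...
    noisy_result = true_result + np.random.laplace(scale=sensitivity / epsilon)
    # iv) ...and only the privacy-protected result is returned.
    return noisy_result

# Example: how many filers report income above $100,000?
incomes = np.array([31_000, 58_000, 72_000, 105_000, 240_000])
answer = validation_server(lambda d: (d > 100_000).sum(), incomes,
                           epsilon=1.0, sensitivity=1.0)
```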

“The privacy-utility tradeoff here is that you’re not looking at the original data but you’re still getting a result that (hopefully) is very similar to the original,” opines Bowen.

Bowen points to commercial use cases: for example, LinkedIn looking for ways to communicate market data to departments that didn’t have access to the data, or Uber trying to share ride-share data internally using aggregation techniques to preserve privacy.

U.S. Department of Education case study

“The College Scorecard website is an initiative led by the U.S. Department of Education (DoE) to empower the public to evaluate post-secondary education options more empirically,” explains Machanavajjhala.

The Department of Education has access to data about college costs, records describing students, and degrees attained, “but they don’t have the most valuable and interesting aspects of the website: the student economic outcomes metrics,” details Machanavajjhala. That information is with the IRS, and consequently, “the DoE must go to the IRS and ask for this information every year.”

The IRS cannot just hand over the data to any other agencies as they’re bound by law to protect all information provided by tax returns – even the fact that somebody filed a return.

So, every year, the IRS has to deal with this problem of what should be released to the DoE, and how to transform the data to protect privacy.

—Ashwin Machanavajjhala, Tumult Labs

This challenge, notes Machanavajjhala, is becoming increasingly prevalent since the passage of the Foundations for Evidence-Based Policymaking Act in 2018 and the consequent intensified requests for data sharing.

“The data custodian [in this case the IRS] is faced with a number of challenges,” says Machanavajjhala. “In past years, data protection was based on simple ad hoc distortion of the statistics and suppression. But the rules used were rendering most of the data useless (with upwards of 70% to 90% of the data redacted).

“Furthermore, the disclosure process was manual, so it was extremely onerous and becoming harder every year as analysts requested ever more data and detailed statistics.”

Solving the data sharing problem

To solve this challenge, Tumult Labs designed a three-step approach.

“Differential privacy is not a single algorithm,” cautions Machanavajjhala. “It’s an entire methodology. So, the right algorithm for your problem may require a significant amount of tuning to ensure the data that is output actually meets the needs of your application.”

Putting this in the context of the IRS-DoE data sharing challenge, Machanavajjhala explains that the DoE was requesting 10 million statistics involving 5 million students. Each of these statistics was a count, a median, or a quantile over different, overlapping groups of students – a complex and very fine-grained request.

Data Sharing Problem Solution

As “any privacy method is going to distort the data in some way…to ensure privacy of individuals but preserve statistics, it is important to know what characteristics analysts care about.” This requires establishing “fitness for use” requirements, which, in addition to anything specifically requested, will likely include:

  • A prioritized list of output statistics
  • The privacy ϵ “negotiation”: the least privacy loss with as much useful data as possible
  • Error measures (“privacy accounting”) such as relative error and the number of records suppressed
  • Inconsistency between statistics, and
  • Ranking inversions

Tumult Labs developed software tools to enable this “negotiation” and enumerate the tradeoffs between privacy and accuracy when tuning the differential privacy “loss parameter” (ϵ).
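
One way to picture that negotiation (a toy sketch of the tradeoff, not Tumult’s tooling) is to enumerate the expected relative error of a published count at several candidate values of ϵ. For the Laplace mechanism on a sensitivity-1 statistic, the expected absolute error equals the noise scale, 1/ϵ:

```python
true_value = 145  # e.g., one published student-count statistic

for epsilon in (0.05, 0.1, 0.5, 1.0, 2.0):
    expected_abs_error = 1.0 / epsilon   # Laplace noise scale for sensitivity 1
    relative_error = expected_abs_error / true_value
    print(f"eps={epsilon:<4}  expected relative error ~ {relative_error:.1%}")
```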

Importantly, notes Machanavajjhala, deployments should be community vetted, support peer review, enable safe privacy operations, and ensure there is an approval process to catalog and manage data releases, and to audit and monitor them.

Ultimately, the Tumult differential privacy platform provided the IRS with a rigorous, automated, and quantifiable differential privacy guarantee and simplified decision making, all while enabling the release of more student income statistics than ever before, with comparable accuracy, to power the College Scorecard website.

Automated, quantifiable privacy guarantees with simplified decision making are important attributes for the privacy office and the business in commercial settings. But, as Bowen notes, “most differential privacy technology is still found in highly technical papers, so getting an expert on your team to filter through and figure out what is applicable and what’s not” is essential.

And as it is new technology, she advises “thinking about what training, education materials, and other support you need to get people up to speed.”