Blog

  • Company
  • Privacy

Retrieve Unstructured Data and Save Time With WireWheel’s M365 Integration

Privacy laws continue to proliferate across the globe. Many of these laws, including the European Union’s GDPR, require companies to provide customers and employees with access to their personal data. Many companies hold unstructured personal data covered by these laws in Microsoft 365 applications.

Before the M365/WireWheel integration, collecting, reviewing, and producing information from M365 could be very time-consuming, as retrieving personal data required a manual search of each Microsoft service. The easy-to-deploy integration between the Microsoft 365 compliance center and WireWheel automates the process of finding and retrieving personal data – ensuring more thorough results, saving employees thousands of hours, and making Data Subject Access Request (DSAR) fulfillment easier.

WireWheel + Microsoft 365 – How it works

  1. Using an intake form generated by WireWheel’s Trust Access and Consent Center, an employee, former employee, or customer makes a Data Subject Access Request (DSAR) to view the personal data that a company has stored on them.
  2. Once a DSAR has been received, WireWheel automatically triggers a process to retrieve that data in Microsoft 365. Within the Microsoft 365 compliance center console, a reviewer can view the data that has been retrieved and redact/annotate it if necessary. Once the review is complete, a report is generated and pushed into WireWheel.
  3. The WireWheel platform aggregates all of the data on the requestor from all of the company’s systems and delivers a final report to a secure, encrypted portal where the requestor can access it.
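
For illustration only, the three steps above can be modeled as a simple intake, retrieve, and deliver pipeline. The Python sketch below is a hypothetical mock-up of that flow; the class, function, and system names are invented and do not represent WireWheel’s or Microsoft’s actual APIs.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class DSARRequest:
    """Hypothetical record for a single data subject access request."""
    requestor_email: str
    received_on: date
    findings: list = field(default_factory=list)  # data located per system
    status: str = "received"

def retrieve_m365_data(request: DSARRequest) -> None:
    # Step 2 (illustrative): search each connected Microsoft 365 service for the
    # requestor's personal data and hold results for reviewer redaction/annotation.
    for system in ("Exchange", "SharePoint", "OneDrive", "Teams"):
        request.findings.append({"system": system, "items": f"<results for {request.requestor_email}>"})
    request.status = "under_review"

def deliver_report(request: DSARRequest) -> str:
    # Step 3 (illustrative): aggregate findings and publish to a secure, encrypted portal.
    request.status = "delivered"
    return f"Secure portal link for {request.requestor_email} ({len(request.findings)} sources)"

# Step 1: a request arrives via the intake form; steps 2 and 3 follow.
req = DSARRequest("former.employee@example.com", date.today())
retrieve_m365_data(req)
print(deliver_report(req))
```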

Save money, save time and ensure compliance with WireWheel + Microsoft 365.

 

Efficiently fulfill DSAR requests with WireWheel and M365

 

Contact us for more information

  • Privacy

Privacy Governance: ROPAs are the New Normal

A key component of privacy governance is assessments. While Records of Processing Activity (ROPAs) do not assess risk per se, they do assess the who, what, why, where, and how of information processing. Analyzing the results against internal policy, processes, and regulatory requirements to determine potential areas of risk is critical.

—Tara Jones, Senior Manager Global Data Privacy Compliance & Governance, Yahoo

We recently met with Tara’s colleague, Yahoo AGC, Katie Pimentel. Katie spoke compellingly about the benefits of adopting the NIST Privacy Framework as an effective vehicle for implementing data privacy governance and the value of ROPAs as an internal risk assessment tool.

But the framework is only part of the story. Successful implementation requires socialization, training, and teaming across the organization to enable the accurate collection of ROPA data that serve as critical inputs to privacy governance.

Tara Jones, Senior Manager of Global Data Privacy Compliance & Governance at Yahoo, joined WireWheel CEO Justin Antonipillai, WireWheel Director of Privacy Lisa Barksdale, and her Yahoo colleague, AGC Katie Pimentel, at the breakout session “Rise of the Privacy Operations Leader” at the IAPP Privacy. Security. Risk. 2021 Conference.

Ms. Jones was kind enough to speak with us to offer a small preview of their planned presentation and the “new normal” of ROPAs.

Competing Priorities

I think that we’re pretty lucky because, for the most part, the majority of our product owners really do want to comply with the regulations. It has been a relatively smooth process in terms of the communication with the product owners filling out the ROPAs.

Of course, we had to chase down a few to try to get them to respond, but for the most part, a couple of emails back and forth would suffice.

Some challenges, notes Tara, are logistical. For example, “when product owners leave the company or change departments, tracking down who will have responsibility going forward.” Yahoo’s separation from Verizon created some challenges as well, as some product owners went to Verizon and others came with Yahoo. To solve for this, notes Ms. Jones, “we developed a process for identifying product owners within the WireWheel system.”

All this is to be expected, of course. While everyone gets the importance of completing ROPA questionnaires, it represents just one item on a long list of priorities. “For the most part,” says Tara, “the product owners would go into WireWheel and update their ROPAs. They want to be compliant. And they definitely don’t want to be on that report that goes to their second or third level manager that says that they’re not.”

Communication and Continuous Improvement

Having escalation and communication protocols in place was key to Yahoo’s success in driving compliance as they implemented their NIST framework privacy program. When Tara communicates with the business unit owners, they know what is expected including an acceptable turnaround time. If expectations aren’t met, an escalation process kicks off that includes executive reporting on meeting the ROPA process metrics. Codifying the ROPA process and metrics, and clearly communicating expectations, benefits both the program and those responsible for the inputs.

The more than 700 people that engaged on the initial project will continue to engage going forward to recertify the ROPAs and support iterative compliance testing – including improvements to the ROPA questionnaires. “This is ongoing,” explains Tara, “and what we refer to as their ‘new normal,’ as the component owners will now have to certify them every year.”

Importantly, this thoughtful approach, inclusive of the communication routines, creates a closed-loop control process that enables continuous improvement of data inputs. This, in turn, improves internal risk assessments and overall privacy governance.

Right now, we’re going back and we’re analyzing all of the ROPAs that we have compiled over the last year. We send a general email to everyone involved. They receive an update request from us via email with a link to an FAQ that communicates what we’re doing; that we have analyzed the ROPAs; and now have follow-up questions that we need you to answer.

This follow-up includes feedback from Privacy indicating possible corrections to questionnaire responses (e.g., opinions on appropriate data retention schedules) and asks for those updates. Notably, however, Privacy also asks for feedback from owners to continually improve and refine the process.

“The responses that we’re getting are really, really good and insightful. Even if they’re unable to complete the ROPA for some reason, they’ll come back to us and say, ‘I tried to complete it but some of the options that you had available don’t fit my product.’ This lets us know that we need to go back and look at the ROPA and offer more options for people who have different types of products.”

Privacy at Scale

Unless your organization is extremely small, using a tool such as WireWheel to manage these assessments is a critical component to the success of your governance program. This is one of those items that may seem like an unnecessary expense as you are starting down the privacy governance path but will prove to be a huge asset as you move into years two and three and move more into the maintenance, testing, and certification phases of the program.

When hearing from experts at large companies, privacy professionals at smaller companies may assume that absent the expansive resources of a Yahoo – audit teams, robust cybersecurity infrastructure, et al. – the programmatic approach advocated is out of reach. Not so.

There are a plethora of available technologies and consulting services that are scale-appropriate for your organization. And keep in mind, the cost to your organization of ad hoc approaches (read burden, dollars, and risk) will likely far exceed the cost of implementing a sound privacy governance framework.

Future proof your privacy program with WireWheel’s Trust Access and Consent Center and WireWheel’s Privacy Operations Manager.

Request Demo
  • Privacy

Leveraging NIST’s Privacy Framework for Privacy Governance

We are seeing a parallel to what the financial and banking industry went through during the early years of Sarbanes Oxley (SOX) implementation. In the same way that we can no longer rely on self-regulation, we are no longer able to rely on disparate compliance mechanisms with little to no enforcement.

—Katie Pimentel, Assistant General Counsel, Yahoo

Whether your organization is just beginning its privacy journey or engaged in operationalizing privacy by design, establishing a framework within which to operate is vital. An ad hoc or loosely defined approach will often result in recurring errors, poorly aligned resources, excessive cost, and ultimately suboptimal outcomes that may put the organization at risk.

Many privacy professionals have come to see the parallels between the maturation process of the information security sector, and as Ms. Pimentel notes, the “early years of SOX implementations.” So why not leverage the lessons learned, available tools, and expertise to ease the initial heavy lifting needed to create an effective and sustainable privacy governance program?

Katie Pimentel, Assistant General Counsel at Yahoo, joined WireWheel CEO Justin Antonipillai, WireWheel Director of Privacy Lisa Barksdale, and her colleague, Yahoo Legal Services Senior Manager Tara Jones, at the breakout session “Rise of the Privacy Operations Leader” at the IAPP Privacy. Security. Risk. 2021 Conference.

Ms. Pimentel kindly spoke with us to offer a small preview of her planned presentation on the value of leveraging the NIST Privacy Framework to achieve effective governance.

Why Choose the NIST Privacy Framework?

According to Katie, “What we noticed with this framework is that it is very high level, but it provided Yahoo with a foundation to build off of and allowed us to pick and choose what parts of it apply to our business and industry and what parts didn’t.”

“The short answer,” says Katie, to why NIST? “is that the Yahoo IT security team (which is aptly named The Paranoids) is already leveraging the NIST Framework from a cybersecurity perspective, so it really made for a nice alignment as a privacy framework.”

And, while other governance-type models were being considered by Yahoo, none fit all the required elements of a privacy governance framework at a top-level quite as nicely as the NIST framework. “That our internal teams understood the framework and controls language – at least from a security perspective – was a big plus,” further notes Ms. Pimentel.

We quickly realized that we needed to have an infrastructure and governance model able to support aggregating the information, putting it into a system, and providing a way to more easily produce ROPAs and associated documentation.

Some of the impetus for adopting an established framework like NIST is to improve responsiveness (read time and cost) to regulatory inquiries.

“Complying with GDPR Article 30 documentation (ROPAs) can take a very long time for organizations to put together. And once you collect all that information, you’re not only maintaining it. You’re testing it. You’re making sure that it’s being updated on a regular basis. You’re making sure that you’re training on it and communicating your policies that back that up.”

It’s a Top-Down, Bottom-Up Approach

The “one size fits all” nature of the NIST Privacy Framework should not be mistaken for the rigidity often associated with solutions that can’t accommodate the unique requirements of individual organizations. By definition, a framework is a top-level structure (the scaffolding or bones) on which to build to your organization’s specific requirements. This translates into significant flexibility. In this regard it is, says Katie, “agnostic.”

“Leveraging this flexibility, we created a regulation crosswalk, within the framework” explains Ms. Pimentel. “We took the top regulations (such as the GDPR, CCPA, and LGPD), and we mapped, or cross-walked, them into the framework.”

We had all the framework controls that we drafted, at least from a high level, put them in a spreadsheet, and mapped which section of those regulations a control satisfied. We could then see that one set of controls or area of the framework helped us meet several of the obligations within each of these regulations.

“It’s not a one-to-one relationship,” continues Katie. “We achieve the 80/20 Pareto Rule where we hit about 80% of them with the crosswalk view. Importantly, the crosswalk allows us to see across all of these different regulations and highlight where there might be gaps.”

The top-down is the framework, the bottom-up is the ROPAs, “and they sort of meet in the middle,” she explains. “The ROPA provides us with the information – the actual data that allows us to understand what types of controls you need, and where risks might exist that indicate the types of reviews and governance we should focus on. The framework drives some of what we’re asking as well, based on our organization and our industry.”

Ultimately, “working within a framework like NIST will provide the needed scale and repeatability” necessary to build a successful privacy governance program.
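
For readers who want a concrete picture of the crosswalk idea described above, here is a minimal sketch in Python. The control IDs, article citations, and coverage logic are illustrative assumptions, not Yahoo’s actual spreadsheet.

```python
# Hypothetical crosswalk: each framework control maps to the regulation
# sections it helps satisfy. Control IDs and citations are illustrative only.
crosswalk = {
    "CTRL-01 (records of processing)": {"GDPR": ["Art. 30"], "LGPD": ["Art. 37"]},
    "CTRL-02 (privacy notices)": {"GDPR": ["Art. 13", "Art. 14"], "CCPA": ["1798.100(b)"]},
    "CTRL-03 (retention limits)": {"GDPR": ["Art. 5(1)(e)"]},
}

regulations = ["GDPR", "CCPA", "LGPD"]

# The crosswalk view: which controls help satisfy each regulation,
# and where there might be gaps to close.
for reg in regulations:
    covered = [ctrl for ctrl, mapping in crosswalk.items() if reg in mapping]
    gaps = [ctrl for ctrl in crosswalk if reg not in crosswalk[ctrl]]
    print(f"{reg}: covered by {len(covered)} control(s); review {len(gaps)} control(s) for gaps")
```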

Future proof your privacy program with WireWheel’s Trust Access and Consent Center and WireWheel’s Privacy Operations Manager.

Request Demo
  • Privacy Law Update

Privacy Law Update: October 18, 2021

Stay up to date with this weekly release covering key developments on data privacy laws, technology, and other hot privacy topics!


The White House is having a big meeting about fighting ransomware. It didn’t invite Russia

The White House has held a meeting with ministers and officials from 30 nations and the European Union to discuss how to combat ransomware and other cyber threats.   The two-day series of meetings aimed to find an answer to ransomware and followed calls from US president Joe Biden for the Kremlin to hold Russia-based ransomware gangs accountable for their file-encrypting attacks, rather than turning a blind eye to them so long as they don’t attack Russian organizations.  Notably absent from the White House-led group was Russia itself, which was not invited. In June, Biden told Russian President Vladimir Putin that 16 US critical infrastructure entities should be off-limits from ransomware attackers operating from Russia.

Biden signs K-12 Cybersecurity Act, and more on children’s privacy

U.S. President Joe Biden signed the K-12 Cybersecurity Act, which aims to protect sensitive information maintained by schools. In a statement, Biden said the bill will address threats to students’ and educators’ privacy created by COVID-19, adding his administration will provide “important tools and guidance to help secure our school’s information systems.”

As data breaches near ‘all-time high,’ Senate committee talks regulation

This week, the United States came just 230 data breaches away from an “all-time high,” according to Identity Theft Resource Center Chief Operating Officer James Lee, but data security requirements either within a federal privacy law or a standalone regulation “would substantially improve data protection” and bring “stronger protections and greater clarity to the marketplace,” Kelley Drye Of Counsel Jessica Rich said.  With 446 reported data breaches from July through September, “we’re in for raising the bar substantially,” said Lee, who shared the statistics with the U.S. Senate Committee on Commerce, Science, and Transportation Wednesday. “Behind all these numbers are people,” he said. “They’re victims.”  The committee held its second privacy hearing in two weeks, with former members of the Federal Trade Commission testifying last week on a lack of resources within the agency to handle privacy and data protection challenges, and in support of a budget reconciliation package that would give the agency $1 billion over 10 years for a new privacy and data security division.

Irish privacy watchdog endorses Facebook’s approach to data protection

A draft decision from Ireland’s Data Protection Commissioner (DPC) endorsing Facebook’s legal basis for processing personal data has been met with criticism by a data protection activist who says the platform is trying to bypass EU privacy laws.

Privacy Legislation

Personal Data Protection Law enacted in Saudi Arabia

On 24 September 2021, the long anticipated Personal Data Protection Law, promulgated by Royal Decree No. M/19, dated 09/02/1443H (corresponding to 16 September 2021) (“Law”), was published in the Saudi Official Gazette (Umm AlQura). The Law was developed by the Saudi Data and Artificial Intelligence Authority (SDAIA), which will be the competent governmental authority (“Data Authority”) to administer the Law for a period of two years but it may thereafter transfer such competence to the National Data Management Office (NDMO). The Law will come into effect on 23 March 2022, at which time the Data Authority will be required to issue the Law’s implementing regulations (“Regulations”). Controllers (as defined below) will have one year from the effective date to comply with the Law.

China’s draft algorithm regulations: A first for consumer privacy

The People’s Republic of China broke new ground by announcing draft regulations on the widespread use of algorithmic recommendation technology. The regulations are, according to one expert, the first of their kind globally. And because China will soon exceed one billion internet users — roughly 20% of global internet users — these regulations will cover nearly one in five users on earth.

Future proof your privacy program with WireWheel’s Trust Access and Consent Center and WireWheel’s Privacy Operations Manager.

Request Demo
  • CCPA & CPRA
  • Privacy
  • Regulations

How Privacy Laws Could Help Regulate Facebook’s Algorithms

Congressional testimony from a former Facebook employee has sparked outrage over the governance of the company’s algorithms and has renewed calls for regulation of the social media giant.

Although privacy laws ostensibly focus on data, GDPR in Europe and a set of emerging laws in the U.S. are well-positioned to play an important role in how regulators might govern Facebook’s algorithms. A regulatory action on Facebook could serve as a harbinger for future regulation over media, finance and other technology-driven companies that rely on algorithms and personalization to power their digital experiences.

The main thrust of the testimony given by Frances Haugen, the former Facebook product manager who shared extensive internal documents with Congress, focused on the algorithms used to determine the content users see in their newsfeeds. Haugen’s criticism was simple and familiar: these algorithms, optimized for engagement, have deleterious effects on our individual and collective health — and yet users and regulatory bodies have no insight into, or control of, the decisions embedded in their code.

Modern privacy laws focus primarily on giving users power over the personal or sensitive information organizations collect. However, these laws also provide individuals with rights over the processes and algorithms that put those data into action.

“Privacy laws are not simply about protecting what companies know about you,” said Rick Buck, Chief Privacy Officer at WireWheel. “They are also about protecting what organizations can do with that information.”

Privacy laws regulate algorithms by granting users rights over two types of activities: data profiling, the creation and segmentation of users based on personal data, and automated decision-making, which often is based on those profiles. These provisions are already active in Europe (GDPR), Brazil (LGPD), and China (PIPL) and will come into effect in California (CPRA), Colorado (CPA) and Virginia (CDPA) in 2023.

What is Data Profiling and Automated Decision-Making — and Why Does it Matter?

Data profiling is broadly defined as “automated processing of personal data to evaluate certain things about an individual.” Profiling is frequently used in digital advertising, where marketers create “profiles” of users (e.g., “high income”, “repeat buyer”) to influence ads, but it also supports an array of other non-advertising use cases (e.g., personalization, automated loans).

Automated decision-making refers to “the process of making a decision [based on that profile] by automated means without any human involvement.” This includes a range of activities, from an online decision to award a loan, to an aptitude test used for recruitment, to an ad targeted to a user, and, in Facebook’s case, a decision about what we see in our newsfeeds. GDPR grants users the right not to be subject to decisions based solely on automated processing that produces “legal, or similarly significant effects.”

A Deep Dive Into Automated Decision-Making (Future of Privacy Forum)

Our colleagues at the Future of Privacy Forum have put together a deep dive on automated decision-making considerations for policymakers, including a chart comparing existing requirements around automated decision-making between GDPR (Europe), CPRA (California), and CDPA (Virginia).

These provisions could have a massive impact on technology platforms like Facebook. If the effect of the algorithmic decision or profile is deemed “significant,” as Haugen argues it is for Facebook, individuals will have the right to demand an alternative experience in which the content in their newsfeed is not determined by the more advanced algorithm (potentially, for example, a reverse-chronological timeline).

The impact here extends well beyond social media. A host of other industries, including traditional media, digital finance, and e-commerce, rely on algorithms to sort content, provide recommendations, or evaluate customers in what could be considered a “significant” way. A regulatory action against Facebook could lead regulators to pay closer attention to the way algorithms impact user experiences and rights in these areas as well.

“Algorithms are the next frontier of privacy,” says Buck. “Businesses not only need to consider how they might enable users to opt-out or opt-in of ‘personalized’ experiences at scale, but what a viable alternative experience might look like.”

  • Marketing

Where Are Companies on the Consent Management Spectrum?

Written by Jeremy S. Berkowitz, Senior Principal, Global Privacy
Promontory Financial Group, an IBM Company


As a Senior Principal in the Promontory Financial Group’s privacy and data protection practice, I consult with companies on a variety of privacy issues related to compliance, risk, governance, and record retention/deletion. I have seen clients both large and small struggle to understand privacy consent management requirements. With impending new regulations—both Virginia and Colorado have recently passed their own state privacy laws, and the California Privacy Rights Act (CPRA), a more stringent version of the California Consumer Privacy Act, will go into effect in January 2023—it’s a challenge that will become tougher over time.

In the past, companies largely viewed data privacy and compliance as a necessary evil to avoid the backlash and PR nightmares that came with significant data breaches. However, companies are now beginning to realize the value of being good guardians of consumer data. That idea has taken off in the last decade or so, with savvy entities (such as Apple) building advertising campaigns around a reputation as a “guardian of your data.”

That’s not to say that every company needs to loudly proclaim itself a “data guardian.” Most companies, however, have seen the writing on the wall. They know that even if they do not do business in California and are therefore not subject to the CPRA, it’s crucial to make investments in data privacy and compliance now, as a patchwork of new state laws—largely based on California’s, but with slightly differing nuances—looms on the horizon. That’s why taking the first step—recognizing where your company is on the maturity scale and building a plan to advance—is so critical.

Often, companies think they can apply an off-the-shelf data privacy product and that will solve their problems. However, there’s more to it than that. It takes work to integrate products into their existing infrastructure, not to mention developing the change management procedures required to make sure the foundation is in place to properly manage data privacy concerns. There’s more to advancing along the maturity scale than applying a technological solution—although that’s certainly an important part of it. Think of it as more of an evolution, with many moving parts involving technology, process improvements, governance, and so on.

Much of the work I do with clients centers on helping them make the best choices to comply with consent management regulations. That means reviewing a company’s existing technology capabilities, helping them understand some of the choices they provide to customers from a compliance point of view, and working with them to define these rules and determine what next steps they need to take to improve their consent management processes.

The maturity scale

In broad terms, a company’s position on the consent management “maturity scale” depends on how consent preferences are collected, how they are shared with vendors or other organizations, whether there is a governing structure around consent management, whether preferences are managed manually or through a CRM system, and whether any metrics are in place that allow the organization to determine whether it is following these rules.

Level 1: Ad Hoc

For companies in the “ad hoc” phase, managing consents and preferences is often disorganized. At this level, consent preferences and rules are often manually updated.

This means that consent management is a laborious process, prone to error and misjudgment. Most often, there is no centralized database for consent preferences, and information tends to be siloed. At this stage, consent preferences are not easily shared with others within the organization.

Level 2: Developing

At this level, if consent preferences are captured at all, it is usually in a rudimentary way—such as through email “unsubscribe” links that let consumers opt in or opt out. There is no mechanism for consumers to know what data the company may have about them—and the company usually doesn’t have it organized in any logical way.

Here, companies may also have the beginnings of a consent management strategy, but it is not fully fleshed out. At this stage, companies may have a database of consent preferences, but—again—it is most likely siloed from other critical areas of the company where consent preferences would come into play. Marketing, for example, may not have a process or procedure in place that links it to the consent preferences database.

At this level there will also be early attempts at establishing some accountability for data privacy governance, along with some basic framework for how to deal with privacy choices and preferences. However, this framework may not yet be fully adopted.

Level 3: Defined

At this level, consent preferences are handled in a much more organized fashion, although there still exists some room for improvement. Consent preferences are captured on the company’s website and managed separately according to product or service, rather than under a single umbrella.

There’s also much more engagement with developing and maintaining a consistent and coherent data privacy strategy, with regular reviews of policies, rules, and progress.

Level 4: Managed

At the “managed” level, consents and preferences are stored and handled on a central CRM system, and you’ll also find that stakeholders within the company have a thorough understanding of what data is collected, where it’s managed, and how the consent and preferences framework is designed to function.

At this point, consent rules and notice requirements are fully developed and documented, with all stakeholders on the same page. Consent preferences have also evolved enough so that “just-in-time” consent (such as location-based or marketing-based consent obtained at the time of service) can be easily provided.

Level 5: Optimized

Companies at this level have developed a robust and fully mature consent management framework. They have invested considerable resources toward crafting a governance policy that informs their decision-making, and leverage data privacy technology to ensure compliance at every step. And one of the key points here is that companies at this level have also developed consent management metrics, so that they can track progress and spot issues before they become big problems.

Companies with an “optimized” level of consent management employ a privacy hub for preferences that users can access and update in real time, with consent linked directly to browser-level data collection and use preferences and integrated into content. That means much of the manual work of managing consents and preferences has been automated—including automatically sharing consent preferences with third parties, such as vendors.
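
To make that picture a little more concrete, here is a minimal sketch of a centralized consent-preference record with automated propagation to vendors. The field names and the notify_vendor helper are hypothetical and not tied to any particular consent platform.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ConsentPreference:
    """Hypothetical centralized consent record (one purpose per entry)."""
    subject_id: str
    purpose: str          # e.g. "email_marketing", "personalization"
    granted: bool
    updated_at: datetime

def notify_vendor(vendor: str, pref: ConsentPreference) -> None:
    # Placeholder for an automated downstream sync (API call, webhook, etc.).
    print(f"Sync to {vendor}: {pref.subject_id} {pref.purpose} -> {'opt-in' if pref.granted else 'opt-out'}")

def update_preference(pref: ConsentPreference, granted: bool, vendors: list[str]) -> None:
    # An "optimized" program updates the central record and pushes the change
    # to every third party automatically, rather than by manual follow-up.
    pref.granted = granted
    pref.updated_at = datetime.now(timezone.utc)
    for vendor in vendors:
        notify_vendor(vendor, pref)

pref = ConsentPreference("user-123", "email_marketing", True, datetime.now(timezone.utc))
update_preference(pref, granted=False, vendors=["email-platform", "ad-partner"])
```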

Next steps

Companies looking to enhance their consent management capabilities should consider taking the following steps:

  • Review their current consent management practices with internal stakeholders to understand the processes, systems, and tools used to collect and manage customer preferences
  • Determine their current consent management maturity as well as where they would like their maturity to be in the short- and long-term
  • Develop a roadmap for improving their maturity through means such as improving governance processes, upgrading IT capabilities, drafting new policies, and procuring new tools

Future proof your privacy program with WireWheel’s Trust Access and Consent Center and WireWheel’s Privacy Operations Manager.

Request Demo
  • Privacy Law Update

Privacy Law Update: October 11, 2021

Stay up to date with this weekly release covering key developments on data privacy laws, technology, and other hot privacy topics!

Newsworthy Updates

FTC alum Ashkan Soltani selected to lead CPPA

It was always going to be interesting to see who would be appointed the inaugural leader of the California Privacy Protection Agency. With the hiring process mostly closed-door and unpublicized, the selection was bound to catch people by surprise and did just that on Monday.  The CPPA announced Ashkan Soltani, former chief technologist for the U.S. Federal Trade Commission and senior advisor to the White House, will be its first executive director. Soltani was a key player in the drafting of the California Consumer Privacy Act and the California Privacy Rights Act while also a leading voice and advocate for the Global Privacy Control initiative.

FTC Chair Khan’s Vision for Privacy – and Some Dissents

Last week, we wrote about FTC Chair Khan’s memo describing her plans to transform the FTC’s approach to its work. This week, she followed up with a no-less-ambitious statement laying out her vision for data privacy and security, which she appended to an agency Report to Congress on Privacy and Security (“report”). Together, these documents outline a remarkably far-reaching plan to tackle today’s data privacy and security challenges. As noted in the dissents, however, some of the stated goals may exceed the bounds of the FTC’s current legal authority.

Facebook whistleblower’s revelations boost federal privacy law chatter

Members of the U.S. Senate Committee on Commerce, Science, and Transportation’s Subcommittee on Consumer Protection, Product Safety, and Data Security used a hearing with Facebook whistleblower Frances Haugen to lament the need for Congress to act on federal privacy legislation, The Wall Street Journal reports. Sen. Amy Klobuchar, D-Minn., explicitly called for the drafting of a comprehensive privacy law during the hearing while characterizing Haugen as “the catalyst for that action.” Haugen added that simply updating existing U.S. privacy laws “will not be sufficient.”

How Product Leads Should Be Thinking About Educating Users on Data Collection

Consumers are more concerned about data privacy than ever. Through data breaches, legislation changes and shifts in technology, consumers have learned the importance of keeping their data safe and they’re short on patience for companies that don’t respect their security. Privacy has become even more important since the onset of the pandemic, which has shifted content consumption to even more digital channels where consumer data can be collected and leveraged for ad revenue.

European Data Protection Board Establishes Cookie Banner Taskforce, Which Will Also Look Into Dark Patterns and Deceptive Designs

The European Data Protection Board (“EDPB”), a body with members from all EEA supervisory authorities (and the European Data Protection Supervisor), has recently established a taskforce to coordinate the response to complaints concerning the compliance of cookie banners filed with several European Economic Area (“EEA”) Supervisory Authorities (“SAs”) by the non-profit organisation NOYB. NOYB believes that many cookie banners, including those of ‘major’ companies, engage in “deceptive designs” and “dark patterns”.

Standardizing data-processing agreements globally

Privacy professionals around the world are feverishly working on configuring and implementing the European Union’s new standard contractual clauses. Effective Sept. 27, companies in the European Economic Area entering into new cross-border data transfer arrangements with companies outside the EEA based on SCCs must adopt the new versions. Any recipient that signs the new SCCs promises it has matching agreements in place with its own vendors according to Clauses 8.8 and 9. Myriad businesses are affected because every company has numerous affiliated and unaffiliated vendors and other business partners worldwide. To remain open to businesses from the EEA, all companies need to have the new SCCs in place by Sept. 27.

Privacy Legislation

  • Massachusetts Legislature to hold privacy hearing:  The Massachusetts Legislature’s Joint Committee on Advanced Information Technology, the Internet and Cybersecurity will hold a virtual hearing Oct. 13 to consider data privacy-related bills. The committee’s agenda features at least seven bills proposed by state lawmakers from both chambers that cover data privacy matters, including frameworks for comprehensive state privacy law, biometric privacy and education privacy. Meanwhile, Northeastern University School of Law and College of Computer and Information Science Professor Woodrow Hartzog wrote an op-ed supporting consideration of Bill S.46, the Massachusetts Information Privacy Act.
  • Connecticut Tightens its Data Breach Notification Laws: Effective October 1, 2021, an amendment to the Connecticut General Statute concerning data privacy breaches, Section 36a-701b, will impact notification obligations in several significant ways. The amendment:
    • Expands the definition of “personal information”;
    • Shortens the notification deadline after discovery of a breach from 90 to 60 days;
    • Removes the requirement to consult with law enforcement as part of a risk assessment;
    • Deems compliant any person subject to and in compliance with HIPAA and HITECH; and
    • Provides certain exemptions from public disclosure for materials provided to the state in response to an investigation of a breach of security.

Future proof your privacy program with WireWheel’s Trust Access and Consent Center and WireWheel’s Privacy Operations Manager.

Request Demo
  • CCPA & CPRA
  • Privacy
  • Regulations

Personal Data Definitions: Comparing GDPR vs CCPA vs CDPA vs CPA

Introduction

‘Personal Data’ has different legal definitions under the GDPR, the CCPA in California, the CDPA in Virginia, the LGPD in Brazil, and other regulations.

Although personal data is sometimes used interchangeably with PII, or personally identifiable information, “personal data” under the GDPR has a more specific and strict definition, with specific examples, and is therefore different from (broader than) PII.

Unfortunately for organizations, there is currently no global standard legal definition of personal data. While regulations generally follow a common approach, some frameworks are very specific and provide actual examples of personal data, while others are more vague and subject to interpretation.

If your organization operates in multiple jurisdictions, you will first need to understand the definitions under each regulation and which regulation(s) apply to the data you collect, use and store.

This will allow you to answer questions such as:

  • Which systems and processes store or use data covered under the different regulations?
  • What is my company’s obligation regarding the data?
  • How can I make sure that my company complies today and into the future?

Below, we will review the current definitions of personal data under key global data privacy and protection regulations.

Curious how personal data, data breach requirements, fines, and other provisions differ across key privacy frameworks? Access the Privacy Law Comparison Table to find out!

Compare Now

Personal Data Under CCPA

The CCPA established eleven categories of personal information and provided examples to illustrate most of these categories:

  • Identifiers: Name, alias, postal address, unique personal identifier, online identifier, Internet Protocol (IP) address, email address, account name, social security number, driver’s license number, passport number, or other similar identifiers
  • Customer records information: Name, signature, social security number, physical characteristics or description, address, telephone number, passport number, driver’s license or state identification card number, insurance policy number, education, employment, employment history, bank account number, credit or debit card number, other financial information, medical information, health insurance information
  • Characteristics of protected classifications under California or federal law: Race, religion, sexual orientation, gender identity, gender expression, age
  • Commercial information: Records of personal property, products or services purchased, obtained, or considered, or other purchasing or consuming histories or tendencies
  • Biometric information: Hair color, eye color, fingerprints, height, retina scans, facial recognition, voice, and other biometric data
  • Internet or other electronic network activity information: Browsing history, search history, and information regarding a consumer’s interaction with an Internet website, application, or advertisement
  • Geolocation data
  • Audio, electronic, visual, thermal, olfactory, or similar information
  • Professional or employment-related information
  • Education information: Information that is not “publicly available personally identifiable information” as defined in the California Family Educational Rights and Privacy Act (20 U.S.C. section 1232g, 34 C.F.R. Part 99)
  • Inferences

The CCPA does not consider publicly available information from federal, state, or local government records, such as professional licenses and public real estate/property records, to be personal information.

In addition, the CCPA does not treat as personal data information that has been pseudonymized and de-identified, or aggregated and de-identified, because it cannot reasonably be linked to an individual.

One of the key differences between the CCPA and the GDPR is that the GDPR applies only to information about an individual, while the CCPA also covers information specific to a household.

To read more about the official definition of personal data under the CCPA, click here to access the official text (Section 1798.140.(o))

Personal Data under CPRA

The CPRA follows the definitions of “personal data” adopted in CCPA. However, the CPRA introduces specific categories of “sensitive data” defined as “personal information that reveals:

  • A consumer’s social security, driver’s license, state identification card, or passport number,
  • A consumer’s account log-in, financial account, debit card, or credit card number in combination with any required security or access code, password, or credentials allowing access to an account,
  • A consumer’s precise geolocation,
  • A consumer’s racial or ethnic origin, religious or philosophical beliefs, or union membership,
  • The contents of a consumer’s email and text messages, unless the business is the intended recipient of the communication,
  • A consumer’s genetic data.

You can learn more about the new sensitive data categories under CPRA by clicking here (on page 23, 1798.140.(ae)).

Personal Data Under Virginia CDPA

Under the CDPA, “personal data” means “any information that is linked or reasonably linkable to an identified or identifiable natural person.” “Personal data” does not include “de-identified data or publicly available information.”

Unlike the CCPA, the CDPA does not provide examples of categories of personal information.

Like CCPA, the definition in CDPA excludes any de-identified data and publicly available information. Publicly available information is defined as “information that is from federal, state, or local government records”.

In addition, the CDPA adds to its definition of publicly available “information that a business has a reasonable basis to believe is lawfully made available to the general public through widely distributed media, by the consumer, or by a person to whom the consumer has disclosed the information unless the consumer has restricted the information to a specific audience.”

Similar to the CPRA, the CDPA introduces the definition of “sensitive data” which includes:

  • Personal data revealing racial or ethnic origin, religious beliefs, mental or physical health diagnosis, sexual orientation, or citizenship or immigration status,
  • The processing of genetic or biometric data for the purpose of uniquely identifying a natural person,
  • The personal data collected from a known child, or
  • Precise geolocation data, which is defined as information derived from technology, including but not limited to global positioning system level latitude and longitude coordinates or other mechanisms, that directly identifies the specific location of a natural person with precision and accuracy below 1,750 feet.

You can access the definitions of personal and sensitive data under the CDPA by clicking here (59.1-571- Definitions).

Personal Data Under Colorado CPA

The definition of ‘Personal Data’ under the CPA is closely related to that of Virginia’s CDPA and states that “personal data means:

  • (a) information that is linked or reasonably linkable to an identified or identifiable individual, and
  • (b) does not include de-identified data or publicly available information.”

As used in this subsection (17)(b), “publicly available information” means information that is lawfully made available from federal, state, or local government records and information that a controller has a reasonable basis to believe the consumer has lawfully made available to the general public.”

In addition, the Colorado CPA does not include data “maintained for employment records purposes.”

Similar to the CDPA and CPRA, the CPA defines “sensitive data” to mean:

(a) personal data revealing racial or ethnic origin, religious beliefs, a mental or physical health condition or diagnosis, sex life or sexual orientation, or citizenship or citizenship status,

(b) genetic or biometric data that may be processed for the purpose of uniquely identifying an individual, or

(c) personal data from a known child.

To read more about the definitions of personal and sensitive data, please refer to the official text by clicking here (on page 8, 6-1-1303.(17) and on page 10, 6-1-1303.(24)).

Personal Data Under GDPR

Under the GDPR, “Personal Data means any information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person.”

In addition, the European Commission clarified the above on its website via the Q&A section by mentioning that:
“Personal data is any information that relates to an identified or identifiable living individual. Different pieces of information, which collected together can lead to the identification of a particular person, also constitute personal data.

Personal data that has been de-identified, encrypted or pseudonymised but can be used to re-identify a person remains personal data and falls within the scope of the GDPR.

Personal data that has been rendered anonymous in such a way that the individual is not or no longer identifiable is no longer considered personal data. For data to be truly anonymised, the anonymisation must be irreversible.

The GDPR protects personal data regardless of the technology used for processing that data – it’s technology neutral and applies to both automated and manual processing, provided the data is organised in accordance with pre-defined criteria (for example alphabetical order). It also doesn’t matter how the data is stored – in an IT system, through video surveillance, or on paper; in all cases, personal data is subject to the protection requirements set out in the GDPR.”

The website also lists examples of personal data under GDPR. These examples include:

  • a name and surname
  • a home address
  • an email address such as name.surname@company.com
  • an identification card number
  • location data (for example the location data function on a mobile phone)
  • an Internet Protocol (IP) address
  • a cookie ID
  • the advertising identifier of your phone
  • data held by a hospital or doctor, which could be a symbol that uniquely identifies a person

As importantly, it also lists examples of what is not considered personal data. These examples are:

  • a company registration number
  • an email address such as info@company.com
  • anonymised data

The GDPR also makes a clear distinction between personal data and sensitive data via the “special categories.” The special categories include:

  • Race and ethnic origin
  • Religious or philosophical beliefs
  • Political opinions
  • Trade union memberships
  • Biometric data used to identify an individual
  • Genetic data
  • Health data
  • Data related to sexual preferences, sex life, and/or sexual orientation

The processing of special category data is prohibited unless:

  • “Explicit consent” has been obtained from the data subject, or,
  • Processing is necessary in order to carry out obligations and exercise specific rights of the data controller for reasons related to employment, social security, and social protection, or,
  • Processing is necessary to protect the vital interests of data subjects where individuals are physically or legally incapable of giving consent, or,
  • Processing is necessary for the establishment, exercise, or defence of legal claims, for reasons of substantial public interest, or reasons of public interest in the area of public health, or,
  • For purposes of preventive or occupational medicine, or,
  • Processing is necessary for archiving purposes in the public interest, scientific, historical research, or statistical purposes, or,
  • Processing relates to personal data which are manifestly made public by the data subject, or,
  • Processing is carried out in the course of its legitimate activities with appropriate safeguards by a foundation, association or any other not-for-profit body with a political, philosophical, religious or trade union aim and on condition that the processing relates solely to the members or to former members of the body or to persons who have regular contact with it in connection with its purposes and that the personal data are not disclosed outside that body without the consent of the data subjects

To access more information about the data in scope under GDPR, please refer to the official GDPR website (Article 4 – Definitions and Article 9 – Processing of special categories of personal data)

Conclusion

As you can see, the definitions of personal data vary from one privacy regime to the next. Make sure you have a good understanding of these legal definitions before you work on your data inventory and data mapping initiatives. This is the foundational step of any robust privacy program.

To compare the definitions of “Personal Data” and “Sensitive Data” side-by-side for all these regulations and others such as China’s PIPL, Canada’s PIPEDA, or Brazil’s LGPD, please check our Interactive Privacy Table.

Future proof your privacy program with WireWheel’s Trust Access and Consent Center and WireWheel’s Privacy Operations Manager.

Request Demo
  • CCPA & CPRA
  • Privacy
  • Regulations

The DSAR Guide: Overview of Data Subject Access Requests

Last Updated: November 5, 2021


What is a DSAR?

Data Subject Access Requests (DSARs) give individuals (also known as data subjects) the right to discover what data an organization holds about them, why it holds that data, and to whom their data and other personal information is disclosed.

DSAR is a term introduced by the European Union’s General Data Protection Regulation (EU GDPR), which refers to individuals as “data subjects.” It is often used interchangeably with the terms “Subject Rights Request” (SRR) and “Privacy Rights Request.”

Depending on the law, data subjects (which can be consumers and, in the case of GDPR, employees) may have the right to:

  • Access the data that your company has collected on them and/or the categories of data collected
  • Delete the personal data that companies have collected
  • Correct the data
  • Opt out of the sale of personal data
  • Opt out of data processing
  • Port personal data

They exercise these rights via Data Subject Access Requests, also known as DSAR requests or simply DSARs.


Table of Contents

What are the DSAR requirements?
What do I need for DSAR compliance?
Additional Resources


What are the DSAR requirements?

DSAR rules and requirements

Multiple trading blocs (the EU with GDPR), countries (such as Brazil and China), and U.S. states have data privacy laws that outline Data Subject Access Request requirements. Each can grant different access rights to different people.

Whether or not you have to fulfill DSARs depends on:

  • Where you do business
  • The size of your business
  • The type of business
  • How you are using personal data
  • Geographical location of where the data is stored
  • Geographical location of the people whose data is stored

For GDPR, CCPA/CPRA (California), CDPA (Virginia), and CPA (Colorado), companies must comply if they are (an illustrative applicability check follows the lists below):

CCPA/CPRA (California)

For-profit entities that collect personal information from California residents and meet any of the following thresholds:

  • At least $25 million in gross annual revenue;
  • Buys, sells, or receives personal information about at least 50,000 California consumers, households, or devices for commercial purposes;* or
  • Derives more than 50% of its annual revenue from the sale of personal information.*

*When the CPRA goes into effect on January 1, 2023:
the second threshold above is replaced with “buys, sells or shares personal information of 100,000 or more California residents or households”; and
the third threshold above is replaced with “derives 50% or more of annual revenue from selling or sharing California personal information.”

CDPA (Virginia)

For-profit entities that conduct business in Virginia or offer products or services targeted to residents in Virginia and:

  • Control or process the data of at least 100,000 consumers; or
  • Control or process the data of at least 25,000 consumers and derive more than 50% of revenue from the sale of personal data.

CPA (Colorado)

Legal entities that:

  • Conduct business or produce products or services that are intentionally targeted to Colorado residents; and
  • Either control or process personal data of more than 100,000 consumers per calendar year; or
  • Derive revenue or receive a discount on the price of goods or services from the sale of personal data and control or process the personal data of at least 25,000 consumers.
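
The thresholds quoted above lend themselves to a simple programmatic check. The sketch below is a rough illustration of how they might be encoded; the BusinessProfile fields are invented, and the cutoffs simply mirror the figures quoted in this post.

```python
from dataclasses import dataclass

@dataclass
class BusinessProfile:
    """Illustrative inputs only -- figures follow the thresholds quoted above."""
    annual_revenue_usd: float
    ca_consumers_sold_or_shared: int       # CA residents/households (CPRA counting)
    pct_revenue_from_selling_data: float   # 0.0 - 1.0
    va_consumers_processed: int
    co_consumers_processed: int

def cpra_applies(b: BusinessProfile) -> bool:
    return (b.annual_revenue_usd >= 25_000_000
            or b.ca_consumers_sold_or_shared >= 100_000
            or b.pct_revenue_from_selling_data >= 0.50)

def va_cdpa_applies(b: BusinessProfile) -> bool:
    return (b.va_consumers_processed >= 100_000
            or (b.va_consumers_processed >= 25_000 and b.pct_revenue_from_selling_data > 0.50))

def co_cpa_applies(b: BusinessProfile) -> bool:
    # Simplified: any revenue or discount from selling data is treated as a single flag here.
    sells_data = b.pct_revenue_from_selling_data > 0
    return (b.co_consumers_processed > 100_000
            or (sells_data and b.co_consumers_processed >= 25_000))

biz = BusinessProfile(30_000_000, 120_000, 0.10, 80_000, 20_000)
print(cpra_applies(biz), va_cdpa_applies(biz), co_cpa_applies(biz))
```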

For more information on which privacy laws may apply to your business, take a look at our interactive privacy law table.

Once you know which laws are applicable to your business, you need to know which privacy rights can be requested and who can make a DSAR request.

Depending on the jurisdiction, your consumers will be able to send different types of DSAR requests that you will have to collect, verify, fulfill, and store. Although the response time is consistent across US state privacy laws (45 days, plus one 45-day extension, for a total of 90 days), it is important to note that both the CCPA and CPRA require a 15-day response time for opt-out requests.
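
As a quick worked example of those timelines (45 days, one 45-day extension, and a 15-day window for CCPA/CPRA opt-out requests), here is a small, illustrative deadline calculator; the function and field names are assumptions for this sketch.

```python
from datetime import date, timedelta

def dsar_deadlines(received: date, request_type: str = "access") -> dict:
    """Illustrative only: deadlines per the US state timelines described above."""
    if request_type == "opt_out":          # CCPA/CPRA opt-out requests
        return {"respond_by": received + timedelta(days=15)}
    return {
        "respond_by": received + timedelta(days=45),
        "respond_by_with_extension": received + timedelta(days=90),  # one 45-day extension
    }

print(dsar_deadlines(date(2021, 11, 1)))
print(dsar_deadlines(date(2021, 11, 1), "opt_out"))
```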

For more information on other laws, please check our privacy law matrix.

What do I need for DSAR compliance?

Understand the definitions of personal data

Each regulation has its own definition of what personal data constitutes. While they all follow the same approach in protecting key personal information, there are divergences in the details.

In the case of an access request, your organization has to provide all the personal data back to the consumer. This means that you need to have a clear understanding of the definition of “personal data” for each regulation and have this documented internally.

An access request does not require you to compile and provide every single data point you have on a given data subject. You only need to deliver the personal data you hold about the consumer.

Linking the personal data definitions to your own data processes is the foundation of any robust DSAR program as it will allow you to:

  • Gain a holistic view of the personal data your organization collects
  • Spot potential risks or redundancies early
  • Recommend operational data management efficiencies to your organization
  • Become a trusted partner to the business functions
  • Build efficient DSAR operations by focusing on the relevant requirements per regulations
  • Scale your entire privacy program as regulations, people, systems and processes change
  • Deliver fast, complete and structured responses to your privacy-centric consumers

Helpful Tip:

This important exercise will not only enable your team to be better prepared and more efficient but will also position the privacy team as the team with the most up-to-date overall understanding of the data flows at your organization. Teams such as Data Science, Analytics, Business Intelligence, Operations, Product, or even software developers will likely want to use your data intelligence for their own projects. It is a great way to leverage what you have built to support other business outcomes, re-emphasize the importance of data privacy, and identify your privacy champions.

To better understand how the definitions of personal data differ across privacy regulations, check out this blog post covering personal data under CCPA and CPRA, GDPR, Virginia CDPA, and Colorado CPA.

Plan your DSAR operations ahead

Before you begin to think about DSAR fulfillment, you first have to assess the data situation at your company and gain a good understanding of the nature, location, and flow of your data. The questions below can guide that assessment (a simple inventory sketch follows the list).

  1. What type of data are you collecting or observing? Having a clear understanding of the personal data, anonymized data and public data you have and collect in your systems is a crucial step to speed up your entire privacy operations.
  2. Where is the data stored in your organization? Identify and map where the personal data is held in your organization and identify the correct owner.
  3. Where does your company send or store personal data outside of your organization? Deletion requests may involve not only team members around your organization, but also all external vendors and partners with whom you shared the personal information.
  4. How is the data being used? The Virginia CDPA and Colorado CPA impose a duty on controllers to avoid secondary use of personal data. Understanding how the data is being used internally is a key step in building your data intelligence.
  5. What are your protocols surrounding personal data management? Regulations like GDPR or CPRA include data minimization and retention principles that will push companies to think about how they handle data internally.
  6. Who are the team members who will help to fulfill any requests? System owners, IT, and your legal team are likely the teams you will rely on to fulfill these requests.

Develop a process to fulfill DSAR requests

Robust DSAR operations start with understanding the key steps in the lifecycle of a DSAR and some of the challenges you will likely encounter.

A standard DSAR process can be broken down into five steps (a minimal sketch of this lifecycle follows the list):

  • Intake: Collect the information from the requestor. You need to collect just enough information so that you can identify them.
  • Process: Once a request comes in, it needs to be validated; once validated, it is placed in a queue for fulfillment.
  • Fulfillment: All of the data related to the request has to be collected from throughout the company. This can be done through manual routing or through automated integrations.
  • Delivery: The information has to be delivered to the requestor in a secure way.
  • Reporting: Your team will need to track key DSAR operations metrics and set up dashboards or prepare regular reports for your leadership team.
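
To make the lifecycle concrete, here is a minimal, purely illustrative Python sketch of a DSAR record moving through these five stages. The stage names, fields, and 45-day window are assumptions for the example, not WireWheel's implementation or a regulatory requirement.

```python
from dataclasses import dataclass, field
from datetime import date, timedelta
from enum import Enum, auto


class Stage(Enum):
    INTAKE = auto()
    PROCESS = auto()      # validation and queueing
    FULFILLMENT = auto()  # data retrieval across systems
    DELIVERY = auto()     # secure delivery to the requestor
    REPORTING = auto()    # metrics and audit logging


@dataclass
class DSAR:
    requestor_email: str
    request_type: str   # e.g. "access", "deletion", "opt-out"
    jurisdiction: str   # e.g. "CA", "VA", "CO", "EU"
    received_on: date
    stage: Stage = Stage.INTAKE
    history: list = field(default_factory=list)

    def advance(self, next_stage: Stage) -> None:
        """Move to the next stage and keep an audit trail of transitions."""
        self.history.append((self.stage, next_stage, date.today()))
        self.stage = next_stage

    def due_date(self, days: int = 45) -> date:
        """Illustrative 45-day response window counted from receipt."""
        return self.received_on + timedelta(days=days)


# Example: a California access request moving through the lifecycle.
request = DSAR("jane@example.com", "access", "CA", date(2021, 10, 1))
request.advance(Stage.PROCESS)
request.advance(Stage.FULFILLMENT)
print(request.stage, request.due_date())
```

In practice, each stage transition would also trigger notifications, reminders, and audit logging, but the basic shape stays the same.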

Understand the key operational challenges and risks in your DSAR operations

You now have a good understanding of your data and of the end-to-end DSAR process. Every step will bring its own set of challenges, which may be more or less acute depending on your organization.

Here are some of the potential challenges you may face.

At the intake step

  • Easy submission: Make it easy for your consumers to exercise their rights – if you offer them multiple options (form, email, phone, store), you will need a way to centralize these requests.
  • Identity verification: Ask your unauthenticated users or non-account holders for enough information upfront to validate their identity. Keep in mind that CCPA, for example, has guidelines on the information you should request based on the request type (§ 999.323 and § 999.325). In addition, the 45-day response deadline starts when a business receives a request, regardless of when it verifies the request, so make this process efficient to leave yourself enough time (see the deadline sketch after this list).
  • Receipt confirmation: Make sure you confirm receipt of the request promptly. Not only is it the appropriate thing to do for your consumers, but CCPA also requires businesses to send a confirmation of receipt within 10 days of an access or deletion submission (§ 999.313(a)).
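
Because the response clock starts at receipt, it helps to compute the key dates the moment a request lands. The sketch below is illustrative only: the 10-day confirmation, 45-day response, and 45-day extension figures mirror the CCPA timelines mentioned above, the function name is hypothetical, and it uses calendar days throughout (check whether business or calendar days apply to each deadline under the applicable rules).

```python
from datetime import date, timedelta


def dsar_deadlines(received_on: date, extension_granted: bool = False) -> dict:
    """Compute illustrative CCPA-style deadlines from the date of receipt.

    Assumes calendar days throughout; confirm whether business or calendar
    days apply to each deadline under the applicable regulation and guidance.
    """
    response_days = 90 if extension_granted else 45
    return {
        "confirm_receipt_by": received_on + timedelta(days=10),
        "respond_by": received_on + timedelta(days=response_days),
    }


print(dsar_deadlines(date(2021, 10, 1)))
# {'confirm_receipt_by': datetime.date(2021, 10, 11), 'respond_by': datetime.date(2021, 11, 15)}
```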

At the process step

  • Duplicate requests: You may receive “excessive” requests from some consumers. If you have already fulfilled 2 access requests in the past 12 months for the same requestor, you are not obligated to fulfill their additional requests. But you need a system to track and document these duplicate requests for audit purposes. (§ 1798.145 (i)-3)
  • Valid requests: While you could decide to fulfill all your US requests under the CCPA standard, you may not have the operations ready for that volume. Rapidly validating the requestor’s geography and whether you hold any data on them can be a challenge, but it is important if you want to triage requests efficiently.
  • Leverage automation: Manually copy-pasting DSAR information into an Excel spreadsheet is time-consuming and error-prone. Triggering automated workflows when a valid DSAR comes through will make your life much easier (a minimal triage sketch follows this list).
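
As one way to avoid spreadsheet copy-paste, a validated request can be dropped straight into a fulfillment queue. This is a minimal sketch under assumed inputs; the validation rules, field names, and in-memory queue are placeholders rather than any specific product's API.

```python
from queue import Queue

SUPPORTED_JURISDICTIONS = {"CA", "VA", "CO", "EU"}  # illustrative only
fulfillment_queue: Queue = Queue()


def triage(request: dict) -> bool:
    """Validate an incoming request and, if valid, enqueue it for fulfillment."""
    is_valid = (
        request.get("jurisdiction") in SUPPORTED_JURISDICTIONS
        and request.get("request_type") in {"access", "deletion", "opt-out"}
        and bool(request.get("requestor_email"))
    )
    if is_valid:
        fulfillment_queue.put(request)  # downstream workers pick this up
    return is_valid


print(triage({"jurisdiction": "CA", "request_type": "access",
              "requestor_email": "jane@example.com"}))  # True
```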

At the fulfillment step

  • Data retrieval: Gathering, cleaning, and packaging the relevant information about a data subject from all your systems can be very complex if you do it manually.
  • Workflow: Once a request is at the fulfillment step, sending out manual reminders can be a tedious and time-consuming task for the privacy team.
  • Data redaction: Depending on your processes or industry, data redaction may be required. Unless you have the right tool, this could be manual and insecure.
  • Legal review: If your DSAR operations include a review by your legal team, make sure you provide enough time to your legal colleagues and keep an eye on the due date.

At the delivery step

  • Response time: Are you still within the timeframe? Do you need to request an extension? Automated reminders and messages can save you a lot of time.
  • Secure delivery: Delivering personal information via email for access requests is not recommended; prefer an encrypted delivery mechanism.
  • Open communication: It is possible your consumers will have questions about the data you delivered. Do you provide them an easy way to communicate with your team and ask questions?

At the reporting step

  • Audit log: You need a central place to store the logs of your requests for audit purposes.
  • DSAR dashboard: Can you quickly see your DSAR volume, your fulfillment rate, and your response time by requests? Having a dashboard to quickly spot trends or issues and assess your performance could be very helpful.

Compiling your DSAR metrics

Tracking and analyzing your DSAR metrics is crucial to understand your DSAR operations and detect potential issues. It will help you answer questions such as:

  • How many of our DSARs are actually coming from California residents?
  • Where do people drop off in their DSAR submission process?
  • How fast do we fulfill deletion requests?
  • What is our average monthly volume of access requests?
  • Are we requesting fewer extensions than last year?

For more insights and tips on DSAR metrics, please refer to our guide “DSAR Metrics: What Should You Be Measuring?”

Compiling your DSAR metrics is not only the right thing to do for a privacy team; for some of you, it is also mandatory. CCPA requires companies buying, receiving, sharing, or selling the personal data of 10 million or more consumers (California residents) to compile a standard set of metrics. These companies must compile and publish, by July 1 every year, the following metrics for the previous calendar year (a minimal sketch of computing these metrics follows the list):

  • The number of requests to know that the business received, complied with in whole or in part, and denied;
  • The number of requests to delete that the business received, complied with in whole or in part, and denied;
  • The number of requests to opt-out that the business received, complied with in whole or in part, and denied; and
  • The median or mean number of days within which the business substantively responded to requests to know, requests to delete, and requests to opt-out.
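
As a worked illustration of these metrics, the sketch below tallies request counts and the median days-to-response per request type from a simple list of closed requests. The record fields and sample values are assumptions chosen for the example.

```python
from statistics import median

# Illustrative request log: (request_type, outcome, days_to_substantive_response).
requests = [
    ("know", "complied", 12),
    ("know", "denied", 30),
    ("delete", "complied_in_part", 20),
    ("opt-out", "complied", 3),
]


def ccpa_style_metrics(log):
    """Count requests by type and compute the median days to respond per type."""
    metrics = {}
    for req_type in ("know", "delete", "opt-out"):
        rows = [r for r in log if r[0] == req_type]
        metrics[req_type] = {
            "received": len(rows),
            "denied": sum(1 for r in rows if r[1] == "denied"),
            "median_days_to_respond": median(r[2] for r in rows) if rows else None,
        }
    return metrics


print(ccpa_style_metrics(requests))
```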

Plan ahead. Develop a process to compile your key DSAR metrics. Analyze the metrics regularly. Make improvements to your DSAR operations.

The team at WireWheel has reviewed over 1,000 US websites and analyzed 2020 DSAR metrics for Fortune 1000 companies and Data Brokers. Want to see what we found? Curious to see how you compare?

Check our report here!

Business Impacts Around Data Subject Access Requests

Without the right solution, managing DSARs can be very challenging and costly. Gartner estimates that a single DSAR could cost about $1,400 if done manually, and there are also indirect costs that can drive your fulfillment costs even higher.

How will DSARs impact your operations?

$1,400

Average cost of processing a DSAR

46%

of all complaints made to the Information Commissioner’s Office (ICO) in the UK were about DSARs and the difficulties people face when trying to get hold of their personal information

  • Non-compliance can lead to consumer complaints and potential fines.
  • Fraudulent requests can result in a breach causing reputational damage and potential costs.
  • Delivering the information to the wrong person can be costly.
  • Your brand reputation can be hurt by a bad experience, and that could impact NPS and revenue.
  • The cost to fulfill a DSAR will grow if internal processes and data mapping are poor.
  • The privacy team faces manual work to track the status of different requests.
  • Stakeholders spend significant time retrieving, packaging, and sending the relevant consumer data to the privacy team.

Potential DSAR Solutions: Build vs. Buy

Companies must have a secure way to accept and deliver requests and a way to manage the workflows. They have two options:

  • Build a DSAR Management Solution (In-house)
    Some companies may have decided to build in-house solutions to securely accept requests and deliver information back to their consumers. The downside of this approach is that it requires the team that built the tool (typically IT) to maintain it, add functionality as laws change, and make any other changes that would improve response rates. The IT team may also have to build any integrations and automation from scratch.
  • Buy a DSAR Management Solution
    There are multiple vendors, including WireWheel, who offer a DSAR solution. Privacy technology companies are 100% focused on ensuring that companies stay compliant and manage their DSAR operations efficiently.

As WireWheel customers have said:

The StockX fundamental thought process has always been can we build it versus buying it…. As we were evaluating what was going on, I don’t think we truly had a full appreciation for the scope of work that would come with the implementation of these privacy laws – whether it was GDPR or CCPA – and the number of requests and the number of customers who would actually leverage the legislation to [exercise their rights]. There was just a lot of analysis required.

– Monica Gaffney, StockX

“WireWheel is going to allow us to grow over time, allow us to add functionality, and expand our processing capabilities,” he says. “Let’s say that in the next year, five new states have privacy guidelines—we know that WireWheel is going to allow us to open up to those states to be able to process customer data.”

– Paul Branco, Shoes.com

5 Tips for managing DSARs:

  1. Know where your data is: It will make your entire DSAR operations so much easier.
  2. Verify the data subject’s identity: Understand what is needed to verify both your account holders and non-account holders, follow the legal guidelines and automate the verification process.
  3. Assign responsibility for fulfillment: Your access and deletion requests will likely have different owners. Route DSARs automatically to the right person based on request type or jurisdiction (a minimal routing sketch follows these tips).
  4. Monitor DSAR status closely: If you operate in multiple jurisdictions, you will likely face different deadlines. Leverage reminders, emails and flag high priorities to meet your deadlines.
  5. Compile and improve your DSAR metrics: Get an aggregate view of your DSAR performance to identify bottlenecks, develop your own DSAR roadmap and focus on your priorities.
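
For tip 3, a routing table keyed on request type and jurisdiction is often enough to get started. The sketch below is a hypothetical illustration; the owners, jurisdictions, and addresses are placeholders, not a prescribed configuration.

```python
# Illustrative routing table: (request_type, jurisdiction) -> owner.
ROUTING = {
    ("access", "CA"): "privacy-ops@company.example",
    ("deletion", "CA"): "it-data-team@company.example",
    ("access", "EU"): "dpo@company.example",
}

DEFAULT_OWNER = "privacy-ops@company.example"


def route(request_type: str, jurisdiction: str) -> str:
    """Return the owner responsible for fulfilling this kind of request."""
    return ROUTING.get((request_type, jurisdiction), DEFAULT_OWNER)


print(route("deletion", "CA"))     # it-data-team@company.example
print(route("correction", "CO"))   # falls back to the default owner
```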

Additional Resources

  1. Access best practices to manage your DSAR operations in this Ultimate DSAR guide.
  2. Compare personal data definitions and DSAR requirements across regulations with our Interactive Privacy Table.
  • Privacy Law Update

Privacy Law Update: October 4, 2021

Stay up to date with this weekly release covering key developments on data privacy laws, technology, and other hot privacy topics!


Newsworthy Updates

The Downside to State and Local Privacy Regulations

Are stricter privacy regulations a good thing? As more state and local governments look to protect data privacy, a couple of industry experts point out some of the challenges associated with these types of policies.

Connecticut Law Concerning Data Breach Notification Will Change

Effective October 1, 2021, Connecticut law concerning data breach notification will change. Conn. Gen. Stat. § 36a-701b, passed in 2012, established the notification requirements for business and protections for consumers when a “breach of security” occurs. Now, in an effort to further protect consumers, the Connecticut legislature expanded the reach of the data breach notification statute with PA 21-59.

Senate Committee Hearing To Focus On Protecting Consumer Privacy

U.S. Senator Maria Cantwell (D-WA), Chair of the Senate Committee on Commerce, Science, and Transportation convened a hearing titled “Protecting Consumer Privacy” at 10:00 a.m. on Wednesday, September 29, 2021. This hearing examined how to better safeguard consumer privacy rights, including equipping the Federal Trade Commission with the resources it needs to protect consumer privacy through the creation of a privacy bureau; and the need for a comprehensive federal privacy law.

Federal Trade Commission Hosts Panels Related to Consumer Privacy and Data Security at PrivacyCon

This summer, the Federal Trade Commission (“FTC”) hosted its sixth annual PrivacyCon, an event focused on the latest research and trends related to consumer privacy and data security. This year’s event was divided into six panels: Algorithms; Privacy Considerations and Understandings; Adtech; Internet of Things; Privacy-Children and Teens; and Privacy and the Pandemic. Welcoming attendees and kicking off the event, Commissioner Rebecca Kelly Slaughter called for minimization of data abuses and for a move away from the notice and consent model of privacy in favor of data minimization. PrivacyCon topics are selected by the FTC and are often seen as an indication of enforcement priorities.

The New Era Of Data-Driven Marketing Requires Collaboration And Trust | Sponsored Content

The new realities of identity are forcing marketers to rethink practices that have been in place for over 20 years. Industry, regulatory, and technology trends are fundamentally shifting the way that data is permissioned, accessed, and used for marketing purposes. These changes are driven by consumer demand for more transparency around how their data is collected, used, and managed.

However, marketers must answer the call. The industry has an opportunity to build a more effective advertising framework that puts consumers and data privacy at the center. Marketers can still deliver personalized experiences and delight customers, but they must look at new and innovative approaches.

How do we do that? Where do we start? To answer these questions, Tapad, a part of Experian, commissioned Forrester Consulting to evaluate the current state of customer data-driven marketing.

Could Data Privacy Be Retail’s New Competitive Differentiator?

Some say that in the new era of retail, data is gold. For others, data is oil. Whatever valuable commodity data represents, it’s key for retailers in their quest to drive innovation and differentiation in vital areas including marketing, customer experience and product development.  But new laws, regulations and tech policies are forcing businesses to completely rethink how they collect and use data. Additionally, consumers have become more aware of companies’ data collection practices and have actively responded to personalization practices they perceive as shady, intrusive or both.

U.S. Needs To Work With Europe To Slow China’s Innovation Rate

Commerce Secretary Gina Raimondo said Tuesday that the U.S. will rally allies in order to mount pressure on China, the world’s second-largest economy, an approach that differs from the “America First” policies pursued by President Joe Biden’s Republican predecessor, Donald Trump.

“America is most effective when we work with our allies,” Raimondo told CNBC’s Kayla Tausche in an exclusive interview. “If we really want to slow down China’s rate of innovation, we need to work with Europe.”

Privacy Legislation

Future proof your privacy program with WireWheel’s Trust Access and Consent Center and WireWheel’s Privacy Operations Manager.

Request Demo
  • Privacy

How to Operationalize Privacy by Design

I think I’m a closet Project Manager. When you operationalize concepts, you have to have some solid project management skills to ensure you’re identifying everything that you want to accomplish; how you’re going to do it; and the tactical plan behind it.

—Lisa Barksdale, Director of Privacy, Zillow

Privacy by Design is increasingly recognized as the pathway to achieving regulatory compliance, creating business value, and enhancing the consumer experience. Resistance, inasmuch as it existed, has given way to, okay, how do we do this? How do we take privacy by design from C-Suite concept to operational reality?

Elucidating the how, Zillow’s Director of Privacy, Lisa Barksdale, joined WireWheel CEO Justin Antonipillai, Yahoo AGC Katherine Pimentel, and Yahoo Legal Services Senior Manager, Tara Jones, at the breakout session “Rise of the Privacy Operations Leader” at the IAPP Privacy. Security. Risk. 2021 Conference.

Ms. Barksdale kindly spoke with us to offer a small preview of her planned presentation on successfully operationalizing privacy by design.

Gap Analysis and Framework

“It’s really important to first do a risk and gap analysis. You want to understand what already exists and what doesn’t. Once you figure that out, you begin a triage,” says Lisa.

Ms. Barksdale suggests breaking down the various aspects of privacy operations into “buckets” to better enable tactical focus: for example, governance routines, intake and assessment processes, and reporting and metrics.

Importantly,

There’s no one size that fits all. It depends on what industry you’re in and what you’re looking to determine. Then, once you identify those buckets, you figure out, okay, what, in your utopia, would they look like versus what the organization is actually doing.

Having completed the gap analysis and risk assessment, the next step is to establish your operational framework.

Framework and Guidance

You then want to determine what your overall operating framework looks like. I’m a firm believer in documentation. Having written policy and procedures provides guidance for everyone and it’s a crystal-clear pathway to where you want to go.

Without documentation, I feel like everyone’s just running around trying to figure out what they’re doing.

Having established the buckets of operation and guidance, “you then also want to identify who your key partners are,” suggests Lisa. And you need to do this early in the process “because as you’re building out your documentation, there could be areas where you need to bring them in to make sure you are aligned across the enterprise.”

Inclusivity is very important, offers Lisa. “And this means training and education around, not only what we’re doing, but how we’re doing it, and what role our key partners play in it.”

And the most effective way to inculcate privacy in the business and routinize privacy by design is to identify privacy champions within the business units themselves.

Privacy Champions

Ms. Barksdale suggests that operationalizing privacy by design “is really about behavior change. Understanding what those challenges are and being able to get over those hurdles.”

Identifying privacy champions can be challenging. “It is a huge change,” she says. “I’m asking the business to identify a resource that doesn’t necessarily report into the privacy office but who will be spending a considerable amount of time working through privacy risk and being that liaison for us.”

So, what makes for a good privacy champion?

I think what makes a really good privacy champion is someone that knows their business. They know their business operations and they’re curious about privacy. They want to understand what it is and how it impacts their businesses.

Education and training are key to privacy champion success, opines Lisa. Ms. Barksdale not only ensures education around the various privacy operations, but also runs training programs on facilitating engagement and the liaison skills requisite to a successful privacy champion role. “We’re also educating the businesses to make sure that they’re communicating with their champions in their various areas,” says Lisa.

And, of course, part of good communication is being a good listener:

As a privacy leader, you need to be open and not critical of the spaces you operate in. You need to understand your business and how it functions, and what privacy by design means for them: then you can start to plug it in.

Be curious and be open minded about how to implement it. And be a great listener. You learn a lot listening to your businesses that will help ensure your program is best in class.

Key Takeaways

1. First Perform a Gap Analysis

  • Identify Risk
  • Determine Current State vs Desired State
  • Differentiate between “must have” and “nice to have”

2. Define an Operational Framework and Guidance

  • Establish Operational Buckets
  • Written guidance (policy and procedure) is critical!
  • Identify your Partners, and
  • Think like a Project Manager

3. Identify your Privacy Champions

  • Educate and Train, and
  • Appreciate the Time Commitment Required

4. Education and Training

  • Privacy Training for the Business
  • Liaison Skills for Champions
  • Strong Communication Protocols

Future proof your privacy program with WireWheel’s Trust Access and Consent Center and WireWheel’s Privacy Operations Manager.

Request Demo
  • Privacy Law Update

Privacy Law Update: September 27, 2021

Stay up to date with this weekly release covering key developments on data privacy laws, technology, and other hot privacy topics!


Newsworthy Updates

Mixed Messaging Around Progress Toward EU-US Data Transfer Solution

Politico reports the future of EU-U.S. data transfers is unclear based on different messaging from the parties involved. While U.S. officials continue speaking positively about finding a data transfer solution, EU officials are on record as saying a deal is not close and is far from certain to be closed by year’s end. U.S. national security laws and agencies’ ability to access personal data remain the biggest sticking point from the EU perspective.

California Privacy Agency Seeks Comments On Proposed CPRA Rulemaking

The California Privacy Protection Agency is welcoming comments on proposed rulemaking under the California Privacy Rights Act. The CPPA is particularly interested in comments on data processing that presents significant risks to consumers’ privacy, automated decision making and consumers’ right to manage their personal information. “Comments will assist the Agency in developing new regulations, determining whether changes to existing regulations are necessary, and achieving the law’s regulatory objectives in the most effective manner,” the agency said. The deadline to submit comments is Nov. 8.

Brazilian Government Launches Data Protection Guide

Brazil’s government launched a data protection guide to promote awareness among the general public, ZDNet reports. Created in cooperation with the national data protection authority, the report details the rights of data users, including the right to opt out, how to protect personal information and what steps to take if they have been involved in a data breach. The report also outlines how organizations “should act in relation to personal data.”

UK ICO’s Denham ‘optimistic’ About Global Data Convergence

In her five years as U.K. information commissioner, and as chair of the Global Privacy Assembly, Elizabeth Denham said she has never been so “optimistic” about international data convergence as she is today. During a forum hosted by Sidley Austin Partners Alan Charles Raul and William Long, Denham talked of a global “convergence of ideas” on data protection, achieving a “21st century solution for data flows,” and more, IAPP Staff Writer Jennifer Bryant reports. “I’ve never been so optimistic about the world coming together,” Denham said.

Senate Democrats Call On FTC To Fix Data Privacy ‘Crisis’

Senate Democrats are calling on the Federal Trade Commission to write new rules to protect consumer data privacy in a new letter to the agency authored on Monday.  The letter, led by Sen. Richard Blumenthal (D-CT) and signed by eight other Democratic senators, was sent to FTC Chair Lina Khan Monday, calling on the agency to “begin a rulemaking process” on privacy. Specifically, the senators are requesting that the FTC pen new rules addressing privacy, civil rights, and the collection of consumer data.

Privacy Legislation

No substantive legislative updates this week.

Future proof your privacy program with WireWheel’s Trust Access and Consent Center and WireWheel’s Privacy Operations Manager.

Request Demo
  • Regulations

Canada’s Anti-Spam Law (CASL): What you need to know

Introduction

The Canada Anti-Spam Law (CASL) was introduced in 2010 but came into effect on July 1, 2014. The CASL’s primary purpose is to reduce “the harmful effects of spam and related threats” and to “help create a safer and more secure online marketplace,” according to the enforcement agency’s website.

The CASL is a comprehensive law created to combat spam: it prevents organizations, including foreign ones, from sending unsolicited or misleading commercial electronic messages (“CEMs”), or installing computer programs on consumers’ devices, without their consent.

Official text

Click here to access the full official text of the CASL.

Effective Date

The CASL went into effect on July 1, 2014.

Applicability

The regulation applies to any “Commercial Electronic Message” (CEM) sent from or to Canadian computers or devices in Canada. Messages routed through Canadian computer systems are not subject to this law.

A CEM is any message that:

  • Is in an electronic format, including emails, instant messages, text messages, and some social media communications,
  • Is sent to an electronic address, including email addresses, instant message accounts, phone accounts, and social media accounts, and
  • Contains a message encouraging recipients to take part in some type of commercial activity, including the promotion of products, services, people/personas, companies, or organizations.

Fax messages and fax numbers aren’t considered electronic formats or addresses under CASL.

Privacy Notice

The CASL does not expressly require businesses to display a privacy notice.

Consumer Rights

Under this privacy framework, express or implied consent is required before sending CEMs.

Data Protection Assessments

Data protection assessments are currently not required.

Enforcement

The regulation is enforced by the Canadian Radio-television and Telecommunications Commission (CRTC). The CRTC is Canada’s broadcasting and telecommunications regulator and the primary enforcement agency for CASL.

Private Right of Action

The CASL does not have a provision for private rights of action.

Penalties and Damages

A violation under the CASL may require the violator to pay an administrative monetary penalty (AMP). The maximum amount of an AMP, per violation, for an individual is $1 million. For a business, it is $10 million. The CASL sets out a list of factors considered in determining the AMP’s amount.

Cure Period

The CASL does not have a cure period.

Exemptions

The following types of CEMs are exempt from CASL for various reasons:

  • Messages sent by or on behalf of an individual to another individual with whom they have a family or personal relationship,
  • Messages sent to an employee or consultant of your business or another organization with whom your organization has a relationship,
  • Messages sent in response to a request, inquiry, or complaint or that is otherwise solicited by the recipient,
  • Messages that will be accessed in a foreign country, including the U.S., China, and most of Europe, as long as the message complies with the anti-spam laws of that foreign country,
  • Messages sent by or on behalf of a registered charity or a political party or organization for the purposes of raising funds or soliciting contributions,
  • Messages sent to a person to satisfy a legal obligation, provide notice of an existing or pending right, legal, or juridical obligation, court order, or to enforce a legal right, juridical order, or court order.

The CASL also contains an exception to the consent requirement for certain types of transactional messages. These messages must still comply with CASL’s message content and unsubscribe requirements. Transactional messages include CEMs that solely:

  • Provide warranty, recall, safety, or security information about a product or service purchased by the recipient, 
  • Provide notification or factual information about a purchase, subscription, membership, account, loan, or other ongoing relationship, including delivery of product updates or upgrades,
  • Facilitate, complete, or confirm a commercial transaction that the recipient previously agreed to enter,
  • Provide a quote or estimate for the supply of a product, good, or service.

Data Breach

Under the CASL, organizations having personal information under their control must, without unreasonable delay, provide notice to the Commissioner of any incident involving the loss of or unauthorized access to or disclosure of personal information where a reasonable person would consider that there exists a real risk of significant harm to an individual as a result.

Notification to the Commissioner must be in writing and include:

  • A description of the circumstances of the loss or unauthorized access or disclosure,
  • The date or time period during which the loss or unauthorized access or disclosure occurred,
  • A description of the personal information involved in the loss or unauthorized access or disclosure,
  • An assessment of the risk of harm to individuals as a result of the loss or unauthorized access or disclosure,
  • An estimate of the number of individuals to whom there is a real risk of significant harm as a result of the loss or unauthorized access or disclosure,
  • A description of any steps the organization has taken to reduce the risk of harm to individuals,
  • A description of any steps the organization has taken to notify individuals of the loss or unauthorized access or disclosure, and
  • The name and contact information for a person who can answer, on behalf of the organization, the Commissioner’s questions about the loss or unauthorized access or disclosure.

See how the CASL compares to other privacy regulations such as GDPR, CCPA, and more on our interactive privacy table.

Compare Now
  • Regulations

Colorado Privacy Act (CPA): What You Need to Know

Introduction

The Colorado Privacy Act (CPA) was introduced on March 19, 2021, unanimously passed on May 26, 2021 and was signed into law on July 7, 2021 by Governor Jared Polis.

CPA became the third comprehensive data privacy law adopted in the US, after California with CCPA and CPRA and after Virginia with CDPA.

The key differences between the CPA and CCPA revolve around the private right of action, enforcement, penalties, and the cure period.

Official text

Click here to access the full official text of the CPA.

Effective Date

The CPA is scheduled to go into effect on July 1, 2023.

Applicability

The CPA currently applies to legal entities that:

(a) conduct business or produce products or services that are intentionally targeted to Colorado residents and

(b) either (i) control or process personal data of more than 100,000 consumers per calendar year or

(ii) derive revenue or receive a discount on the price of goods or services from the sale of personal data and control or process the personal data of at least 25,000 consumers.

Covered Personal Information

The CPA defines “Personal Data” as “information that is linked or reasonably linkable to an identified or identifiable individual,” with the exceptions of:

(a) de-identified data and

(b) publicly available information.

Sensitive Data

Under this data privacy law, a controller must not process sensitive data concerning a consumer without obtaining the consumer’s consent or, in the case of processing of personal data concerning a known child or student, without obtaining consent from the child’s or student’s parent or lawful guardian. SB 21-190 defines “sensitive data” as:

(i) personal data revealing racial or ethnic origin, religious beliefs, a mental or physical health condition or diagnosis, sex life or sexual orientation, or citizenship or citizenship status,

(ii) genetic or biometric data that may be processed for the purpose of uniquely identifying an individual, or

(iii) personal data from a known child.

Anonymous, De-identified, Pseudonymous, or Aggregated Data

Under the Colorado Privacy Act, “de-identified data” means data that does not identify an individual and with respect to which there is no reasonable basis to believe that the information can be used to identify an individual.

Children

A controller must not process sensitive data concerning a consumer without obtaining the consumer’s consent or, in the case of processing of personal data concerning a known child or student, without obtaining consent from the child’s or student’s parent or lawful guardian. The CPA defines “sensitive data” as:

(i) personal data revealing racial or ethnic origin, religious beliefs, a mental or physical health condition or diagnosis, sex life or sexual orientation, or citizenship or citizenship status,

(ii) genetic or biometric data that may be processed for the purpose of uniquely identifying an individual, or

(iii) personal data from a known child.

Privacy Notice

This privacy framework introduces a duty of transparency for controllers. The controller must provide consumers with a reasonably accessible, clear, and meaningful privacy notice that includes:

  1. The categories of personal data collected or processed by the controller or a processor,
  2. The purposes for which the categories of personal data are processed,
  3. An estimate of how long the controller may or will maintain the consumer’s personal data,
  4. An explanation of how and where consumers may exercise their rights under SB 21-190,
  5. The categories of personal data that the controller shares with third parties, if any, and
  6. The categories of third parties, if any, with whom the controller shares personal data.

Consumer Rights

The CPA introduces the following consumer rights:

  1. Right to opt-out of the processing of personal data concerning the consumer,
  2. Right to access the consumer’s personal data and confirm whether a controller is processing personal data concerning the consumer,
  3. Right to correct inaccurate personal data collected from the consumer,
  4. Right to delete personal data concerning the consumer,
  5. Right to obtain the consumer’s personal data in a portable and readily usable format up to two times per calendar year.

Contracting

The Colorado Privacy Act defines the “duties of Controllers”. Similar to preceding data privacy legislation, SB 21-190 utilizes concepts of data “controllers” and data “processors,” where a “controller” is the person or entity that determines the purposes and means of processing personal data and the “processor” is the person or entity that processes personal data on behalf of the controller. Controllers and processors must enter into a binding contract governing the processing instructions. Controllers do not avoid responsibility by delegating processing responsibilities to a processor.

Data Protection Assessments

Under the CPA, before engaging in processing that presents a heightened risk of harm to a consumer, a controller must conduct and document a data protection assessment of each of its processing activities that involves personal data acquired on or after the effective date of SB 21-190. SB 21-190 defines “processing that presents a heightened risk of harm to a consumer” as including the following:

(i) processing personal data for purposes of targeted advertising or profiling,

(ii) selling personal data, and

(iii) processing sensitive data.

Enforcement

The CPA will be enforced by the Attorney General of Colorado and District Attorneys.

Private Right of Action

The Colorado Privacy Act does not contain a provision for private rights of action.

Penalties and Damages

Under the CPA, violations would be subject to civil penalties under the Colorado Consumer Protection Act (C.R.S. 6-1-112), which provides for civil penalties of not more than $20,000 per violation.

Cure Period

The CPA establishes a 60-day right-to-cure period. This cure period will be repealed on January 1, 2025.

Exemptions

SB 21-190 currently does not apply to certain categories of personal data already governed by various state and federal laws, such as HIPAA, the Gramm-Leach-Bliley Act (GLBA), the Fair Credit Reporting Act, the Driver’s Privacy Protection Act of 1994, the Children’s Online Privacy Protection Act of 1998 (COPPA), and the Family Educational Rights and Privacy Act of 1974 (FERPA), in each case to the extent the activity related to the personal data complies with the existing governing law(s). SB 21-190 also does not apply to data maintained for employment records purposes. If a business processes personal data pursuant to an exemption under SB 21-190, the business bears the burden of demonstrating that the processing qualifies for the exemption.

Data Breach

The CPA requires notification of security breaches affecting personal information (PI), which includes a detailed notice to Colorado residents and, in certain circumstances, a notice to the Attorney General.

See how the Colorado CPA compares to other global privacy regulations such as CCPA, GDPR, and more on our Privacy Law Matrix.

Compare
  • Regulations

Complete Guide to GDPR: General Data Protection Regulation

Introduction

The General Data Protection Regulation (GDPR) was adopted on April 14, 2016 and went into effect on May 25, 2018. The GDPR governs data protection and privacy in the European Union (EU) and the European Economic Area (EEA).

The GDPR’s primary aim is to enhance individuals’ control and rights over their personal data and to simplify the regulatory environment for international business.

The GDPR was the first comprehensive data privacy law and has inspired other legislation around the world, from the California Consumer Privacy Act (CCPA) to Brazil’s Lei Geral de Proteção de Dados (LGPD).

Official text

Click here to access the full official text of the GDPR.

Effective Date

The GDPR went into effect on May 25, 2018.

Applicability

The GDPR applies to both Data Controllers and Data Processors:

  • Established in the EU that process personal data in the context of activities of the EU establishment, regardless of whether the data processing takes place within the EU,
  • Not established in the EU that process EU data subjects’ personal data in connection with offering goods or services in the EU, or monitoring their behavior.

Covered Personal Information

Under this EU data protection law, personal data is any information relating to an identified or identifiable data subject.

The GDPR prohibits the processing of defined special categories of personal data unless a lawful justification for processing applies.

Sensitive Data

The following personal data is considered ‘sensitive’ under the GDPR and is subject to specific processing conditions: 

  • Racial or ethnic origin,
  • Political opinions,
  • Religious or philosophical beliefs,
  • Trade-union membership,
  • Genetic data,
  • Biometric data processed solely to identify a human being,
  • Health-related data,
  • Sex life or sexual orientation.

Anonymous, De-identified, Pseudonymous, or Aggregated Data

Under the GDPR, Pseudonymous data is considered personal data. 

Anonymous data is not considered personal data. 

While the GDPR does not mention de-identified data, the CCPA’s concept of de-identified data is similar to the GDPR’s concept of anonymous data.

Children

The GDPR’s default age of consent is 16, although individual member state law may lower that age, but to no lower than 13. The person with parental responsibility must provide consent for children under the consent age.

Children must receive an age-appropriate privacy notice. 

Children’s personal data is subject to heightened security requirements.

Privacy Notice

Under this privacy regulation, data controllers must provide detailed information about their personal data collection and data processing activities. The notice must include specific information depending on whether the data is collected directly from the data subject or a third party.

Consumer Rights

The GDPR introduced the following consumer rights:

  • Right to information,
  • Right to access,
  • Right to rectification,
  • Right to erasure,
  • Right to restriction of processing,
  • Right to data portability,
  • Right to objection,
  • Right not to be subject to automated decision-making.

Contracting

The GDPR requires controllers to enter into contracts with processors to govern the processing of personal data by a processor on behalf of the controller. The contract should include:

  • Type of data,
  • Duration of processing,
  • The rights and obligations of both parties, with specific obligations for the processor.

Data Protection Assessments

GDPR Article 35 requires data protection assessments when processing personal data for certain functions such as targeted advertising, the sale of data, certain types of profiling, the processing of sensitive data, and processing that presents a heightened risk of harm to consumers.

Transfer Impact Assessments are required for all transfers of sensitive data outside of the EEA.

Enforcement

The GDPR is enforced by the European Data Protection Board (EDPB) as well as binding decision-making by the Data Protection Authorities (DPA) of the member states.

Private Right of Action

The GDPR does have a provision for private rights of action.

Penalties and Damages

Under the GDPR, administrative fines can reach up to EUR 20 million or 4% of annual global revenue, whichever is higher.

Cure Period

The GDPR does not provide a cure period.

Exemptions

The only way to be exempt from the GDPR is if you: 

  • Actively discourage the processing of data from EU data subjects (i.e., block your site in the EU),
  • Process personal data of EU citizens outside the EU as long as you don’t directly target EU data subjects or monitor their behavior.

Data Breach

In the case of a personal data breach, the controller shall without undue delay and, where feasible, not later than 72 hours after having become aware of it, notify the personal data breach to the supervisory authority competent in accordance with Article 55, unless the personal data breach is unlikely to result in a risk to the rights and freedoms of natural persons. Where the notification to the supervisory authority is not made within 72 hours, it shall be accompanied by reasons for the delay.     

The processor shall notify the controller without undue delay after becoming aware of a personal data breach.

The notification referred to in paragraph 1 shall at least:

a) describe the nature of the personal data breach including where possible, the categories and approximate number of data subjects concerned and the categories and approximate number of personal data records concerned;

b) communicate the name and contact details of the data protection officer or other contact points where more information can be obtained;

c) describe the likely consequences of the personal data breach;

d) describe the measures taken or proposed to be taken by the controller to address the personal data breach, including, where appropriate, measures to mitigate its possible adverse effects.

Where, and in so far as, it is not possible to provide the information at the same time, the information may be provided in phases without undue further delay.

The controller shall document any personal data breaches, comprising the facts relating to the personal data breach, its effects, and the remedial action taken. That documentation shall enable the supervisory authority to verify compliance with this Article.

See how the GDPR compares to other privacy regulations such as CCPA, LGPD, and more on our interactive privacy table.

Compare Now