
Privacy Law Update: January 23, 2023

Stay up to date with this weekly release covering key developments on data privacy laws, technology, and other hot privacy topics!

Newsworthy Updates

US state privacy developments: Indiana, Massachusetts, New York and more

  • State Rep. Gregg Haddad, D-Conn., introduced House Bill 6253, the Connecticut Age-Appropriate Design Code. The bill was referred to the Connecticut General Assembly’s Joint Committee on General Law.
  • Indiana House Bill 1554, an act concerning consumer data protection, was introduced and referred to the House Committee on Commerce, Small Business and Economic Development. HB 1554 is a competing bill to Indiana Senate Bill 5 and includes rulemaking authority.
  • State Sen. Cynthia Stone Creem, D-Mass., introduced Senate Bill 745, the Massachusetts Data Privacy Protection Act. The proposal takes themes from U.S. Congress’ proposed American Data Privacy and Protection Act, including a private right of action.
  • Mississippi House Bill 467, the Biometric Identifiers Privacy Act, was introduced and referred to the Committee on the Judiciary A.
  • New York Senate Bill 2277, the Digital Fairness Act, was introduced and referred to the Senate Committee on Internet and Technology. Separately, New York Assembly Bill 1362, the Biometric Privacy Act, was introduced and referred to the Committee on Consumer Affairs and Protection.
  • State Rep. Andrew Stoddard, D-Utah, introduced an amendment to the Utah Consumer Privacy Act. House Bill 158 amends Utah’s law to include a carveout for law enforcement’s access to personal data with a warrant.
  • The Virginia Senate took up bills to amend the Virginia Consumer Data Protection Act. Senate Bill 1087 proposes provisions to protect genetic data privacy, while SB 1432 concerns protection of personal health records.
  • State Del. Wayne Clark, R-W.Va., introduced House Bill 2460, an act concerning children’s privacy, to the West Virginia House. The bill, which would bring privacy protections for children under age 18, was referred to the House Committee on the Judiciary.

Empowering people to foster trust in tomorrow’s technological advancements

The Information Commissioner’s Office is encouraging developers to consider privacy at an early stage when implementing new technologies to maintain public trust and confidence.

Our Tech Horizons Report looks at technologies emerging over the next two to five years and warns that the significant benefits they offer could be lost if people feel companies are misusing their data.

The report, which follows analysis of key technologies expected to impact society in the future, found that businesses must consider transparency, the control people have over their data, and how much data is gathered to ensure their services comply with data protection rules and are developed with consumer privacy at the forefront.

Comply with EU rules or face ban, Breton tells TikTok CEO

Chinese-owned social media company TikTok could face a ban in the European Union if it does not step up efforts to comply with EU legislation before September, the top official overseeing the EU’s internal market told the company’s CEO on Thursday.

Report: Online pharmacies share sensitive data with third parties

ProPublica reports some online pharmacies selling abortion pills are using tracking technology that shares sensitive data with third parties, which could potentially lead to prosecution from law enforcement. ProPublica said it found web trackers, including a Google Analytics tool, on at least nine of 11 sites selling the pills. Data shared through the trackers include web addresses visited, items clicked on, search terms, and location and device information, as well as a unique identifier linked to a user’s browser.

Privacy Legislation

Op-ed: US schools banning TikTok is not an overreaction: U.S. schools are not overreacting by following federal, state and local entities in banning TikTok, University of North Carolina Greensboro professor of management and cybersecurity researcher Nir Kshetri writes in Fortune. “TikTok captures user data in a way that is more aggressive than other” applications, he said, as “its default privacy settings allow the app to collect much more information than the app needs to … function.” One example, he said, was TikTok accessing users’ contact lists and calendars every hour. There are also significant cybersecurity vulnerabilities that allow hackers to distribute “malicious software” using viral trends.

Irish DPC fines WhatsApp 5.5M euros, fissure with EDPB continues: Ireland’s Data Protection Commission completed its inquiry into Meta Platforms’ WhatsApp Ireland and fined the company 5.5 million euros over transparency failures and forcing users to consent to the processing of their data in the Terms of Service. The DPC found WhatsApp was in breach of “its obligations in relation to transparency” because “information in relation to the legal basis relied on by WhatsApp Ireland was not clearly outlined to users, with the result that users had insufficient clarity as to what processing operations were being carried out on their personal data, for what purpose, and by reference to which of the six legal bases identified in Article 6 of the” EU General Data Protection Regulation, according to the DPC press release. The DPC found the lack of transparency fell short of Articles 12 and 13(1)(c) of the GDPR.

Future-proof your privacy program with WireWheel’s Trust Access and Consent Center and WireWheel’s Privacy Operations Manager.

Request Demo

Privacy Law Update: January 9, 2023

Stay up to date with this weekly release covering key developments on data privacy laws, technology, and other hot privacy topics!

Newsworthy Updates

Several State Legislatures Introduce Privacy Bills

State Sen. Whitney Westerfield, R-Ky., reintroduced Senate Bill 15 to the Kentucky Senate. The comprehensive privacy bill did not make it out of committee in 2022. Additions to the bill include Global Privacy Control recognition and increased user consent requirements.

New York Senate Bill 365, the New York Privacy Act, was re-filed and assigned to the Senate Committee on Consumer Protection.

Tennessee Senate Bill 73, the Tennessee Information Protection Act, was introduced. The Tennessee General Assembly failed to pass proposed privacy legislation in 2022.

Washington House Bill 1155, the My Health My Data Act, was pre-filed ahead of the 2023 legislative session beginning Jan. 10. The bill was requested by the Office of the Washington State Attorney General.

2023 brings US state privacy law preparedness into focus

Chatter regarding comprehensive U.S. state privacy law picked up steam once again as the calendar turned to 2023. State legislative sessions are ready to commence and questions are swirling about which states could make a run at, or ultimately pass, legislation. However, the story of 2023 might be more about handling previously passed state laws. A compliance extravaganza kicked off on Jan. 1, as the California Privacy Rights Act and the Virginia Consumer Data Protection Act took force. Laws in Colorado, Connecticut, and Utah will also go live at different points in 2023.

TikTok Ban Passes Senate, Could Result in Removal From Government Devices

Legislation that would keep TikTok off government devices has cleared its first major hurdle in Congress, passing the Senate with unanimous consent on Wednesday. A companion TikTok ban bill has been introduced in the House but has not yet been taken up.

Cross-context behavioral advertising is ‘sale.’ It is time to get over it.

It seems like at the start of every year there are new privacy laws. The 2020 new year brought us the California Consumer Privacy Act. The 2023 new year will bring us the California Privacy Rights Act and the Virginia Consumer Data Protection Act, with new legislation from Colorado, Connecticut, and Utah arriving a bit later in the year. So yet again, cross-functional privacy teams from across the digital advertising industry are trying to decipher what companies can and can’t do under new state privacy laws in an environment with little precise guidance about how exactly these laws apply to digital advertising, and with little time left for interpretation, let alone implementation.

Privacy Legislation

IAB Warns Against Rushed Passage of Children’s Privacy Bill: IAB’s Executive Vice President for Public Policy Lartease Tiffith says the trade association – representing over 700 brands, publishers, agencies, ad tech firms, and more across the digital advertising industry – supports children’s privacy, but including significant changes to the Children’s Online Privacy Protection Act (COPPA) in a hurried “omnibus” spending bill risks unintended consequences for internet users of all ages.

South Dakota Enacts TikTok Ban for Government Employees Over National Security Concerns: TikTok will need to be scrubbed from any government devices in South Dakota, as the state becomes the first to ban the app over national security concerns. The prospect of a national TikTok ban has been in the air since the final year of the Trump administration, with the Biden administration as well as Congress and the Committee on Foreign Investment continuing to scrutinize the app for potential risks of it being used as an espionage or propaganda tool by the Chinese government.

OECD Nations Sign Privacy Agreement Aimed At Improving Transparency Into Government Access of Personal Data: The 38 member nations of the Organisation for Economic Co-operation and Development (OECD) and the European Union have signed a notable privacy agreement aimed at improving transparency in government access to personal data held by private companies. The privacy agreement consists of a list of “shared principles” drawn from “commonalities” in existing national laws. OECD includes the United States, Canada, Australia, New Zealand, Japan, Korea, and Mexico among its members, and some of these countries have limited or no data privacy laws at the national or federal level.


Why Students Need to Protect Their Privacy When Using Educational Technology: Tools and Tips

Progress and the introduction of digital technologies bring both strengths and weaknesses to any sphere of human activity, and education is no exception. Third-party applications and resources can have a positive impact on student engagement and academic performance, but they also carry the danger of confidential information leaking. How to make work with today’s essential and popular EdTech tools safe, who should be involved in creating a reliable environment, and what methods to use: these are the issues we will deal with today.

The problem of protecting confidential data in educational technologies

The problem, first of all, lies in ourselves. Have you ever wondered how many people read the terms of a data processing agreement before checking the box next to it? One survey found that 36% of respondents never read a company’s privacy policy, and this is the first issue cybersecurity education needs to address.

Another problem for students, teachers, parents, and administrators is the dozens of used and unused applications tied to work accounts. Even a superficial audit will show that at every level, from school administrators to students, work accounts are linked to dating sites, travel apps, and more. Each of these links is a door through which someone can reach important, confidential information intended only for business purposes.

And then there is the purchase and deployment of insufficiently tested, protected, and certified digital products. Services for the education segment, both for administration and for direct classroom use, are now a multibillion-dollar market with intense competition.

When choosing components of a security system, such as classroom security cameras, or products for their school and program, administrators and teachers must pay close attention to how each product protects users’ sensitive data. Be especially careful with free trial and test versions that still require registration and linking to an account. Such a “Trojan horse” can carry anything from simply poor protection to malware that opens the way to all the data stored in the account.

Therefore, both students and teachers should use only educational services that are known to be safe. The Studocu educational platform, for example, can help you avoid many data security issues.

In fact, there are many more problems; they can be global, typical of the entire education industry, or local, arising in a particular region, among users of a particular product, or in a single school. In any case, to avoid confidential data leakage you need to know how to protect yourself, your students, teachers, and parents as much as possible.

6 Tips for Protecting Student Data

The first step toward the safe use of EdTech tools is understanding that there is a problem, and that it is a global problem posing a real threat. Stolen data can be used for a variety of purposes, from advertising to criminal offenses. Only by realizing the depth of the problem, and the consequences of a careless attitude toward it, can you begin to act seriously.

1. Review all the tools you have used or are using

During the pandemic, many educational platforms made their services free or freemium. In an effort to give their students as many learning opportunities and convenient tools as possible, school administrators and teachers tried many of them, manually entering student data without fully understanding how well that data would be protected.

Now that you have stopped using most of these services and settled on a few of the most suitable ones, remember that even if a service has since moved to a paid model, the data already entered into it is neither gone nor automatically protected. Be sure to confirm that your information is secure.

2. Don’t take it all on yourself

Children’s safety is the task of every adult involved in their education. It is encouraging that after the pandemic, when many parents were at home with their children, took part in their studies, and got to know the educational services and tools they used, parents became more attentive to data security.

Parents began to pay attention to what information their child enters and where. The most attentive and responsible ones called teachers to ask how secure the information is that their child enters into a particular application used at school.

I would like to emphasize this point separately: cybersecurity and the protection of student data are the responsibility not only of schools but also of families. Parents therefore need to know which applications their children use at school and outside of it, and how reliable and safe those products are.

3. Be aware of the privacy policies of your digital education providers

If you decide to use a tool for the first time, be sure to contact the supplier and read its privacy policy. Ask for an explanation of any doubtful or unclear point. If you still have doubts about the reliability of its protection, it is better not to use the product. There are thousands of companies on the market today, and you will certainly find a reliable analog or similar service.

Watch for changes to the privacy policies of companies that already supply you with such products. Consider an example. The schools in one small town all used the same popular service in their educational process. On the same day, the responsible staff at each school received a letter notifying them of changes to the company’s privacy policy and asking them to agree to the new version. The responsible staff at every school did so without much scrutiny; only one teacher decided to ask what exactly had changed in the policy.

He learned that the company could now track children’s locations by geolocation. Remember, we are talking about minors. The teacher raised the issue at his school, and the principal tried to negotiate with the company to remove tracking from the app; in the end, the school refused to cooperate with the supplier. Other schools did not stand aside either, since the update suited very few of them. After customers began refusing en masse to work with an application that tracks children’s locations, the provision was removed.

4. Create an Application Verification Protocol

Some useful tools can help here. For example, you can start by auditing the accounts of all students and school employees with a tool such as ManagedMethods to see how they use data, which applications they use, and how closely they follow data protection rules.

You can also involve your IT department in developing a protocol for working with new applications and periodically checking working tools for data security.

5. Develop a plan in case of information leakage

No one wants to think such a problem could happen to them, so when it does, everyone is unprepared and no one knows what to do. First of all, you need to make it clear to parents and students that such a situation has occurred. This can be done via email or SMS, on behalf of the administration of the educational institution or a dedicated IT department.

Cybersecurity specialists can also help you develop a plan of action for such cases; for example, students might be required to register with a data theft protection service. Introduce this plan to students and parents before trouble strikes, so that after a leak or misuse each of them knows what to do.
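As a minimal illustration of the email route described above, the Python sketch below composes a plain-text notification using only the standard library. Every name in it (the school, the service, and the addresses) is a hypothetical placeholder for the example, not a real sender or recipient:

```python
from email.message import EmailMessage

def build_breach_notice(recipient, school, affected_service):
    """Compose a plain-text data-incident notice for a parent or student.
    All names passed in are placeholders chosen for illustration."""
    msg = EmailMessage()
    msg["From"] = f"it-security@{school.lower().replace(' ', '-')}.example"  # hypothetical sender
    msg["To"] = recipient
    msg["Subject"] = f"Important: data incident involving {affected_service}"
    msg.set_content(
        f"Dear parent or student,\n\n"
        f"We are writing to inform you that {affected_service}, a tool used at "
        f"{school}, experienced a data leak. Please change the password on any "
        f"account linked to this service and watch for suspicious activity. "
        f"We will follow up with further steps.\n\n"
        f"- {school} IT department"
    )
    return msg

# Example: build (but do not send) one notice.
notice = build_breach_notice("family@example.com", "Springfield High", "MathApp")
print(notice["Subject"])
```

Actually sending the message would go through your school’s mail server (for instance via `smtplib`); keeping composition separate makes the notice easy to review before anything is sent.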

6. Be sure to have conversations with students and their parents about the rules of behavior on the Internet

For example, explain that browser search history needs to be cleared regularly, using special tools where helpful. The goal is to minimize your footprint on the Internet and keep attackers from gathering as much information about you as possible.

Be sure to explain to students and teachers the rules of behavior on social networks. Teachers should not upload photos of an entire class to a personal page, or should at least “blur” the children’s faces using simple, free tools. Do not post photos that reveal the street where you live, the route to and from school, or the addresses of the clubs you attend. Do not turn on geolocation and give strangers the opportunity to determine your location.

Safety is the concern of each individual and of everyone together. If we are all careful and responsible, no one will be able to misuse our data, or even obtain it in the first place.


Privacy Law Update: December 5, 2022

Stay up to date with this weekly release covering key developments on data privacy laws, technology, and other hot privacy topics!

Newsworthy Updates

How the IAB Multi-State Privacy Agreement Can Help Advertisers Meet their 2023 Privacy Challenges

The IAB Multi-State Privacy Agreement (MSPA) is an industry contractual framework intended to aid advertisers, publishers, agencies, and ad tech intermediaries in complying with five state privacy laws that will become effective in 2023.

A View From DC: 3 Out of 5 Attorneys General Prefer Data Minimization

From the outset of its lengthy rulemaking process, the U.S. Federal Trade Commission has been clear about the endeavor’s multiple overlapping goals. In addition to crafting eventual rules on data security or commercial surveillance, the FTC seeks to establish a robust public record of comments from a wide variety of stakeholders. Such a record may well help to inform policy making and enforcement activities, even beyond the four walls of the commission. It can also provide businesses with important signals about the evolution of best practices in the modern privacy landscape.

Risk-Based Approach to International Data Transfers Necessary to the Future of Cross-Border Data Flows Under GDPR

In the wake of the Schrems II decision, the EU’s lead data regulators have largely adopted an absolutist view that any potential for harm due to interception by a foreign government (even if trivial) is a GDPR violation and that these governments must demonstrate parity with the bloc’s data privacy laws before they can become a trusted partner. A new paper from multinational law firms DLA Piper and Clifford Chance lays out the case for a risk-based approach to these international data transfers, arguing that the status quo is too onerous and that data exporters are suffering from unfair burdens under an “unlawfully strict” interpretation of the Schrems ruling.

Fourth Draft of India Data Protection Bill Proposes Government Exception From All Provisions

India’s journey toward a national data protection bill has been going on for over four years, in fits and starts as drafts have been proposed and then subsequently shot down. The latest draft looks to be no less contentious, as it adds vital protections but also exempts the country’s government from all of its terms and appears to give tech platforms a fairly free hand in sending citizen data overseas.

Privacy Legislation

California Privacy Rights Act: The new CPRA was approved by voters in the November general election and officially became law on December 16, 2020, five days after election results were certified. The CPRA substantially amends and amplifies the requirements of the CCPA, bringing California privacy law closer, in many respects, to Europe’s GDPR.

Australia passes Privacy Legislation Amendment Bill 2022: The Parliament of Australia approved final passage of the Privacy Legislation Amendment Bill 2022. The bill amends the Privacy Act of 1988 to increase data breach fines to AU$50 million, or penalties based on data monetization and 30% of adjusted quarterly turnover under a new three-factor penalty scheme. Australian Information Commissioner and Privacy Commissioner Angelene Falk said the changes create “closer alignment with competition and consumer remedies” under the EU General Data Protection Regulation and “facilitate engagement with domestic regulators and our international counterparts to help us perform our regulatory role efficiently and effectively.”


Protect Your Privacy While Shopping This Holiday Season

With the gift-giving season right around the corner and inflation on everyone’s mind, Black Friday deals and Cyber Monday are set to be blockbuster events this year. But before heading off to your favorite retailer’s website, consider a few tips to protect your privacy and potentially get you the best price this holiday season:

  • Check URLs carefully. Black Friday and Cyber Monday are primetime for phishing attacks. It’s all too easy to click on a fake link after that big meal and some wine or when waking up early on Monday. Look for the lock icon in the address bar and check to make sure the domain matches.
  • Use a VPN when bargain hunting. Ever get the feeling retailers know what you want before you ask? Or see a great price suddenly disappear after searching for a while? Companies are increasingly using your data from multiple sites to determine pricing in real-time. Using a VPN can help you seek out the best deals, keep retailers in the dark, and offer a little extra security.
  • For the most privacy, avoid downloading retailer-specific apps. Retailers can gather much more information about users through their apps than through their websites. Consider sticking to the website where possible.
  • Check the privacy settings on those heavily discounted devices. Many of those doorbuster deals are offset by increased personal data collection and additional advertisements displayed in menus. Privacy-minded shoppers can adjust those settings to protect themselves. You may also consider keeping the device off the internet and using a dedicated streaming device instead.
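The first tip above, checking that a link’s host really belongs to the retailer, can be sketched in a few lines of Python. The allowlist and URLs below are invented for the example, so substitute the domains you actually shop at:

```python
from urllib.parse import urlparse

# Hypothetical allowlist of retailer domains you actually shop at.
TRUSTED_DOMAINS = {"example-retailer.com", "example-shop.com"}

def looks_legitimate(url):
    """Return True only for HTTPS links whose host is a trusted domain
    or a subdomain of one. Comparing against the registered domain also
    catches lookalike hosts such as 'example-retailer.com.evil.net'."""
    parts = urlparse(url)
    if parts.scheme != "https":
        return False  # no lock icon: don't enter payment details
    host = parts.hostname or ""
    return any(host == d or host.endswith("." + d) for d in TRUSTED_DOMAINS)

print(looks_legitimate("https://deals.example-retailer.com/tv"))     # True: real subdomain
print(looks_legitimate("http://example-retailer.com/tv"))            # False: not HTTPS
print(looks_legitimate("https://example-retailer.com.evil.net/tv"))  # False: lookalike host
```

The key design choice is matching on the full hostname rather than using a substring search, since a phishing domain can easily embed the retailer’s name anywhere in its URL.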

Happy shopping!


Privacy Law Update: November 21, 2022

Stay up to date with this weekly release covering key developments on data privacy laws, technology, and other hot privacy topics!

Newsworthy Updates

New Privacy Laws in 2023

There are five states with new comprehensive consumer privacy laws taking effect in 2023 — California, Virginia, Colorado, Utah and Connecticut. While businesses are well-advised to start their compliance efforts early, the lack of final implementing regulations from some states makes complete compliance impossible at this time. California and Colorado recently released draft regulations for comment.

Consumer Reports Innovation Lab Launches App for Data Rights Protocol

The Consumer Reports Innovation Lab is launching a testing application for the Data Rights Protocol, an open standard for exchanging data rights requests to make it easier for companies to comply with consumer privacy laws. The app, OSIRAA v0.5, is available on GitHub and can be used by “both authorized agent and privacy infrastructure provider companies looking to implement the protocol,” Lead Engineer John Szinger said. With OSIRAA, Szinger said companies “can test against the protocol on their own and refine their implementations as needed.”

A View From DC: Lame Ducks and Safe Kids

Most senators who sponsored privacy legislation are instead indicating that they will focus their energy in the lame duck session on youth privacy. The best, or perhaps only, path for this would be to attach a proposed bill to a piece of “must-pass” legislation, like an omnibus spending bill. At the moment, the Kids Online Safety Act represents the clearest candidate for such an endeavor. Axios reports that both sponsors of the bill, Sens. Richard Blumenthal, D-Conn., and Marsha Blackburn, R-Tenn., intend to give this a shot.

India Proposes Digital Personal Data Protection Act 2022

India’s Ministry of Electronics and Information Technology proposed new privacy legislation, the Digital Personal Data Protection Act, 2022. The draft bill aims to enable personal data processing while recognizing individuals’ rights and “the need to process personal data for lawful purposes.” It allows cross-border data transfers with “certain notified countries and territories” and establishes a Data Protection Board to oversee compliance and impose penalties, stated not to exceed 5 billion rupees. The ministry welcomes public feedback on the draft bill until Dec. 17.

Privacy Legislation

DPC 2022: EU-US Data Privacy Framework On Track, Schrems Challenge to Come: Well-known and influential names entrenched in the ongoing discussions around EU-U.S. data flows made their presence felt in back-to-back breakout sessions to cap off the final day of the IAPP Europe Data Protection Congress in Brussels, Belgium. EU and U.S. government officials took the stage focused on further touting and cementing the pending EU-U.S. Data Privacy Framework’s workability. NOYB Honorary Chairman Max Schrems threw cold water on those notions, all but announcing he will attempt to raise a potential “Schrems III” challenge to the Court of Justice of the European Union.

Home Stretch: Finalization of CPRA Regulations Draws Close: The delay on California Privacy Rights Act regulations has proven difficult for everyone involved. Covered entities are in a bind trying to address CPRA compliance ahead of the Jan. 1, 2023, effective date without final rules being promulgated by the California Privacy Protection Agency. On the other hand, the CPPA is trying to work diligently and tactfully in the face of criticism for running well past its initial July 1 deadline to finalize regulations. The pressure on both sides could ease soon though, with the CPRA rulemaking process entering the final stretch. The CPPA recently approved modifications to the draft regulations and opened a 15-day public consultation that runs through Nov. 21.

Google Pays Nearly $392 Million to Settle Sweeping Location-Tracking Case: Google has agreed to pay nearly $392 million in a settlement with 40 states over allegations that the company tracked people through their devices after location tracking had been turned off, a coalition of state prosecutors announced on Monday. Authorities said that, since at least 2014, Google broke consumer protection laws by misleading users about when it secretly recorded their movements. It then offered the surreptitiously harvested data to digital marketers to sell advertisements, the source of nearly all of Google’s revenue.


Privacy Law Update: November 14, 2022

Stay up to date with this weekly release covering key developments on data privacy laws, technology, and other hot privacy topics!

Newsworthy Updates

Home stretch: Finalization of CPRA Regulations Draws Closer

The delay on California Privacy Rights Act regulations has proven difficult for everyone involved. Covered entities are in a bind trying to address CPRA compliance ahead of the Jan. 1, 2023, effective date without final rules being promulgated by the California Privacy Protection Agency. On the other hand, the CPPA is trying to work diligently and tactfully in the face of criticism for running well past its initial July 1 deadline to finalize regulations.

New data privacy laws are coming in 2023. Experts say businesses need to be prepared.

California and other states will implement new data privacy laws in 2023 that could have implications stretching far beyond their borders. Experts say business owners need to be prepared — particularly for California’s sweeping new law. The California Privacy Rights Act takes effect on Jan. 1, 2023, and is expected to reshape the consumer and employee data privacy landscape across the country.

Argentina Finalizes Proposed Data Protection Reform

Argentina’s data protection authority, the Agency of Access to Public Information, announced finalization of its proposed reforms to Law No. 25,326 on the Protection of Personal Data. Following an extended public consultation featuring 173 submissions, the AAPI took up 80 articles in its final proposal and modified 43 based on public comments. The reform package was presented to Argentina’s government for review before introduction to the National Congress of Argentina. Editor’s note: Mariano Peruzzotti, CIPP/E, broke down Argentina’s proposed data protection reform.

A view from DC: Uncertainty reigns

After Tuesday’s election, control of the U.S. House of Representatives remains undetermined, though Republicans need only seven more seats among the 30 undecided races to assume the mantle. Until the call is made, it is hard to fully analyze how the results of the midterm elections could impact the future of the American Data Privacy and Protection Act, or other comprehensive consumer privacy bills. If Rep. Kevin McCarthy, R-Calif., takes the gavel, he could make good on his support for data privacy legislation, which also appears in his proposed party platform. Note: the next episode of The Privacy Advisor podcast will feature more insights from D.C. insiders about what to expect after the election.

Privacy Legislation

UK, US Announce Initial PETs Contest Winners: The U.K. and U.S. governments jointly announced initial winners in the U.S.-U.K. prize challenges on privacy-enhancing technologies. Twelve of 76 entries moved out of Phase I of the contest with their “state-of-the-art approaches to privacy-preserving federated learning.” The remaining entrants will take part in Phase II of the competition, which tasks entrants with building their proposed technologies with government and regulator engagement. Applications are open for Phase III testers to trial the final products.

California: The California Age-Appropriate Design Code Act may lead to greater privacy protections for minors across the U.S., Government Technology reports. The law, which goes into effect in 2024, could make it “difficult for technology companies to apply different rules to users in different places,” and could compel them to add stronger protections by default. Technology industry groups could still challenge the law because certain provisions “are overly vague,” such as requiring websites to “estimate the age of child users with a ‘reasonable level of certainty,’” which could create a scenario where technology companies collect even more personal data.

Apple Tracks and Collects Data from iPhone Users: An independent study conducted by researchers from software company Mysk indicated Apple tracks and collects data from iPhone users despite user setting preferences, Gizmodo reports. “Opting-out or switching the personalization options off did not reduce the amount of detailed analytics that the app was sending,” researcher Tommy Mysk said. The research was conducted on two Apple devices and included analysis of Apple’s App Store as well as Apple Music, TV, Books and Stocks.

Colorado: The Colorado Attorney General’s Office released public comments received from stakeholders on proposed draft Colorado Privacy Act rules. A total of 61 comments were published, received from March through Nov. 8. The proposed CPA draft rules were published Oct. 10 and a hearing is scheduled for Feb. 1, 2023. Three virtual stakeholder sessions are also scheduled for Nov. 10, 15 and 17. The attorney general’s office said the sessions are a forum to “gather feedback from a broad range of stakeholders for the development of rules to implement the CPA.”

Future-proof your privacy program with WireWheel’s Trust Access and Consent Center and WireWheel’s Privacy Operations Manager.

Request Demo
  • Privacy

Voting and Your Privacy

Today many Americans will flock to the polls to vote in the midterm election. There is a lot more at stake than just who you vote for and who ends up in office. What many do not realize is that their personal data is at stake, even though their vote is “anonymous.”

The right to an anonymous vote is part of the U.S. democratic process. Yet when someone registers to vote, their voter registration record may be considered public record, available to a variety of individuals or groups. Long before you walk into a voting booth, you may have received phone calls asking who you plan to vote for or texts promoting a candidate, and even when you apply for a driver’s license you may be asked about your political leanings. Voting records are not private, and this includes name, address, phone number, and sometimes even whether or not you voted.

As if this weren’t bad enough, political campaigns have been compiling, sharing, buying, and selling voter lists for decades. Third-party list brokers compile and sell your information, and email addresses are shared among candidates. Respecting users’ right to privacy requires thoughtful consideration of how data is collected, used, stored, retained, and deleted. Information safeguards, or data security, include the procedures, practices, and technology for keeping sensitive data safe. The strictness and uniformity with which data privacy rules and regulations are enforced can vary significantly from one jurisdiction to the next.

In 2019, Senator Dianne Feinstein of California introduced the Voter Privacy Act of 2019 to amend the Federal Election Campaign Act of 1971 to ensure privacy with respect to voter information. It never passed.

How to protect yourself

There is no easy answer to keeping your data private. You could simply not vote, but participating in the political process is a constitutional right, so privacy must be addressed. Though many states have Address Confidentiality Programs (ACPs), many struggle to control this data, so there is no guarantee of privacy.

For a detailed look at voter registration privacy protections in any state, check out the National Conference of State Legislatures’ Access To and Use of Voter Registration Lists Report.

  • Marketing

The Expanding Scope of “Sale:” California Data Privacy

When companies discovered that using a pixel to share data directly between a website and a social media platform is a sale of data from a regulatory perspective in California, it caught our attention. The increasingly complicated state of privacy compliance, in both understanding and implementation, is challenging to say the least.

Among the sea of changes we have worked through in the last several years, one very small but very important part is the expanding scope of what defines a “sale” of data, which is of vital importance to marketing teams.

WireWheel CEO Justin Antonipillai was joined by IAB Tech Lab EVP and General Counsel Michael Hahn and Davis+Gilbert LLP Partner Gary Kibel to discuss the ramifications of California Privacy and the Expanding Scope of What is a “Sale” of Data, and the marketing challenges it portends.

The Sephora takeaways

If companies make consumer personal information available to third-parties and receive a benefit from the arrangement—such as in the form of ads targeting specific consumers—they are deemed to be “selling” consumer personal information under the law.

—California AG – Sephora complaint

“Everyone is talking about the Sephora action. It is an important action, not just on its merits, but also as it is the first publicly announced enforcement action out of California,” said Davis+Gilbert’s Kibel.

He notes that the complaint, among other concerns (including the use of legally undefined buzzwords like ‘surveillance’), focused on two major issues:

1. Pixels from a third-party provider are on a publisher’s site: Is that a sale of personal information under the CCPA? Or are you in a service provider relationship?

Firstly, opines Kibel, “they were talking about the fact that there could be sensitive data that’s being collected. And if companies make consumer personal information available to third parties and receive a benefit from the arrangement – such as in the form of ads targeting specific consumers – they are deemed to be selling consumer personal information under the law.”

That said, “if you have a pixel from a third-party provider on your website, and for free, you get great analytics, and in exchange, the provider can use the data generated on the publisher’s site for their own benefit, that may be a sale of personal information.” This then requires providing the consumer the ability to opt-out.
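That opt-out logic can be framed as a gating decision made before any third-party pixel fires. The sketch below is illustrative only (the interface and function names are hypothetical; this is not legal advice): it allows the pixel freely only under a restrictive service-provider contract, and otherwise treats the transfer as a potential “sale” that must honor opt-outs.

```typescript
// Sketch of consent gating for a third-party pixel (names hypothetical).
interface ConsentState {
  optedOutOfSale: boolean;          // e.g., set via a "Do Not Sell" link
  gpcSignal: boolean;               // a Global Privacy Control signal, if present
  serviceProviderContract: boolean; // written contract restricting the provider's use
}

function mayFirePixel(state: ConsentState): boolean {
  // A service-provider relationship with a restrictive written contract
  // keeps the transfer outside the "sale" framing discussed above.
  if (state.serviceProviderContract) return true;
  // Otherwise, treat the pixel as a sale: honor opt-outs and GPC signals.
  return !state.optedOutOfSale && !state.gpcSignal;
}
```

In this framing, the contract question is answered once, at onboarding, while the opt-out and signal checks must run on every page view.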

If you are deemed to be selling personal information, you must have a link on the homepage of the website with these six exact words: “Do not sell my personal information.”

—Gary Kibel, Davis+Gilbert LLP

“There are two avenues here,” Kibel explains: “You can either be deemed to be selling personal information to a third party, or you could be in a service provider relationship with that pixel provider. However, if you want a service provider relationship, there needs to be a written contract with that provider restricting the way that they’re going to use the personal information.”

2. Compliance with global privacy control (GPC) signals that are automatically sent by a user’s browser to a publisher’s site.

“As many of us know, there is not a single mention of opt-out preference signals or global privacy controls in the CCPA law; they were introduced in the CCPA regulations.” The CPRA (effective January 1, 2023) directly addresses opt-out preference signals at length in its regulations (still in draft form) “and makes very clear that you have to honor global privacy controls and opt-out preference signals.

“However, the Sephora action made it clear that the California AG said, no, you need to be honoring GPC signals now.”

This makes it really challenging, because the CCPA regulations really don’t tell you anything about how to comply with GPC signals. So, what are businesses supposed to do right now?

Perhaps you could look at the CPRA draft regulations to see what it says and use that as guidance.

—Gary Kibel, Davis+Gilbert LLP
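The GPC mechanism itself is simple: a browser with the control enabled sends a “Sec-GPC: 1” request header and exposes the signal to page scripts via navigator.globalPrivacyControl. A minimal server-side check might look like the following sketch (function names are hypothetical; the header semantics come from the GPC proposal):

```typescript
// Sketch: detect a Global Privacy Control opt-out signal on the server.
// Per the GPC proposal, a participating browser sends "Sec-GPC: 1".
// Any other value, or absence of the header, is not a valid opt-out.
function hasGpcOptOut(headers: Record<string, string | undefined>): boolean {
  // Header names are case-insensitive; assume they are lowercased upstream.
  const value = headers["sec-gpc"];
  return value !== undefined && value.trim() === "1";
}

// A request handler could branch on the signal, e.g., suppressing the
// "sale" of personal information (pixels, ad-tech sharing) when it is set.
function shouldSharePersonalInfo(headers: Record<string, string | undefined>): boolean {
  return !hasGpcOptOut(headers);
}
```

The point of the sketch is that, technically, honoring GPC is a small check; the hard part, as the panelists note, is wiring that check into every downstream system that receives the data.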

Devising GPC signals and third-party contracts

“One of the important things that you need to do under any privacy law is communicate the consumer’s privacy elections to the other participants who receive the personal information, in a manner that complies with state law,” says IAB’s Hahn.

As a function of technology, the IAB is designing the schematic for this communication ‘plumbing.’ “The IAB Legal Affairs Council asked, ‘What do we need to communicate to lawfully process a digital advertising transaction?’ and gave these requirements to the engineers in the Tech Lab and their working groups to translate into technical specifications. IAB Tech Lab recently released the Global Privacy Platform, which is encoded to handle state-level signals,” notes Hahn.

“The second component concerns what rules need to exist for companies when they send – and receive – the signals. To do this we created an industry contract called the IAB Multi-State Privacy Agreement, which creates a set of obligations that applies to all the signatories. They spring into place in a manner that follows the personal information.

“There are a number of requirements for your specific contracts alone, but at a high level, we are creating a common baseline set of privacy terms that could flow through the digital ad chain, and also fill in gaps where you need contracts, but you don’t have them.”

If you spent the next 100 years trying to write contracts, you would not be able to scale with enough of them, given the broad definition of sale that exists today as regulators have applied it in the digital advertising context, which for all practical purposes seems to apply to nearly every disclosure of personal information.

—Michael Hahn, IAB Tech Lab

The IAB has also created, as an alternative to state-specific rules-based contracting, a “national consumer” program, notes Hahn, for those that opt to treat all consumers the same regardless of where they reside.

The technology implementation

There are three critical support elements to achieving an effective and compliant technology implementation, says WireWheel’s Antonipillai.

  1. If you have automated scripts, tags, or pixels that send data directly to a third-party platform, you have to know that the data is not going to flow automatically. You have to have a way to control them.
  2. In the context of marketing, you need a place where a human being can come and easily opt out. You have to make it super simple and easy to find. It has to interact with the automated marketing; it can’t just be the stuff that goes on in your back-end systems. And it has to happen automatically.
  3. You have to strongly consider – some view it as mandatory – setting up the infrastructure to accommodate choice in a touchless way, including via the global privacy control concept.

“This is not a cookie tool,” warns Antonipillai. “Here we are talking about a different kind of exercise. It’s about governing not only what happens in that browser area where your cookie tool used to live, but also the automated marketing side and what the marketing team does outside of automated marketing (think Adobe, Marketo, Eloqua, Dynamics, HubSpot). The front and back-end have to be communicating.

“You have to have the infrastructure to not only understand it and govern it internally,” says Antonipillai. “You have to start thinking about how you’re going to signal through your networks.”

The marketing community is going to have to own this issue. If you go to almost any other jurisdiction, certainly in Europe, when a marketing team is about to run a marketing campaign, privacy and GDPR compliance is typically number one or two on the list. It’s just part of the culture.

—Justin Antonipillai, WireWheel

“My experience from the privacy side,” continues Antonipillai, “is that when you’re talking to a marketing professional, if you just ask the question, ‘Are you selling personal data?’ most marketers are going to say ‘No’ (unless it’s part of the business plan).

Three critical, more specific, questions need to be asked –

  1. Are we using any scripts, tags, or pixels, to improve our social media ads?
  2. Are we using any technologies or platforms to measure the performance of our ads?
  3. Are we using any technology to cap the frequency that people see our ads?

– to gain a more complete understanding of how data is interacting with social media ads.”

“Marketing techniques like measuring performance and frequency capping often use personal data, so when engaging with your marketing team, it is important to move away from simply asking the more charged question, ‘Are you selling data?’

“These activities are what some regulators are starting to call a sale, and we need to start putting the right technology and notices in place so you can do this the way you want.”

Fortunately, he notes that there are really good technical solutions that allow you to do these things while providing the necessary consumer choice in a touchless way.

The historical model in the United States is for large marketers to say to their agency, ‘This is your responsibility. Make sure everything complies with the law and tell me if something goes wrong.’ Changes in the rules have become stressors on that approach.

Requirements around auditing service providers in your contracts are one indicator of that. Suddenly there could be sales of personal information that marketers are engaging in, or causing others to engage in.

Marketers need to get their arms around this.

—Michael Hahn, IAB Tech Lab



  • Regulations

Colorado AG’s Office Published Proposed Colorado Privacy Act Rules

On Friday, September 30, the Colorado Attorney General’s office published proposed Colorado Privacy Act rules. The Office also announced that it will hold three stakeholder meetings on November 10, 15, and 17, 2022, and a public hearing on February 1, 2023.

The Draft Rules are long and complex and closely aligned with Virginia’s VCDPA and California’s CPRA. That said, there are significant differences among them, including the handling of sensitive data and consumer-facing obligations for compliance with multiple state privacy laws. The CPA Draft Rules will likely see additional modifications before they are codified. Below are some of the takeaways from the proposed rules.


Consumer Rights

The consumer rights provisions state that businesses must:

  • Clearly state that they are available to Colorado consumers
  • Provide access to all data rights available under CPA
    • Opt out of the processing
    • Access
    • Correction
    • Deletion
    • Portability
  • Provide a clear explanation of how to exercise consumer rights
  • Meet notice requirements

Similar to the EU’s GDPR, consent must reflect a consumer’s clear, affirmative choice; be freely given, specific and informed; reflect the consumer’s unambiguous agreement; and be capable of being withdrawn. The Draft Rules add new requirements for “refreshing” consent: businesses must refresh consent for sensitive data annually and for other data at undefined intervals.
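The annual refresh rule for sensitive data can be expressed as a simple date check. In this sketch the 365-day interval for sensitive data reflects the Draft Rules; the interval for other data is a placeholder, since the rules leave it undefined (function and parameter names are hypothetical):

```typescript
// Sketch: determine whether consent must be "refreshed" under the CPA Draft Rules.
// Sensitive-data consent must be refreshed annually; the interval for other
// data is undefined in the Draft Rules, so a placeholder is used here.
const MS_PER_DAY = 24 * 60 * 60 * 1000;

function consentNeedsRefresh(
  consentGrantedAt: Date,
  now: Date,
  isSensitiveData: boolean,
  otherDataIntervalDays = 730 // placeholder: not specified by the Draft Rules
): boolean {
  const intervalDays = isSensitiveData ? 365 : otherDataIntervalDays;
  const ageDays = (now.getTime() - consentGrantedAt.getTime()) / MS_PER_DAY;
  return ageDays >= intervalDays;
}
```

A compliance program would run a check like this against its consent records on a schedule and re-prompt consumers whose consent has aged out.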

The CPA is not an opt-in law but does require consent for specific use cases:

  • Processing sensitive data
  • Secondary or additional use of data
  • Processing of personal data of minors


Dark Patterns

Data controllers must avoid using “dark patterns” that confuse or manipulate people into providing consent.


Biometric Data

A new definition of biometric data, similar to those in other state privacy laws, was created, requiring controllers to obtain consent for the collection of biometric data.

  • “Biometric Data” means Biometric Identifiers that are used or intended to be used, singly or in combination with each other or with other Personal Data, for identification purposes. Unless such data is used for identification purposes, “Biometric Data” does not include (a) a digital or physical photograph, (b) an audio or voice recording, or (c) any data generated from a digital or physical photograph or an audio or video recording.
  • “Biometric Identifiers” means data generated by the technological processing, measurement, or analysis of an individual’s biological, physical, or behavioral characteristics, including but not limited to a fingerprint, a voiceprint, eye retinas, irises, facial mapping, facial geometry, facial templates, or other unique biological, physical, or behavioral patterns or characteristics.


Consumer Opt-Out Requests

The rules spell out how controllers must receive and respond to consumer opt-out requests, including:

  • Providing a method to opt out of personal data processing through a clear and conspicuous link in privacy notices or easily accessible places on websites.
  • Links must go directly to the opt-out mechanism.
  • Opt-outs must be processed within 15 days of receiving valid opt-out requests.
  • Providing “reasonable” methods to authenticate a consumer submitting data rights requests.



Privacy Notices

The privacy notice requirements focus on processing purposes rather than categories of personal information and contain obligations for controllers, including:

  • Privacy notices must clearly indicate which data subject rights are available to Colorado residents.
  • Disclosing the “express purposes” for each type of personal data collected and processed, providing consumers with a “meaningful understanding of how their personal data is used and why their personal data is reasonably necessary for the processing purpose.”
  • Adhering to the principles of purpose specification and data minimization.
  • Purposes must be documented
  • Personal data that allows identification of consumers should be kept only as long as necessary, adequate, or relevant to the specified, express purposes.
  • Processing of personal data for a purpose that is not reasonably necessary or compatible with the purpose(s) stated at the time of collection requires consumer consent.
  • Notifying consumers of material changes to the privacy notice 15 days before the change goes into effect.


Loyalty Programs

Extensive disclosure requirements were created around bona fide loyalty programs that provide discounts, rewards or “other actual value” to consumers.

  • Controllers may not increase the cost of, or decrease the availability of, a product or service based solely on a consumer’s exercise of a data right.
  • A controller is no longer obligated to provide a Bona Fide Loyalty Benefit to the consumer if:
    • the consumer exercises their right to delete personal data, making it impossible for the controller to provide loyalty program benefits; or
    • the consumer refuses to consent to the processing of sensitive data necessary for a personalized loyalty program benefit.
  • Controllers must notify the consumer if the consumer’s decision impacts the consumer’s membership in a loyalty program.


Unified Opt-Out Mechanisms

As required by the CPA, unified opt-out mechanism (UOOM) requirements have been defined.

  • UOOMs must have an easy path for consumers to exercise opt-out rights with all controllers rather than having to make requests with each.
  • Controllers must offer consumers a way to provide an affirmative, freely given and unambiguous choice to opt out of personal data processing for targeted advertising, sales or both.
  • Controllers must adhere to notice and choice requirements, acceptable default settings, and technical specifications for recognizing and honoring opt-out requests.


Data Retention

Controllers must create and enforce document retention schedules.


Sensitive Information

Sensitive data “inferences” are a new category of sensitive data created in the Draft Rules. Inferences include personal information collected from a consumer that a company uses to infer a sensitive data category. Sensitive data inferences:

  • Require prior consent for processing. Under certain circumstances, data from consumers over age 13 can be processed without consent.
  • Must be deleted no later than 12 hours after collection if controllers do not have consent.



Data Protection Impact Assessments

Data Protection Impact Assessments (DPIAs) are required for processing activities that present a “heightened risk of harm” to Colorado consumers. DPIAs must:

  • Be a “genuine, thoughtful analysis” of all aspects of a controller’s organizational structure.
  • Include the specific purpose of the processing, procedural safeguards, names and categories of third-party recipients of personal data and risks to consumers.
  • Be revisited and updated at least annually.



Profiling

The right to opt out of profiling is prominently contemplated in the Draft Rules, which create three tiers of profiling:

  1. Solely automated processing
  2. Human-reviewed automated processing
  3. Human-involved automated processing

Companies may deny requests to opt out of profiling if “human-involved automated processing” was used, and details must be provided to the consumer.

In addition to the profiling tiers companies must:

  • Provide a means for consumers to opt out of profiling decisions that “produce legal or similarly significant effects”.
  • Provide consumers with a notice that includes a plain-language explanation of the logic used in the profiling process and disclose whether the profiling system was evaluated for accuracy, fairness or bias.

See how the Colorado CPA compares to other global privacy regulations such as CCPA, GDPR, and more on our Privacy Law Matrix.

  • Regulations

California Privacy Protection Agency Issues Newly Modified Regulations on CPRA

On Monday, October 17, 2022, the California Privacy Protection Agency issued modified proposed CPRA regulations and accompanying explanations. The modified proposed regulations were influenced in part by the large volume of comments collected during the 45-day written comment period on the first round of proposed regulations, the public hearings held in August, and subsequent Agency board meetings in September. The next round of board meetings is scheduled for October 28 and 29, where the board will adopt or modify the 28 items called out in the draft regulations. If and when the regulations will be finalized is unknown; the process is likely to follow the same path the CCPA proposed regulations did in 2020. The proposed regulations still do not completely address the new law, and further rulemaking should be expected, particularly around employee data.

General Overview of the Proposed Regulation Modifications

Collection and Use of Personal Information

The proposed regulations require that a business’s collection and processing of personal information be “reasonably necessary and proportionate.” The earlier version of the regulations viewed this through the lens of a “reasonable person.” The revised language adds to this by considering three different sets of criteria:

  • How can the business determine proportionality and necessity?
    • What is the relationship between the consumer and the business?
    • What type, nature, and amount of personal information does the business seek to collect or process?
    • What is the source of the personal information and the business’s method for collecting or processing it?
    • What is the specificity, explicitness, and prominence of disclosures to the consumer about the purpose for collecting or processing the consumer’s personal information, such as in the Notice at Collection and in the marketing materials to the consumer about the business’s good or service?
    • To what degree is the involvement of service providers, contractors, third parties, or other entities in the collection or processing of personal information apparent to the consumer?
  • Are disclosed purposes compatible with the context in which personal information was collected?
    • At the time of collection of the personal information, what are the consumer’s reasonable expectations concerning the purpose for which the personal information will be collected or processed?
    • What are the other disclosed purposes for which the business seeks to further collect or process the consumer’s personal information?
    • Does a strong link exist between the consumer’s expectations that the personal information will be used to provide them with a requested service at the time of collection, and the use of the information to repair errors that impair the intended functionality of that requested service?
  • What factors determine when processing is reasonably necessary and proportionate to the purpose for which it was collected?
    • What is the minimum personal information that is necessary to achieve the purpose identified?
    • What are the possible negative impacts on consumers posed by the business’s collection or processing of the personal information?
    • What are the additional safeguards for the personal information to specifically address the possible negative impacts on consumers considered by the business?

Dark Patterns

Modifications regarding dark patterns should be read in the context of previous regulations covering many of the same topics, including language around the avoidance of dark patterns that has now been removed. The Agency’s modified regulations remove a number of requirements, including:

  • “A choice where the ‘yes’ button is more prominent (i.e., larger in size or in a more eye-catching color) than the ‘no’ button is not symmetrical” and is therefore improper.
  • References to businesses not using “manipulative language” or “wording that guilts or shames the consumer into making a particular choice.”


Notice at Collection

This section had several impactful changes, including:

  • Notice at collection no longer needs to identify information regarding third parties that collect personal information through the business.
  • Modifying the definitional relationship of analytics providers as third parties. The explanation now reads that in some instances an analytics business can be a service provider and not a third party. As exemplified by the Sephora case, this will be a particularly important change if accepted.
  • Deleting subsections dealing with the collection of employment-related information. The explanation states that these subsections were deleted to “conform the regulations to the law following the expiration of the” employee data exemption.

Sensitive Personal Information

The modified language around the limitations of the use of sensitive personal information clarifies that a business:

  • Does not need to provide a Notice of Right to Limit or the “Limit the Use of My Sensitive Personal Information” link if the sensitive personal information does not infer characteristics about a consumer.
  • May (but is not required to) display a toggle or radio button confirming that requests to limit sensitive personal information, opt-out preference signals, and opt-out requests were processed by the business.
  • Can use sensitive personal information to prevent and investigate certain types of security incidents.

Opt-Out Preference Signals

The modified proposed regulations still require businesses to recognize opt-out signals and, as stated above, businesses are not required to display whether they have recognized the signal. Businesses may still provide this functionality if they choose.

See how the CPRA compares to other global privacy regulations on our interactive privacy table.

Compare Now
  • Privacy Law Update

Privacy Law Update: October 18, 2022

Stay up to date with this weekly release covering key developments on data privacy laws, technology, and other hot privacy topics!

Newsworthy Updates

CMOs Are on Their Toes and Not Conducting ‘Business As Usual’ As Data Privacy Regulators Get More Assertive

CMOs are a bundle of nerves these days. Blame data privacy regulators for some of it. Sure, the threat of a global recession keeps marketers awake at night, but being named and shamed in headlines of The New York Times for data privacy breaches is the stuff of nightmares. But until recently, those nightmares never materialized. After all, it was the platforms and ad tech vendors that were in headlines for data snafus, not advertisers.

White House OSTP Releases ‘Blueprint for an AI Bill of Rights’

The White House Office of Science and Technology Policy published “Blueprint for an AI Bill of Rights,” which provides design, development and deployment guidelines for artificial intelligence technologies. Data privacy, algorithmic discrimination protections and user choice principles are among the OSTP’s “five common sense protections to which everyone in America should be entitled.” The OSTP said the blueprint is “a vision for a society” and its AI use focuses on protections from the onset, input from marginalized communities and realizing technological benefits.

Colorado Releases Proposed Privacy Rules, Further Complicating National Compliance Landscape

The Colorado Department of Law filed a set of proposed rules to implement the Colorado Privacy Act (Draft CO Rules) on Sept. 29, 2022, foreshadowing additional compliance obligations that businesses will have to strive to meet in 2023. The level of detail in the document – which is nearly 40 single-spaced pages in 10-point font – stands in stark contrast to the underlying law, which is high level and largely parrots the Virginia Consumer Data Protection Act (VCDPA). Though the Draft CO Rules are not as prescriptive as the proposed California Consumer Privacy Act (CCPA) rules regarding consumer-facing requirements, the Draft CO Rules focus much more heavily on data governance and management of sensitive data.

Britain to Replace GDPR Data Privacy Regime With Own System

Britain will replace the European Union’s data privacy regime known as the General Data Protection Regulation (GDPR) with its own system, culture secretary Michelle Donelan said on Monday. “We will be replacing GDPR with our own business- and consumer-friendly British data protection system,” Donelan said, speaking at the annual conference of Britain’s governing Conservative Party in Birmingham.

IAB Tech Lab Launches Global Privacy Platform

IAB Tech Lab finalized the Global Privacy Platform, designed to help communicate and manage user consent signals from various jurisdictions. The GPP supports signals for the Global Privacy Control and the IAB Europe Transparency and Consent Framework as well as consent strings required under comprehensive state privacy laws. More jurisdictions will be added as global regulations come online. For state law compliance, IAB Tech Lab urged companies to transition from its U.S. Privacy Specifications tool to the GPP, which will be “the only platform to accommodate upcoming and future privacy and consent management requirements in the U.S.”

Privacy Legislation

Executive Order on EU-U.S. Privacy Framework: On October 7, President Biden signed a long-awaited executive order (EO) which implements a new EU-U.S. data privacy schema. The EO is intended to address the concerns raised in the 2020 Schrems II decision, in which the Court of Justice of the European Union (CJEU) invalidated the EU-U.S. Privacy Shield, the mechanism that had previously provided the legal basis for data transfers between the EU and the U.S. under the GDPR. In Schrems II, the CJEU ruled that the U.S., in allowing for overly invasive government surveillance, did not provide sufficient protections for the data of EU citizens.

FTC on Dark Patterns: On September 15, the Federal Trade Commission (FTC) released a staff report on dark patterns, identifying and analyzing four major categories of manipulative design. The report is the outcome of conversations kicked off in the FTC’s virtual ‘Bringing Dark Patterns to Light’ Workshop (Apr. 29, 2021), which convened experts, advocates, and representatives from the business community to discuss the use of manipulative design online and its impact on consumer autonomy.

  • The report focuses on economic harm, but is also concerned with the confusion and shame that consumers may experience after being misled by manipulative design.
  • There are also fair competition considerations at work in the agency’s contemplation of “dark patterns.”

American Data Privacy and Protection Act (ADPPA): ADPPA’s preemptive effect on state privacy laws, especially those in California, remains one of the most controversial aspects of the bill. In a recent Slate article, Professors Danielle Citron and Alison Gocke suggest that lawmakers could provide California with a waiver to continue to set its own privacy standards, similar to the waiver granted to California for environmental regulations.

Colorado: On September 30, the Colorado Attorney General’s office published a draft of regulations implementing the Colorado Privacy Act. The draft regulations are highly detailed and complex. There are eleven “High Level Takeaways” from the draft rules, including: a new definition of “biometric data,” distinct from the definition used in other state privacy laws; the substantial requirements the draft rules create around unified opt-out mechanisms; and a new category of sensitive data called “sensitive data inferences,” which must be deleted within 12 hours if collected without consent from children under age 13.

The Colorado Attorney General’s office will hold stakeholder meetings seeking feedback on the draft regulations on November 10, 15, and 17, 2022 and a public hearing on February 1, 2023.

Future proof your privacy program with WireWheel’s Trust Access and Consent Center and WireWheel’s Privacy Operations Manager.


California Employee DSAR Requests: What You Need to Know 

Who Needs to Comply with CPRA in 2023?

Going into effect January 1, 2023, the California Privacy Rights Act (CPRA) covers companies that do business in California and meet its applicability thresholds.

The CPRA introduces a number of concepts not enumerated in the CCPA:

  • Data collection and use must be “reasonable and proportionate.”
  • Consent for the collection and use of that data must be obtained.
  • Enhanced notices must be provided on your privacy pages and at points of collection.
  • Assessments are required for risky processing and for sharing data with third parties and service providers.
  • Contracts with third parties and service providers must obligate them to uphold the CPRA when processing data.

Importantly, the CPRA has expanded consumer rights including correction, opt-out of automated decision-making, access to information about automated decision-making, and restricting the use of sensitive personal information.

The big topic under the CPRA is the expiry of the exemption for employee, HR, and business-to-business data. If you have employees or use contractors in California, this will be important for you to know and understand.

To discuss the challenges of employee DSAR fulfillment and how to prepare, WireWheel CPO Rick Buck and VP of Privacy Sheridan Clemens delivered the presentation “California Employee DSAR Requests: What You Need to Know.”

Which employee and B2B data are covered under CPRA?

Beginning January 1, 2023, data rights will encompass consumers, employees (including job applicants), and B2B contacts, which includes subcontractors and independent contractors, as well as their owners, directors, and officers, in the context of employment or job applications.

What’s interesting is that prior to the CCPA and CPRA, the State of California already had a series of employment rights for HR data (e.g., payroll records, employment agreements, and personnel files), providing the rights to access, correct, and not be discriminated against.

The CPRA is now calling out specific rights that employees have in California. They too will now have the right to opt out of automated decision-making; to be informed about the data being used to make automated decisions; and to restrict the use of sensitive personal information.

—Rick Buck

What used to apply only to the consumer, now includes your workforce.

One issue that requires more clarity is the treatment of a California business’ remote workers located outside of California. A reasonable assumption is that the CPRA applies. “The CPRA applies to anybody that is doing business in California,” opines Buck. “You are a workforce member, you have a B2B relationship…that you are an employee based in California. But I don’t know if precedent has been formally set.” [1]

WireWheel’s Clemens notes that the employee does need to be a California resident (the CPRA is written for California residents), so if the remote worker is not a California resident CPRA would not apply. Conversely, if an employee works in California, but the company headquarters is in a different state, the CPRA does apply if the business is a covered entity.

That said, “many companies are weighing whether they will offer it to all of their employees as a way to keep the playing field level and avoid any issues.”

Some rights might not be relevant

Some of the rights in CPRA may not apply in an employment context, notes Buck.

“The right to opt out of sale/sharing, in particular, might not be applicable, as employers typically don’t sell employee data. They don’t track employees for targeted advertising.”

Furthermore, “the right to limit the use of some sensitive personal information likely also doesn’t apply in this context. Sensitive PI that’s collected is typically only used for human resources purposes, such as work-related, payroll, or potentially health-related information.”

There’s going to need to be some clarity about whether or not this data is in scope. The answer to that question is going to influence the way in which you as employers are going to respond to your access request.

—Rick Buck

Challenges Fulfilling Employee vs. Consumer DSARs

The first big challenge is that employee data tends to live in different places than consumer data. Companies are going to have to be working with different departments and systems for DSAR requests. And this is going to require a lot of training.

—Sheridan Clemens

Managing employee DSARs will require new processes and workflows, and this work, if not already begun, should start now. It’s not an easy uplift.

In the context of employee data, information outside the scope of CPRA may be exposed. “There’s a lot of data collected about employees, and you’re sorting through things like email and word documents that may contain another employee’s data, or protected information like trade secrets and other confidential or proprietary information,” advises Clemens. Redactions may be required.

In short, more scrutiny will be required, and this can take a lot of manpower.

We expect that the California privacy authority is going to recognize the need for balance, perhaps making some concessions that allow businesses to comply without infringing individuals’ rights. “I don’t think anything is set in stone here,” avers Clemens. “Be prepared to make some judgment calls.”

Conflict with California employment law is another big unknown. Will the CPRA supersede California employment laws, or will those laws take precedence in the employee context?

What companies need to start doing today

  1. You have to inventory your data
    While you may have done this for your consumers, when it comes to employees there are probably new systems and business processes in scope. You have to talk to HR; education is going to be vital, as is understanding exactly what data is collected, where it is stored, and how it is used.
  2. Understand if you sell/share or process sensitive PI
    Make sure you’re really clear about selling or sharing personal information: know where that data is going, and give your employees the right to opt out where applicable. While there is data you need to fulfill an obligation, if you are using it for any other purposes (wellness or other incentive programs), you’ll need to provide your employees the opportunity to opt out.
  3. Update third-party contracts
    The CPRA requires data processing agreements for all service providers and contractors processing workforce personal information, so be sure all service providers are prepared to support your DSAR requirements.
  4. Review and update privacy policies
    Privacy policy updates are needed to cover personal information in the employment and B2B context: delineating the categories of personal information and sensitive PI collected and processed; the purposes for processing; the retention period by category of PI; a description of the rights available; and instructions on how to exercise those rights.
  5. Update your DSAR portal
    Additional functionality and workflows are needed to process workforce subject rights. Considerations include securing the data, granting the right groups access to it, and, generally, having a DSAR workflow for employees built into the portal. Both the DSAR portal and your website require updating.
  6. Workflows for employee and B2B data
    Additional functionality and workflows will need to be created to process workforce DSARs. As alluded to above, this will likely be the most significant undertaking in facilitating DSAR fulfillment.
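To make steps 5 and 6 concrete, the sketch below shows one minimal way an employee DSAR intake record could be modeled. Everything here is illustrative: the class and field names are our own, and the 45-day response window reflects the CCPA/CPRA’s standard deadline (extendable once by another 45 days), not any particular product’s behavior.

```python
from dataclasses import dataclass, field
from datetime import date, timedelta
from enum import Enum

class RequestType(Enum):
    ACCESS = "access"
    CORRECT = "correct"
    DELETE = "delete"
    OPT_OUT_SALE_SHARE = "opt_out_sale_share"
    LIMIT_SENSITIVE_PI = "limit_sensitive_pi"

@dataclass
class EmployeeDSAR:
    requester: str                # hypothetical employee or B2B contact identifier
    request_type: RequestType
    received: date
    verified: bool = False        # identity verification must precede fulfillment
    legal_reviewed: bool = False  # legal sign-off before releasing employee data
    systems_searched: list[str] = field(default_factory=list)  # e.g. HRIS, payroll, email

    @property
    def due(self) -> date:
        # CCPA/CPRA allows 45 days to respond, extendable once by 45 more
        return self.received + timedelta(days=45)

    def ready_to_fulfill(self) -> bool:
        # a response should not go out without verification and legal review
        return self.verified and self.legal_reviewed

req = EmployeeDSAR("emp-1042", RequestType.ACCESS, received=date(2023, 1, 9))
print(req.due)                 # 2023-02-23
print(req.ready_to_fulfill())  # False until verification and legal review complete
```

Tracking which systems were searched, and gating fulfillment on legal review, mirrors the discovery-like scrutiny employee requests call for.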

There is a lot to consider given the sensitivity of employee data.

You may not want to share your employee data with your privacy team. HR may want to take the lead. In either case, you definitely want to have legal look it over before you send out your DSAR response.

With employee data, there’s a much higher concern that this information could be prelude to a complaint or lawsuit which will entail challenges around possible legal holds and other factors.

—Sheridan Clemens, WireWheel

Many companies are going to choose to have HR manage these requests. There’s quite a bit of sensitive data that will be exposed and it makes sense to have an HR professional involved in shepherding the process forward. That said, if your HR team is going to be involved in processing DSAR requests, they absolutely need to receive specialized training.

However you choose to handle employee DSARs, you should have discussions with your legal team, privacy team, and HR team. Importantly, if you don’t have one, create an employee data classification policy and the governance roles around how that data is handled.

WireWheel has been a trusted partner in advancing data privacy capabilities with a full service offering to support these efforts. We have employee subject rights fulfillment as part of our DSAR package and routinely help businesses implement data inventory, mapping, and governance, managing privacy policies, PIAs, and high-risk processing impact assessments.

[1] WireWheel is not a law firm and does not provide legal advice. Any information or materials that WireWheel provides, including but not limited to presentations, documentation, forms, and assessments, are neither legal advice nor guaranteed to be accurate, complete or up to date.


Getting Ready for CPRA – Answers to Your Questions

During WireWheel’s webinar on preparing for CPRA, we received many questions concerning the details of what needs to be done to be prepared for January 1, 2023. Below are answers to some of those questions.


Service Providers and Assessments

All service providers, contractors, and third parties processing personal information on your behalf need to be contractually bound to CCPA/CPRA compliance.

Assessments can be used to identify as much information as possible for you to make a business decision based on the potential risk of working with a specific analytics provider. The goal of an assessment is to understand and remediate any risks associated with a specific processing activity. Once you do an assessment, if the scope of that activity changes or new legislation comes into effect, new assessments should be done for that processing activity.

Assessments are required for:

  • Risky data processing
    • Processing data that creates a significant risk to consumer privacy or security
    • Sensitive data
    • Minors
    • Targeted advertising
    • Selling/sharing personal information
  • Service providers, contractors, 3rd parties
    • Processing personal information
    • Data brokers
    • Adtech vendors

Note: It is unlikely that the larger providers such as Google will respond to your outreach based on the large volume of requests they receive. They do however post their standard PIA questions and data protection language on their websites for your review.

Automated Decision-Making

Automated decision-making or profiling means any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyze or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behavior, location or movements.

Topic: Consent Requirements

Under CPRA, consent is required for the following:

  • Re-Opt-In for Sale After Previously Opting-Out
  • Participation in Financial Incentive Programs
  • Sale or Sharing of Personal Information of Minors
  • Secondary or Additional Use of Data

Opt-out is required for:

  • Automated decision-making (Profiling)
  • Cross-Context Behavioral Advertising (Targeted Advertising)
  • Sale or Sharing of Personal Information
  • Use of Sensitive Data

No consent is required for:

  • Processing of Personal Data
  • Processing of Personal Data of Minors

Note: A pre-checked box is not considered express consent.
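Internally, teams sometimes encode rules like these in a simple lookup so application code can check the required consent model in one place. The sketch below is only shorthand for the lists above; the activity keys are our own labels, not statutory terms.

```python
# Consent model per processing activity under the CPRA, per the lists above.
# Activity keys are informal shorthand, not statutory terms.
CONSENT_MODEL = {
    "re_opt_in_after_opt_out":        "opt-in",
    "financial_incentive_program":    "opt-in",
    "sale_share_minor_pi":            "opt-in",
    "secondary_use":                  "opt-in",
    "automated_decision_making":      "opt-out",
    "cross_context_behavioral_ads":   "opt-out",
    "sale_share_pi":                  "opt-out",
    "use_of_sensitive_data":          "opt-out",
    "processing_personal_data":       "none",
    "processing_minor_personal_data": "none",
}

def requires_prior_consent(activity: str) -> bool:
    """True only for activities where CPRA requires affirmative (opt-in) consent."""
    return CONSENT_MODEL[activity] == "opt-in"

print(requires_prior_consent("sale_share_minor_pi"))  # True
print(requires_prior_consent("sale_share_pi"))        # False (opt-out model instead)
```

Remember that "opt-in" here means express consent: as noted above, a pre-checked box does not qualify.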

Topic: Sephora judgment and Global Privacy Control (GPC)

It is likely that the Sephora result would have been the same under CPRA. The key violations that the Office of the Attorney General (OAG) called out included not disclosing that they share/sell data, not honoring Global Privacy Control (GPC) as a pathway to opt-out and failing to remediate the violations in the given cure period (30 days). All of this will follow suit under CPRA.

The California OAG outlined that Global Privacy Control (GPC) signals must be honored under the CCPA as “Do Not Sell” requests. The California Privacy Protection Agency (CPPA) takes the approach that “Opt-out Preference Signals” generally must be honored as a “Do Not Sell/Share” request and/or a “Limit the use of My Sensitive Personal Information” request. Businesses that do this in a “frictionless manner” may choose to not include links for do not sell/share and limit the use of my sensitive data requests. A frictionless manner as described in the draft regulations means:

  • Not charging a fee or other valuable consideration, not changing the consumer’s experience with the product or service offered, and not displaying a notification, pop-up, text, graphic, animation, sound, video, or interstitial content in response to the opt-out preference signal
  • Including in its privacy policy that it recognizes opt-out preference signals in a frictionless manner
  • Ensuring the signal also effectuates opt-outs of any offline sales/shares

Note: The draft regulations do not address the technical specifications for opt-out preference signals.
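Concretely, a participating browser sends the Global Privacy Control signal as the `Sec-GPC: 1` request header (and exposes `navigator.globalPrivacyControl` to page scripts). A minimal server-side sketch of honoring that signal as a Do Not Sell/Share opt-out, consistent with the OAG and CPPA guidance above, might look like this; the function names and preference store are our own assumptions, not part of any spec:

```python
def gpc_opt_out_requested(headers: dict[str, str]) -> bool:
    """Return True if the request carries a Global Privacy Control signal.

    Per the GPC specification, a participating user agent sends the
    header `Sec-GPC` with the field value exactly "1" on each request.
    """
    return headers.get("Sec-GPC") == "1"

def apply_privacy_signals(headers: dict[str, str], user_prefs: dict) -> dict:
    # Treat a GPC signal as a "Do Not Sell/Share" request, as the
    # California OAG and the CPPA draft regulations require.
    if gpc_opt_out_requested(headers):
        user_prefs["do_not_sell_share"] = True
    return user_prefs

prefs = apply_privacy_signals({"Sec-GPC": "1"}, {})
print(prefs)  # {'do_not_sell_share': True}
```

Note the asymmetry: the absence of the header does not signal consent, so a missing `Sec-GPC` header should leave any previously recorded opt-out in place rather than reset it.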

WireWheel’s UPCP solution can persist a user’s choices across multiple channels, which can significantly reduce, or eliminate, the number of prompts shown to an end user.

Employee Data Subject Access Requests (DSARs)

The employee/HR exemption expires under the CPRA effective January 1, 2023. The expiring exemptions cover the personal information of job applicants, employees, owners, directors, officers, and independent contractors in the context of an individual’s employment or application for employment, and personal information reflecting written and verbal communications where a consumer is acting in a business-to-business commercial transaction. They also cover personal information collected by a business for emergency contact purposes and personal information necessary for a business to retain and administer employee benefits, provided the information is used only for those purposes.

Businesses must develop internal and external policies and procedures for accepting, verifying, and responding to employee requests to access, correct, and delete personal information collected on the employee. They also will need to analyze whether they are “selling” or “sharing” employee personal information and, if so, allow employees to opt-out of the same. Finally, businesses will need to consider whether they are collecting sensitive personal information as the CPRA defines the term and, if so, whether they must provide employees with the right to limit the business’ use of such sensitive personal information.

Employee access requests will prove to be especially sensitive and challenging as they can be a precursor to litigation. Businesses should treat any such requests like discovery requests in litigation and ensure that the information provided is limited to the statutory requirements, reflects a complete search of company records, and that any necessary redactions are made.

Additional things to look at include:

  • Generally the same rules apply to employee-based marketing as consumer-based marketing. All consents and preferences must be honored.
  • Based on the draft regulation language, independent contractors are likely to be considered part of your workforce. The CPRA does have a 12-month lookback on data; to date, there has been no specific information on whether it applies differently to employee data.
  • Workspace and work equipment browsing, texting, email, or any other repository of personal information will likely be in scope, even where it includes sensitive information, information about other employees discovered in the request, or information tied to litigation.
  • There is no specific guidance on whether a dedicated section or a separate privacy notice is required for employees.
  • It is possible that certain California employment laws may preempt the CPRA and that certain rights, such as opt-out of sale and limiting the use of sensitive information, may not apply in an employment context. This will need to be contemplated and clarified by the California Privacy Protection Agency.
  • If HR Diversity, Equality & Inclusion (DEI) surveys include personal information that “infers characteristics,” they would likely be in scope.

Enforcement Scope

The CPRA goes into effect on January 1, 2023; enforcement begins July 1, 2023. The CCPA, however, is already in effect and being enforced. Under the CCPA there is currently a 30-day cure period after notification of a violation in which to remediate; this cure period will sunset under the CPRA. The CCPA will be enforced by the California Office of the Attorney General (OAG) until December 31, 2022, and by the California Privacy Protection Agency (CPPA) effective January 1, 2023.


CPRA vs. GDPR

The CPRA requires opt-in consent for a few specific use cases. However, for many companies, there will be differences between their CCPA and GDPR compliance plans. They may have different systems, processes and teams involved with the collection and processing of data between the US and the EU. It’s recommended that privacy teams analyze their data flows and their ability to support requests from data subjects, particularly in the employee space given the upcoming requirements in 2023.

Sensitive Personal Information (PI)

The CPRA defines “sensitive personal information” as personal information that reveals (a) a consumer’s Social Security or other state identification number; (b) a consumer’s account log-in, financial account, debit card, or credit card number in combination with any required security or access code, password, or credentials allowing access to an account; (c) a consumer’s precise geolocation; (d) a consumer’s racial or ethnic origin, religious or philosophical beliefs, or union membership; (e) the contents of a consumer’s mail, email, or text messages, unless the business is the intended recipient of the communication; and (f) a consumer’s genetic data.

We believe that this includes:

  1. The last four digits of the Social Security number
  2. IP addresses and DeviceIDs
  3. Biometric information from CCTV images if used in the context of facial recognition or other identification purposes

Targeted/Cross-Contextual Advertising

Targeted advertising is a type of advertising whereby advertisements are placed so as to reach consumers based on various traits such as demographics, purchase history, or observed behavior. Cross-contextual behavioral advertising may leverage automated decision-making, but the two are not synonymous: automated decision-making also includes activities such as making creditworthiness decisions, profile-based pricing, or certain aptitude tests.



Any information or materials that WireWheel provides, including but not limited to presentations, documentation, forms, and assessments, are neither legal advice nor guaranteed to be accurate, complete or up-to-date.

Participants are encouraged to seek the advice of licensed attorneys regarding any legal compliance or other legal matters related to the matters discussed or presented in this webinar and the related materials. The information and materials provided in this webinar are not intended as legal advice, and participants should not rely on them as such.


Privacy Law Update: October 3, 2022

Stay up to date with this weekly release covering key developments on data privacy laws, technology, and other hot privacy topics!

Newsworthy Updates

White House Executive Order On Trans-Atlantic Data Privacy Framework Imminent

U.S. President Joe Biden is expected to publish an executive order concerning a new agreement on EU-U.S. data flows as early as Oct. 3, Politico reports. According to individuals involved in negotiations, the order will cover new legal protections over personal data access and use by U.S. national security entities. Principles for necessity and proportionality in relation to government surveillance activities are included in the order. Once the order is published, the European Commission will begin a ratification process that could take as long as six months to complete.

Why Closer Collaboration Between CPOs and CISOs Benefits Everyone

If we’re to be more proactive in identifying and preventing privacy and security risks, CPOs and CISOs must work together now more than ever. Security teams can’t protect personally identifiable information (PII) like names, Social Security numbers, home addresses, phone numbers and personal email addresses if they don’t understand what and where the information is; and privacy teams can’t exist in a company without the security controls in place to protect PII.

California Passes Stringent Kids’ Privacy Rules

Continuing its push as the nation’s first-mover on privacy, California has passed a bill that will require potentially significant new privacy commitments from online services that are “likely to be accessed” by children under 18. Covered companies have until July 2024, when the law takes effect, to assess their practices and come into compliance. In addition, implementing regulations due in January 2024 will give specifics on compliance.

CPPA Board Chair Doubles Down On Proposed American Data Privacy And Protection Act Opposition

In an op-ed for The San Francisco Chronicle, California Privacy Protection Agency Board Chair Jennifer Urban reiterated the agency’s position on how the proposed American Data Privacy and Protection Act would “undermine” Californians’ privacy rights and businesses’ “ability to confidently invest in more privacy-protective practices.” Urban said companies “may be understandably confused about how to invest if Congress overturns this existing guidance” under the California Consumer Privacy Act. She also noted how federal preemption would discontinue states’ ability to “experiment more nimbly” with legislation and react to emerging trends.

Data Privacy Can Give Businesses A Competitive Advantage

Data privacy isn’t just about compliance; it is turning into a marketing and operational advantage for many businesses. Staying GDPR compliant gives companies an edge over rivals: they forge more trusting customer relationships, which they expect will deepen loyalty and drive up the bottom line. Complying with the General Data Protection Regulation (GDPR) is a challenge, but strong data privacy opens the opportunity for advantages over the competition, such as improved customer loyalty and more efficient operations. The negative headlines around GDPR, such as Amazon’s fine earlier this year, the largest issued of its kind to date, can encourage businesses to see compliance as a burden. The truth is, it can be an opportunity to win and retain customers if you can turn respect for consent and protection of privacy into competitive differentiators.

Privacy Legislation

California: On Friday, September 23, the California Privacy Protection Agency held a board meeting to discuss various administrative and rulemaking topics. As expected, there was no announcement on delaying either the CPRA’s enforcement or effective dates; however, Board Member Le suggested that (1) the Agency could request from the legislature the ability to provide more direct guidance to businesses (without running afoul of restrictions on ‘underground rulemaking’), or (2) the Agency could promulgate a regulation expressly recognizing that a delay in finalizing the regulations is a “factor that the Agency may consider” when deciding whether to initiate an enforcement action or offer an opportunity to cure. [Note that the California legislature is currently out of session]

There was also significant discussion of the rulemaking process, particularly the procedural complications and hurdles that will be raised in the interaction of both the California APA and the Bagley-Keene Open Meeting Act. Executive Director Soltani urged the board to give a strong signal on the timeframe for meetings to advance the draft regulations, mentioning October and November (suggesting to us that the Agency may still hope to finalize initial draft regulations by end of year). Soltani further stated that staff is “burning the candle at both ends” working on the rules and that “there will likely be quite a number of changes [to the draft regs] in response to comments.”

The CPPA has also posted the public comments that it received in response to its initial draft implementing regulations. There are 102 total comments spanning well over 1,000 pages.

Michigan: On Tuesday, September 27, Senator Bayer (D) and 8 Democratic co-sponsors introduced SB 1182, the “Personal Data Privacy Act.” While this comprehensive privacy bill generally follows the ‘Virginia-model’, it includes a data broker registry and provides for a private right of action that, similar to ADPPA, would require prior written notice to the party alleged to be in violation. While the bill is unlikely to move this late in the session of a Republican controlled chamber, we are interested to see whether it represents a new trend of state privacy proposals incorporating elements from ADPPA.

New York State: On Friday September 23, Senator Gounardes (D) introduced S9563, “The New York Child Data Privacy and Protection Act”. While the Act contains some similarities to the recently enacted California Age Appropriate Design Code, it would go much further in numerous respects, including:

  • Not requiring age estimation, but instead applying to all “online products” targeted towards (accessible to and used by) child users.
  • Requiring an expansive risk assessment for any new online product (including services, features, or platforms) be submitted to, and approved by, the state AG before the product can be made available to consumers.
  • Empowering a new AG Bureau to ban autoplay videos, in-app purchases, push notifications, prompts, or other features for particular products that it chooses.
  • Requiring online products to prioritize civil and criminal subpoenas and criminal warrants when a child user has been a victim of a crime.
  • Creating a private right of action.
