Practical Tips for Building your Privacy Operations

Privacy is a dynamic, cross-functional discipline that, no longer confined to the arcana of lawyers and cybersecurity experts, underpins nearly every aspect of the employer-employee and business-consumer relationship. Nearly everyone in the organization has a direct or privacy-adjacent role: marketing, technology, HR, research, engineering, sales, product development, legal, and more.

Codifying and operationalizing privacy at this scale – regardless of the size of your organization – requires talent. And the competition for that talent is fierce. In short, the privacy field is booming, and demand far outstrips supply.

The resulting mobility among privacy professionals means that many who are responsible for operationalizing privacy are new to their companies. They face the dual challenge of acclimating to a new culture while delivering near-term improvements to privacy operations and laying the foundation for long-term maturation.

On January 26, WireWheel CPO Rick Buck, a privacy veteran with decades of experience, moderated an IAPP webinar, Practical Tips for Building Your Privacy Operations.

Rick was joined by Rebecca Shore, who recently joined Albertsons Companies as VP and CPO, and Eric Paulson, Manager of Advisory Services, Cyber Risk Privacy and Data Protection at Grant Thornton LLP.

Paulson presented the concept of “Privacy Assurance” in operationalizing privacy, followed by practical tips from Shore on how to do this when you are new to a company that may not yet have a privacy program in place.

Privacy Assurance

“Privacy is not going to be the responsibility of one department or function within the company. It’s going to require cross-functional collaboration and buy-in from them to sustain compliance and manage risk,” says Paulson.

What Grant Thornton calls a Privacy Assurance Program – the second line of defense behind privacy operations – “provides a governance model,” he adds, noting that the three-lines-of-defense model – 1) privacy operations, 2) privacy assurance, and 3) audit – is adopted from the financial industry.

The role of privacy assurance is to monitor regulations and risks and set policy guidelines. It also provides a framework for the internal audit team to independently assess compliance.

In all of this, collaboration is going to be a key part. Working with the legal team, privacy can make sure the organization is meeting its obligations, setting the risk appetite, establishing roles and accountability for the framework, and leaving it to the business to align to those objectives and provide the customer experience.

The added benefit is a communication platform that builds and promotes privacy by design.

— Eric Paulson, Grant Thornton LLP

Privacy assurance in practice

To effectuate privacy assurance, Grant Thornton offers three pillars on which the program rests:

  1. A Privacy Control Framework: This is “how the organization understands, rationalizes, and operationalizes regulatory requirements…and align[s] to the company’s risk appetite.” (Here an organization can leverage existing frameworks like NIST or ISO.)
    Importantly, “business users are going to be responsible for knowing how to meet those objectives…. Business owners don’t really want to be told how to implement privacy,” opines Paulson. “They want to be provided with the main objectives and really work to define a process that’s going to work best for them.”
  2. Compliance Monitoring: It’s critical that both privacy and the business operations team are continually reviewing requirements [against] their business objectives and a) making sure that all those controls are still [functioning properly] and b) the control objectives are being met. “One way to support that is through a self-assessment process.” And, “a critical piece is going to be evaluating emerging regulations…to quickly identify gaps.”
  3. Key Risk Indicators (KRIs): “KRIs help you evaluate the program and look for maturity opportunities,” notes Paulson. “These could be the number of DSARs coming in or the number of PIAs that need to be completed,” for example. They help make sure the program stays “within compliance and the objectives that the organization has set.” (A minimal sketch of this idea follows the list.)
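Paulson’s KRI examples can be thought of as simple threshold checks. The sketch below illustrates that idea in Python; the metric names and thresholds are hypothetical illustrations of ours, not part of Grant Thornton’s framework.

```python
from dataclasses import dataclass

@dataclass
class KRI:
    """A key risk indicator with a tolerance threshold."""
    name: str
    value: float      # current observed value
    threshold: float  # maximum value the program tolerates

    def within_tolerance(self) -> bool:
        return self.value <= self.threshold

# Hypothetical KRIs echoing Paulson's examples
kris = [
    KRI("open_dsars", value=42, threshold=50),
    KRI("pias_awaiting_completion", value=12, threshold=10),
]

for kri in kris:
    status = "OK" if kri.within_tolerance() else "REVIEW"
    print(f"{kri.name}: {kri.value}/{kri.threshold} [{status}]")
```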

Importantly, KRIs help support the business case for program budget, investment, and maturation.

In short, opines Paulson, “understand your regulatory compliance needs, outline your responsibilities…and don’t hinder your program by developing a framework that is very singular. Make sure it can be modified easily for impending regulations.” Then, you can begin to “identify improvements,” “embed privacy assurance…and enhance through automation.”

The first 30, 60, and 90 days

Albertsons’ Shore (who notes that she is speaking on behalf of herself, not Albertsons) relates that she has just gone through a transition herself:

“It’s a pretty big shift to come into a new organization. I think the most important thing – and it’s really hard to do in the first 30 days, particularly because you’re just really wanting to run – is to learn the company and the culture,” which can be very different.

You’re walking in and either have an existing program or you’re the first privacy person they’ve ever had in house. Whether you’re an attorney or not, that’s a pretty big shift.

[If there is an] existing team, you’re listening, learning, and trying to understand your team. If you’re leading a program, the really key part during that listen and learn piece is not necessarily jumping on change yet. You want to understand how to make change and where it’s meaningful.

You’re building out your brand and your persona within your organization.

—Rebecca Shore, Albertsons

The 30-day plan

“You don’t immediately dive in when you have an existing program,” continues Shore, “and risk earning a reputation as someone who’s not taking the time to understand how the culture and systems work.”

And when there’s no privacy program at all?

“My first guidance is don’t panic. It’s okay. A lot of companies, particularly in the United States, may not necessarily have an embedded team,” says Shore, who recommends meeting with the cyber teams, who likely have protocols you can build on.

“If you’re in an existing program, you probably have a charter, a mission, and values, while in a new program, you likely don’t have those things. So, start to think about what you want your program to be.

“You may be in a position where you have no budget or resources when you walk in…and I would say don’t have [that discussion yet], but start to prepare for your resource discussion in the first 30 days.”

The 60-day plan

“At 60 days you’re gaining momentum. This is when you start to socialize your direction and gain buy-in,” says Shore.

“The Privacy Assurance Framework is what this looks like. How do you develop pieces of it that allow for changes in laws? You might see a particular point of view that you want to start to implement within the organization.”

I recommend you start thinking through branding. [It signals] the type of approach privacy will take to address change.

Then implement the small changes that have a big impact. Lay the foundation for the push forward. You have a lot of work to do to get buy-in across the board.

—Rebecca Shore, Albertsons

  • Existing program: Socialize your direction and gain buy-in.
  • No privacy program: Create the program’s values and socialize. Prepare for the significant influx of questions.

“What kind of privacy program do you want to instill within the organization? Do you want to have a value associated with transparency and the ethical use of data? Is your company ready to embrace that concept? And how do you start to stand behind that?”

  • No budget or resources: Capture metrics (KPIs and KRIs) for those resource discussions.

“You may not have that budget or resources conversation within the first 60 days. But you want to be prepared. You want to start building the foundations of what your deck is going to be. And as you hit 90 days, you’re going to outline your strategy.”

The 90-day plan

If you have an existing program, you’re going to have strategic planning sessions with your team. If you don’t have a privacy program, I recommend you do a below-the-line exercise. Outline what’s possible in six months to a year. You’re not doing a three-year roadmap. You’re not ready. That’s just not where your program is or where your organization is. But you can solve for what the bigger projects are for six months to one year.

—Rebecca Shore, Albertsons

“And if you don’t have existing technology (which is likely if you don’t have a privacy program),” suggests Shore, “you’re going to start to evaluate privacy technology vendors. Plan for your future technology enhancements, because you’re not going to be using Excel or MS Word documents for the next 20 years.

“At 90 days you need to be prepared for the resource conversation. It might be a few months before you feel like it’s the right place to have it, but your deck for resources should be complete.” To do this you must have metrics.

Key Takeaways for the first 90 days

  • Focus on the most impactful thing that you can do to help in the near term.
  • Build out the concepts that you can use in the future in those foundational first days.
  • Understand, prioritize, and distinguish the “must-haves” from the “nice-to-haves.” And understand what you “can have”: what do you really need now, next month, next year…?
  • Understand what is truly achievable.
  • Establish your cross-collaboration “governance group.” One or two people alone can’t make it happen.
  • Gather the KPIs and KRIs and operationalize those important key controls to start building out your program.
  • Understand the company culture, and over time, the data flows, systems, and processes.


Privacy Law Update: February 14, 2022

Stay up to date with this weekly release covering key developments on data privacy laws, technology, and other hot privacy topics!


Newsworthy Updates

FTC Sees Growing Pressure for Data Privacy Rule as Pick Stalled

The Federal Trade Commission is under mounting pressure to kick off long-anticipated federal rules protecting personal information collected by companies amid delays with President Joe Biden’s pick to fill the deadlocked agency’s vacant seat. Advocacy groups are looking to the commission for policy action amid years of stalled negotiations in Congress over a national privacy law and a growing patchwork of state laws imposing new compliance obligations on companies.

3 Data Privacy Trends To Watch In 2022 And Beyond

After years of consumers happily surfing the web and using apps with reckless abandon, data privacy became a “thing” in 2021. Thanks in part to Apple’s efforts to provide consumers with greater knowledge about and a greater say in how their data is used, people got a wake-up call when their favorite apps and websites started asking them to agree to cookies or to make their privacy preferences known — a reminder that companies track their online behaviors.

Following Austrian DPA, CNIL Rules Google Analytics Violates GDPR

Just weeks after the Austrian Data Protection Authority’s ruling that Google Analytics use violates the EU General Data Protection Regulation, France’s data protection authority, the Commission nationale de l’informatique et des libertés, has reached a similar decision. The CNIL ruled data collection and transfers to the United States using Google Analytics “are illegal,” also stating its investigation extends to other tools that result in the transfer of data to the U.S. IAPP Staff Writer Jennifer Bryant has the latest reaction to the decision.

A Turning Point For Privacy Laws In Israel

Israel is recognized around the world as an important technology hub. It is home to hundreds of large, multinational tech corporations operating substantial local research and development, sales, and management activities. It is the incubator for thousands of startups, dozens of which are publicly traded unicorns. And it boasts an environment where innovation is fostered, growth achieved and the commerce of modern industries flourishes on a global scale. Because data is a core asset in many tech ventures, Israel is also a jurisdiction worth monitoring from a data protection perspective — and 2022 will likely be a turning point for privacy laws in Israel.

Privacy Legislation

This week four new comprehensive privacy bills (Arizona, Connecticut, Wisconsin, and Iowa) were introduced. There are now approximately 40 pending bills across 23 states.

Arizona: On February 7, Representative DeGrazia and five other Democratic sponsors introduced HB 2790. This fairly distinctive privacy bill includes consumer rights to “restrict processing of personal data” and “object to the processing of personal data,” and would place substantive limits on certain profiling decisions. The Act disclaims any private right of action arising under its terms.

Connecticut: On February 9, SB 6, an act “Concerning Personal Data Privacy and Online Monitoring,” was introduced by 19 senators. Committee staff informs us that formal language has not yet been added to the bill but will come soon. We understand that the forthcoming bill is the product of Senator Maroney’s (D) summer working group with stakeholders.

Iowa: SF 2208 was introduced by Sen. Nunn (R) and HSB 674 was introduced by Rep. Lohse (R) on February 8. On February 9, the House bill received a favorable Information Technology Subcommittee recommendation by a 2-0 vote. These bills are essentially the VCDPA.

Wisconsin: SB 957 was introduced on February 9 by 4 Republican Senators. This bill is essentially the VCDPA and already has a House companion (AB 957). Separately, SB 977 was introduced on February 9 by 5 Democratic Senators. This is a CCPA-style bill.

 


 


Privacy Law Update: February 7, 2022

Stay up to date with this weekly release covering key developments on data privacy laws, technology, and other hot privacy topics!


Newsworthy Updates

Data Remediation and Its Role in Data Security and Privacy

Optimized data management and data security are critical within any organization. However, given the sheer amount of data that businesses collect daily, this is easier said than done. When data accuracy, quality, storage and security suffer, it can lead to poor decision making, data breaches and non-compliance issues. This is where data remediation becomes necessary, as it helps businesses clean up, organize, and efficiently move their data to a secure and clean environment.

Privacy budgets rise as businesses see consistent ROI

Companies believe effective privacy management improves trust and transparency and provides a return on investment, scotching the notion that data protection is a compliance burden and additional cost. According to technology vendor Cisco’s “2022 Data Privacy Benchmark Study,” published Jan. 26, 83 percent of the more than 4,900 security professionals globally who responded to the survey said privacy laws have had a positive impact on their business. Another 90 percent said they would not buy from an organization that does not properly protect its data, while 91 percent indicated external privacy certifications are important in their buying process.

Belgian DPA fines IAB Europe 250K euros over consent framework GDPR violations

The Belgian Data Protection Authority fined IAB Europe 250,000 euros Wednesday, ruling its Transparency and Consent Framework, used by much of the advertising industry in the European Union, does not comply with several EU General Data Protection Regulation provisions. Through data processing under the TCF, which “facilitates the management of users’ preferences for online personalised advertising,” the DPA found IAB Europe acts as a data controller and can be held responsible for potential GDPR violations. The authority also ruled IAB Europe did not establish a legal basis for processing and failed to appoint a data protection officer, conduct a data protection impact assessment, or maintain a register of processing activities. The DPA also argued it is difficult for users to “maintain control over their personal data” under the framework, as the information provided is “too generic and vague to allow users to understand the nature and scope of the processing.”

E-Commerce Businesses: Five Data Privacy Practices For Gaining Customer Trust In 2022

Virtual retail platforms have experienced impressive growth over the past few years, and the pandemic moved many more shoppers online. In 2020, U.S. consumers spent more than $790 billion online, representing a staggering 32% year-over-year increase. While we could easily assign these changes to the Covid era, there’s plenty of reason to believe they signal a more permanent development. Those who have learned to shop online are likely to continue doing so, at least to some degree. According to McKinsey, “Consumer intent to shop online [post-pandemic] continues to increase.”

Privacy Legislation

Alaska: The “Consumer Data Privacy Act” (HB 159 / SB 116), a CCPA-style bill with a potential ‘backdoor’ private right of action introduced in 2021 at the request of Governor Dunleavy, has slowed down. Following industry opposition and an announcement from Labor & Commerce Committee Co-Chair Fields (D) that he is considering an amendment that would limit the scope of covered entities, multiple scheduled hearings on the bill were either cancelled or addressed other matters. A House Judiciary Committee hearing is currently scheduled, pending referral, for February 7.

Hawaii: The “Hawaii Consumer Privacy Act” (HB 2051) received a hearing in the Committee on Higher Education and Technology on February 2nd. Following testimony in opposition from several industry groups, the Committee recommended ‘deferral,’ which we understand to mean that the bill is likely dead for this session. The 112-page bill had appeared to closely follow the CPRA but without any private right of action. There are still several live VCDPA-style bills pending in the Hawaii legislature.

Indiana: SB 358 passed the Indiana State Senate on February 1 by a 49-0 vote. The Act very closely tracks the VCDPA but includes a couple of wrinkles such as adding protections for trade secrets and an explicit carve out for “aggregated” data. The Act would also give businesses discretion to provide a “representative summary” of data in response to consumer access requests and limit the requirement to provide information in response to DSARs to once per year. A late amendment to the bill further removed the “Consumer Privacy Fund” for financing AG enforcement and corrected multiple instances where the bill referred to “HIPPA.”

Kentucky: SB 15 picked up a second co-sponsor, Senator Alvarado (R). The Act appears informed by the VCDPA/CPA frameworks but contains distinctions such as consumer rights to opt out of “tracking,” unique consent standards, and transparency obligations for the locations where data will be stored by third parties. The Act would also create a limited injunctive private right of action for particular violations.

Massachusetts: The Boston Globe reports that the “Massachusetts Information Privacy and Security Act” passed through the Joint Committee on Advanced Information Technology, the Internet and Cybersecurity on February 1. The Committee’s website has not been updated, but we are attaching a link to what we understand to be the latest text (S.46/H.142) at the bottom of this email. Many unique elements of the Massachusetts proposal (such as annual opt-in consent requirements, a broad private right of action, and ‘duty of loyalty’) appear to have been removed. The Act now trends towards CPRA-style consumer rights, Colorado-style business obligations, and a unique approach to designating “lawful bases” for processing information. FPF is continuing to analyze the latest text and welcomes input.

Nebraska: LB1188, a version of the ULC’s “Uniform Personal Data Protection Act” introduced by Sen. Flood (R) has been scheduled for a hearing on February 28.

Virginia: Multiple proposed amendments to the VCDPA are marching through their respective committees:

  • Amendments to allow controllers that collect consumer data indirectly to treat deletion requests as opt-out requests (HB 381 / SB 393) have passed two House committees by votes of 8-0 and 21-0 and a Senate General Laws and Technology Subcommittee by a 14-0 vote.
  • An amendment from the original sponsors of the VCDPA (HB 714 / SB 534) that would add “political organizations” to the nonprofit exemption; allow an opportunity to cure only where “deemed possible” by the AG; permit the AG to recover “actual damages” sustained by consumers; and replace the “Consumer Privacy Fund” with the existing “Revolving Trust Fund” passed a House subcommittee by an 8-0 vote.
  • An amendment to exempt 501(c)(4) organizations from the VCDPA (SB 516 / HB 552) passed the Senate Commerce and Labor Committee by a 15-0 vote.
  • An amendment (HB 1259) that would provide that “sensitive data” under the Act “shall only be considered sensitive data if used to make a decision that results in a legal or similarly significant effect for a consumer” has been assigned to a House Commerce and Energy subcommittee.

Washington: The Washington Foundational Data Privacy Act (HB 1850) from Reps. Slatter (D) and Berg (D) received a second hearing in the House Civil Rights & Judiciary Committee on February 2. The Committee narrowly voted (9-8) to refer a substitute bill to the Appropriations Committee. Significant changes in the substitute bill and amendments adopted during the committee hearing include:

  • Expands the definition of targeted advertising to include use of data from affiliate websites.
  • Modifies the right to correct to remove discretion for controllers to take into account the nature and purposes of data processing.
  • Directly roots the bill’s private right of action in Washington’s consumer protection act.
  • Delays the effective date for many of the act’s provisions until July 31, 2023.
  • Directs the Consumer Data Privacy Commission to undertake rulemaking on specific topics.
  • Clarifies the division of enforcement activities between the Consumer Data Privacy Commission and the State AG.

In short, HB 1850 would create opt-out rights over targeted advertising, data sharing, and profiling, which may be exercised by user-enabled global privacy controls. The bill would further require annual registration of covered entities and data protection assessments, and create a Consumer Data Privacy Commission (with rulemaking authority).

Wisconsin: On February 3, Rep. Zimmerman (R) introduced AB 957 along with 20 Republican and 1 Democratic cosponsors. This bill is essentially the VCDPA.


Privacy Law Update: January 31, 2022

Stay up to date with this weekly release covering key developments on data privacy laws, technology, and other hot privacy topics!


Newsworthy Updates

The origins and purpose of Data Protection/Privacy Day

Friday, Jan. 28, is Data Protection Day (in the EU) or Data Privacy Day (in the U.S.). But how did this day originate? The day was initiated by the Committee of Ministers of the Council of Europe in 2006. Often confused with the EU, the Council of Europe formed in the wake of World War II. Resolutions from the Council of Europe on the privacy of individuals preceded and even served as a partial catalyst to the earliest national data protection laws in Austria, Denmark, France, Germany, Luxembourg, Norway, and Sweden.

IAPP launches ‘Global Privacy Law and DPA Directory’

To stay apprised of the hundreds of data protection laws that exist across the globe, the IAPP Westin Research Center announces the creation of a single indexed resource of these laws — the “Global Privacy Law and DPA Directory.” The interactive map identifies those countries with data protection laws and within each country’s listing, you can link to a resource containing the data protection law, the data protection authority and relevant IAPP resources (if available).

EDPS reflects, looks forward prior to Data Privacy Day

European Data Protection Supervisor Wojciech Wiewiórowski wrote a blog post discussing the state of EU data protection ahead of Data Privacy Day 2022 celebrations 28 Jan. Wiewiórowski noted steps forward through Convention 108 and the EU General Data Protection Regulation, but added there is “still more to strive for, to achieve for data protection to work for all of us in Europe’s digital future.” Notably, he called on EU data protection authorities to “work together to promote the consistent application of data protection rules according to the EU’s values and principles.”

#DataPrivacyWeek: Prioritize Data Protection to Safeguard Consumer Privacy

The collection and use of personal data have grown at an unprecedented rate in recent years, accelerating even faster during the pandemic amid the digital shift. Heather Paunet, senior vice president at Untangle, noted: “In today’s connected era, people disclose personal data during dozens of daily interactions, from online shopping, healthcare portals, social media, wearable devices to streaming services. This data is used to create profile-specific experiences across a multitude of devices and mediums, resulting in personalized, effective marketing campaigns.” Unfortunately, this information is also viewed as highly valuable by those with nefarious intentions, from cyber-criminals motivated by financial gain to governments wishing to use this data as a means of surveillance and control.

Data Privacy Week: Organizations see strong ROI on privacy spending, says Cisco

“Privacy’s Return on Investment (ROI) remains high for the third straight year, with increased benefits, especially for small- to medium-size organizations and higher ROI for more privacy-mature organizations,” says the report, which was based on a survey of 4,900 respondents in 27 geographies who indicated they are familiar with the privacy processes at their organizations. Asked to estimate the financial value of the benefits from their privacy investments, the average estimate was up three percent from US$2.9 million last year to US$3 million in 2021.


Privacy Predictions for 2022

Closing the latest – always well-attended – biannual Spokes Technology Privacy Conference, held December 7-8, 2021, WireWheel CEO and Founder Justin Antonipillai brought together leading privacy professionals to offer their privacy predictions for 2022.

The panelists obliged Antonipillai, often humorously. But more than a good-natured placing of bets, predictions can offer insights into what is top of mind for those who grapple with privacy in practice, theory, and importantly, in their efforts to position their brands, clients, and their thinking, for the future.

This diverse panel included:

  • Jocelyn Aqua, Governance, Privacy and Ethics Leader, PwC;
  • Stacey Gray, Senior Counsel, Future of Privacy Forum;
  • Andy Dale, General Counsel & Chief Privacy Officer, Alyce; and
  • Omer Tene, Partner, Goodwin (formerly IAPP VP and Chief Knowledge Officer)

The round-robin style session can be viewed here.

Privacy Predictions Round 1: Quick hits

Stacey Gray: In 2022, we will not see a federal comprehensive data protection law passed in the U.S.

Andy Dale: There will be multiple privacy tech IPOs and more privacy tech vendors will enter the space. There will be some consolidation and M&A as well.

Jocelyn Aqua: This is the year that India will have a national data protection law. (Even though every year they think they’re going to, this is the year I think they are.) Between that and China…it will impact a lot of companies that have offices in – or do business with – both of those countries.

Omer Tene: The biggest privacy penalty next year is going to come from the United States. And, if you want to hear whether it’s going to be from the FTC, a State Attorney General, a California agency, or from a BIPA class action, wait for my second prediction.

Justin Antonipillai: In 2022 we will have our first Federal Chief Privacy Officer in the United States.

Privacy Predictions Round 2: Some bold predictions

Cookies are dead when they’re truly, actually, really, finally, totally, dead.

—Andy Dale, Alyce

Stacey Gray: Apple will release iOS 16 and it will deprecate the IDFA and ban behavioral advertising – only for third-party apps. (I threw that one out there to be provocative.) I’m not sure about it, but I do think we’ll see interesting iOS 16 updates.

Andy Dale: Google will further delay cookie deprecation. There are parts of the ecosystem that are not ready yet. We’ve had this conversation with lots of different people, and I believe cookies are dead when they’re truly, actually, really, finally, totally, dead. And I just don’t see that yet.

We can talk about the GDPR and how the GDPR brought different kinds of data into the definition of personal data, but at the end of the day, a device ID isn’t personal in the same way.

Jocelyn Aqua: There’s going to be more churn in the EU between the ePrivacy Directive (which hopefully will become a regulation at some point), the digital markets and digital services acts, and especially the Data Governance Act, where I think there will be a data transfer mechanism for non-PII where there are going to be adequacy needs.

There are going to be lots of other swirls over the next year, and we’re still trying to flesh out what they will mean.

I think data governance is moving ahead, it looks like the ePrivacy regulation is moving ahead, and AI regulation is getting a lot of traction, making companies in the United States nervous. There’s a lot to think about, and a lot of extraterritorial impact for the U.S.

Omer Tene: I may be proven wrong within a couple of weeks, but I say Israel is going to be the first country to lose its adequacy decision.

Israel was deemed adequate about a decade ago. At the time we worked with the government on the application, and we made certain commitments that we haven’t delivered on, and [since then] the requirements were also tightened via Schrems I and II.

When we negotiated adequacy, the government access issue wasn’t even considered, certainly not front and center, like it is today. And as you know, much like the United States, Israel has a very robust national security apparatus where these issues do come up.

Justin Antonipillai: My second prediction is that California will be the first state to be declared adequate under the EU policy. They’re not going to complete the finding in 2022 but here’s how we get there:

  1. You can be a region and be found adequate (it doesn’t need to be a country).
  2. I have confidence that the Europeans and the U.S. are going to reach a new Privacy Shield, so there will be an adequacy finding as to the U.S.
  3. This U.S.-EU transfer mechanism will enable California to seek adequacy because the national security issues will not be on the table.
  4. California will have a very strong argument between CCPA and CPRA that they’re an adequate regime.

“In that case, I retract my last prediction! If California is adequate, Israel is double adequate!” jokes Tene.

Gray, striking a more serious tone, opines that just on the merits, the CCPA and CPRA are not great models. They’re not very strong laws, and they lack individual redress for most privacy-related cases.

Final 2022 Predictions

Now it’s the Metaverse.

—Omer Tene, Goodwin

Stacey Gray: I think we will see five or more additional States passing comprehensive data protection laws. But I don’t think any of them will significantly diverge from current frameworks or include a private right of action.

  1. Maryland
  2. Oklahoma
  3. Ohio
  4. New Jersey, and wild cards
  5. Alaska and Florida

Andy Dale: My current customer DPA [data processing agreement] with SCCs attached as exhibits is 38 pages. In 2022 it will go up.

Jocelyn Aqua: There’s been a lot of activity at the SEC on cybersecurity, and the CFPB is making so many inquiries; I think there are just going to be a lot of enforcement actions and decisions – all privacy adjacent or directly on privacy – that will impact our companies.

The inquiries are from so many different organizations and federal government agencies. All of them ask about AI and data brokers; there are actions against big tech and a focus on privacy for disadvantaged people; lots of movement, even if there’s no federal privacy law. And it’s not just data breaches. It’s data misuse and lack of transparency. Both here and in Europe.

Omer Tene: The buzzword of the day in our field changes every couple of years. Big data was the theme about a decade ago; then cloud; then AI; now it’s the Metaverse.

[With tongue firmly planted in cheek, Tene predicts that] at some point in 2022 the most popular application in the metaverse will be an immersive privacy policy where you can step into the privacy policy, engage with the standard contracts and the DPA and Andy Dale’s “38 pages” and fall asleep calmly.

Andy Dale: There is a need for innovation at the point of collection. And at the point of meeting the consumer. There’s very little innovation in that conversation with the consumer…I do think there’s a lot of room for innovation in how companies talk to consumers about privacy.

“I really hope that in 2022 we have some of that: Beyond the scope of the CCPA asking you to make the privacy policy more readable or more accessible.”

Justin Antonipillai: There’s going to be a big Supreme Court decision this year, or cert granted for the following year, that effectively guts the ability of the FTC to enforce privacy laws under its unfairness authority. Even before its current makeup, the Court was very skeptical of unfairness authority: there have been a number of decisions undermining it.

It will put Congress into a position where they either have to grant actual enforcement authority for something in privacy or block the FTC from doing it.

¹ Quotation marks have been omitted and comments lightly edited for readability.


Knowledge Creation and Data Protection

How an enterprise data strategy enables both

“When we talk about this idea of knowledge creation, we’re really focused on what it means to do research in a commercial or corporate setting, as distinct from scientific research typically conducted in academia,” says Barbara Lawler, COO and Senior Strategist of The Information Accountability Foundation (IAF).

Importantly, “when we think about it in a data protection context, very often data protection and privacy laws don’t specifically address, in an effective or practical way,” what we mean by knowledge creation.

Lawler moderated a discussion during the December 2021 Spokes Technology Privacy Conference concerning the tension between the importance of knowledge creation, or “thinking with data,” and the concepts of data privacy and security. The conversation, Knowledge Creation and Data Protection: How an enterprise data strategy enables both, included JoAnn Stonier, Chief Data Officer, Mastercard, and Martin Abrams, Executive Director and Chief Strategist of the IAF.

Thinking with data vs. acting with data

Simply put, in the arena of data privacy and protection, knowledge creation refers to the process of using data, including data pertaining to people and their actions, to create new insights. However,

Commercial research isn’t usually considered an explicit legal basis or legitimate interest for data processing. And while scientific research has some support in the law, it is subject to differing interpretations. But it tends to resolve down to a pretty narrow interpretation of what happens in an academic or university setting, conducted by specific individuals with oversight by some type of review board process.

—Barbara Lawler, IAF

But as Lawler notes, “If knowledge creation is about thinking with data or discovering insights in a commercial or corporate setting, knowledge application is the next step: taking action.” For example: developing a scoring system is thinking with data. Applying that score – for lending or hiring decisions – is acting with data.

The risk profiles are different. And there is friction when policymakers and implementors are not clear on the differences. (A lot of the debate we are seeing is about acting with data more than thinking with data.)

Data protection and data ethics

Mastercard’s Stonier observes that “regulations don’t come at these issues head-on” and as organizations innovate, they “are pretty much left on their own to figure out what is ethical, responsible data innovation. What does that look like from a practical application perspective?”

Data governance and data responsibility principles are really the backbone of our program at Mastercard, and from there we develop our principles and guidelines, and ultimately our practices and controls. Then, in the product development process, we consider how we apply those as we actually build out products and solutions. We think about it as part of the design process.

—JoAnn Stonier, Mastercard

Ultimately, “thinking about what constitutes responsible data practices – as opposed to just privacy or security – comes down to what you are trying to effectuate,” opines Stonier. And “at the center is the individual. Almost everything that you do in an organization, even in a B2B2C company like Mastercard, the choices made are going to be impactful to the individual.”

Responsible data strategy combines effective data governance, scalable tools, and knowledge.

Responsible data strategy first principles

Mastercard begins with what Stonier calls “a set of first principles:”

  1. Individuals own their data;
  2. have the ability to control their data;
  3. have the ability to understand how their data is being used; and
  4. should be able to understand the benefits that they get from that data use.

“Individuals obtaining an understanding of the benefits of their data use is where I think it sometimes comes off the rails a little bit,” says Stonier. “I don’t know that all of us are transparent enough about how we’re using the data. And that’s the harder one.”

“We can try to have privacy notices, we can attempt real-time privacy notices, but that doesn’t always work. The question becomes, how do you incorporate transparency into a product design so individuals can have that understanding?”

Mastercard is known for privacy and security…but when you get beyond that it is really about being accountable for how we’re going to use the data and being willing to talk about it. To be transparent. And this requires integrity in the process – data quality, lineage, accuracy, completeness, consistency – and needs also to encompass innovation practices.

—JoAnn Stonier, Mastercard

“These principles really make a difference as you build practices for each different type of data set and each different context in which you use data. They must suffuse expertise, training, tools, and platforms, all the way through to the sales force. All of us have a role in innovation. We’re all at the table designing together. And when everybody’s part of a design team and you have diverse subject matter expertise, that makes a huge difference in getting to the right outcomes.”

The polarization challenge

IAF’s Abrams instructs on the genesis of this thinking and the current challenges. He notes that “a process took place beginning in 2009 with development – and organizations and regulators beginning to embrace – the essential elements of accountability based on active discussion. This reached a point where the concept of knowledge creation, thinking with data, and acting with data, not explicitly understood by individuals but essential for an innovative marketplace, was explicitly articulated.”

“Policymakers and regulators needed a sense that there was something that was beneficial to society and people that supports this concept of doing big data and the work in artificial intelligence.”

But the risks that come with thinking with data – such as data security or understanding there’s bias related to data – are very different from the risks of the activities related to applying the data.

There was this process to differentiate the risks of developing something that is not personally impactful from the risks of making decisions that are personally impactful.

—Martin Abrams, IAF

“Since 2018 we’ve seen a polarization in the way people think about data, which creates risks for organizations’ ability to develop and use data,” continues Abrams.

“Back in 2016 we would think about the individual’s place in this environment, and it was a question of knowledge and autonomy. It was about transparency and the ability to understand the decisions individuals were making. Today we see regulators who say folks don’t know how to do legitimate interest balancing. We won’t trust the private sector to do research. You have to have complete explicit consent in order to think with data.”

“It’s about a concept of individual sovereignty that is different from the concept of autonomy: that I have an ownership right in the data that pertains to me.”

Individual sovereignty vs. knowledge creation

“If you’re putting incredible weight on individual sovereignty based on the perception of data extraction (the term used by the FTC),” says Abrams, “you have to shift the concept of what is trustworthy processing to make the balance between sovereignty and data innovation work, and that leads to a much more restrictive use of data.”

But, as Abrams notes, “If you’re a person in the EC who’s responsible for digital innovation, in the Canadian Government trying to push artificial intelligence as the new economic growth factor, or you’re in Hong Kong trying to create an innovative space, you say that knowledge creation and data innovation is much more important. And the ‘trustworthy path forward’ begins to shift.”

 

“If you’re thinking about what it means to do what Mastercard is doing, you have to think about it step by step. And the first step is every organization has to engage in responsible knowledge creation. There is no way an organization can be successful today if they’re not using data to improve their products to serve consumers better or create societal value like new vaccines,” suggests Abrams.

“Even if you are the best at innovating in your own industry, the only way we will come up with the best innovations is through data partnerships and through methodologies where you can do trusted data sharing,” insists Stonier. And, “let’s face it, data ecosystems are here to stay.”

Responsible data sharing and data practices to create the kind of knowledge that we’re discussing requires responsible practices: it does not mean sharing raw data, and it does not mean sharing your sensitive data. It means coming up with practices that are responsible, accountable, transparent, and that have a lot of integrity in methodology. There are all sorts of methods that should be explored that do meet the spirit of the regulation.

—JoAnn Stonier, Mastercard

Responsible, trusted, and answerable

Trust is earned in drops and lost in buckets.

—JoAnn Stonier, Mastercard

“To be trusted, organizations have to be responsible and answerable. This is explicitly part of almost every law that’s passed since the GDPR,” says Abrams. “Organizations need to understand what they’re doing and the impact on all stakeholders affected by the use of data, including research.”

To be answerable, organizations have to be proactive in demonstrating their accountability and stand ready to do so as Mastercard’s principles articulate. Frameworks based on risk assessment and effective data governance will enable beneficial data-driven innovation while protecting individuals and society from potential harms.

—Martin Abrams, IAF

“If we’re asking organizations to be innovative with data and we’re asking the public to trust that innovation,” opines Abrams, “we have to have enforcement agencies that go beyond enforcement to be effective oversight agencies. This requires a different set of skills: the ability to determine that when organizations say they have integrity in their processes and the competency to execute on them, in fact, they do.”

From Mastercard’s perspective, “we really need regulators to understand what good practice looks like and how we can actually demonstrate the steps responsible organizations are going through, and how we can even that playing field, so they are similar regardless of context,” says Stonier.

“Only then can we talk about fairness in comparing different organizations trying to solve the same thing. Then you can get to ‘is this fair,’ ‘is this ethical,’ within the same context.”

“I think we have a ways to go. Regulators need to understand that. That’s why I believe in these principles, these practices.”


Implementing Privacy Assurance

Establishing a Second Line of Defense to Sustain Compliance

Lindsay Hohler, Principal, Privacy and Data Protection at Grant Thornton LLP, and her colleague Eric Paulson, a Manager in the Privacy and Data Protection group, presented the session Privacy Assurance: Establishing a Second Line of Defense to Sustain Compliance at the December 2021 Spokes Technology Privacy Conference.

There has been much discussion about the need for “frameworks” when operationalizing privacy. The session, sponsored by Grant Thornton LLP, presents a detailed look at how organizations may effectively and practicably establish frameworks that support scalable and sustainable privacy compliance and ultimately effectuate privacy by design.

What is the role of privacy assurance?

To sustain compliance and manage risks, organizations can look to a “three lines of defense” model: 1) privacy operations; 2) privacy assurance; 3) internal/external audit. Here we focus on that second line of defense, privacy assurance.

The role of privacy assurance is to monitor regulations and privacy risks and to establish policy guidelines that help the first line – the operations team (the business, privacy, IT, marketing, et al.) – follow those guidelines. It also provides the third line of defense – internal or external audit – with the framework necessary to independently assess privacy compliance and risks.

Privacy assurance is the 2nd line of defense after privacy operations and before internal/external audits.

For example, a privacy assurance function would be needed to inform the business that it has to respond to EU data subject access requests (DSARs) within the one-month time period prescribed by regulation. The business would then be responsible for designing that process in order to respond in a timely manner. Internal audit would then run an independent audit, making sure that the business was responding appropriately. Another example might be managing the requirement to offer consumers a mechanism to opt out of the sale of personal information.
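To make the DSAR example concrete, here is a minimal sketch of the kind of deadline check a second-line monitoring function might run. This is our illustration, not part of the session; the request IDs and dates are hypothetical.

```python
from datetime import date

def one_month_later(d: date) -> date:
    """Roughly one calendar month later, clamping to the end of shorter months."""
    year, month = (d.year + 1, 1) if d.month == 12 else (d.year, d.month + 1)
    for day in (d.day, 30, 29, 28):  # e.g., Jan 31 -> Feb 28
        try:
            return date(year, month, day)
        except ValueError:
            continue

# Hypothetical open requests: (request id, date received)
open_dsars = [("dsar-101", date(2022, 1, 5)), ("dsar-102", date(2022, 1, 31))]

today = date(2022, 2, 14)
for request_id, received in open_dsars:
    due = one_month_later(received)
    status = "OVERDUE" if today > due else f"due {due.isoformat()}"
    print(f"{request_id}: received {received.isoformat()}, {status}")
```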

As we look to the second line of defense, you can readily see that collaboration is extremely important, as it helps establish roles and responsibilities through a privacy framework and the accountability for each process.

As an additional benefit, the privacy assurance component functions as a training and communication platform that really builds and promotes privacy by design throughout privacy operations.

—Eric Paulson

The pillars of privacy assurance

Crucial to the establishment of the control framework are well-defined roles and responsibilities; a compliance monitoring function; and established key risk indicators to monitor and measure those privacy risks and obligations.

The three pillars of privacy assurance are a privacy control framework, compliance monitoring, and established key risk indicators (KRIs).

While a privacy control framework can leverage industry frameworks such as NIST or ISO, we’ve also seen organizations create their own custom frameworks. There is a lot of flexibility, and irrespective of approach, a framework will help an organization not only understand its compliance requirements but also rationalize those requirements by defining objectives and control matrices that align with the organization’s obligations and risk appetite.

Importantly, providing a control framework while allowing the business owners to define the processes that work best for them enables business units to take responsibility and understand how to meet privacy compliance objectives and be accountable to the privacy controls.

—Eric Paulson

The second pillar, compliance monitoring, concerns the steps a company needs to take to comply with privacy requirements and the continuous evaluation of its controls to ensure that control objectives are being met. This can be done through compliance self-assessments. Continuous evaluation is particularly important given that the regulatory landscape is constantly changing, with emerging regulations and shifting interpretations.

Key risk indicators (KRIs) are the metrics that provide the privacy team the opportunity to continually monitor compliance, assess risk, and evaluate opportunities for improvement.

Example KRIs we typically see include:

  • number of data subject requests;
  • number of days outstanding;
  • data inventory validation; and
  • the number of systems in the data inventory.

KRIs help the assurance teams verify that privacy activities are being completed in a timely manner and surface trends (e.g., DSARs trending up). This supports assessing root causes and implementing improvements before there are significant negative impacts.
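As an illustration of the “DSARs trending up” example, a minimal trend check might compare the latest month against the average of prior months. The sketch below is our own simplification with made-up numbers, not Grant Thornton’s methodology.

```python
def trending_up(monthly_counts: list[int], factor: float = 1.25) -> bool:
    """Flag when the latest month exceeds the prior-month average by `factor`."""
    *history, latest = monthly_counts
    baseline = sum(history) / len(history)
    return latest > baseline * factor

# Hypothetical monthly DSAR volumes, oldest first
dsars_per_month = [40, 44, 41, 47, 62]
if trending_up(dsars_per_month):
    print("DSAR volume is trending up - assess root cause and staffing.")
```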

Establishing KRIs is an effective way to communicate program needs to leadership, such as budget approvals, additional resources, or technology investments.

—Eric Paulson

The steps to establishing a privacy framework

Establishing a framework is the foundation of a privacy assurance program. It provides a baseline to manage and mitigate risks and helps to make sure that individuals throughout the organization understand their roles and responsibilities in doing so. The key question, then, is: how do you determine the right framework?

Establishing a privacy framework: determine approach, map controls, ID roles/responsibilities, ID risks, & mitigate risks.

Step 1: Determine your approach

As noted above, we’ve seen companies leverage one of the leading frameworks and customize it to fit their business. Whichever framework you choose, it is important to define the domains across the privacy area, as well as subdomains and control activities.

You also want to strike the right balance. The framework should be manageable but also detailed enough to map to the underlying regulatory requirements and demonstrate accountability.

—Lindsay Hohler

Step 2: Mapping controls

Once the framework and baseline have been determined, the work of mapping the controls begins. It is here where a lot of the customization happens to achieve alignment with, for example, regulatory requirements, controls language, testing procedures, data types and data subjects, and the definition of risks and the opportunities to mitigate them.

Importantly, by using a one-to-many approach you can have a simplified set of activities to manage. The goal is to have a clearly defined set of controls that can address all of the organization’s privacy obligations.
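In practice, the one-to-many mapping can be as simple as a table from each control to the requirements it satisfies. A minimal sketch follows; the control names are hypothetical, not Grant Thornton’s control language (GDPR Article 17 and CCPA Section 1798.105 are the respective deletion provisions, while the VCDPA labels are deliberately generic).

```python
# One-to-many: each internal control satisfies several regulatory requirements.
control_map = {
    "CTRL-01: Honor deletion requests": [
        "GDPR Art. 17",          # right to erasure
        "CCPA 1798.105",         # right to delete
        "VCDPA deletion right",
    ],
    "CTRL-02: Offer opt-out of sale": [
        "CCPA 1798.120",         # right to opt out of sale
        "VCDPA opt-out right",
    ],
}

# Invert the map to answer audit questions such as:
# "Which control demonstrates compliance with this requirement?"
coverage: dict[str, list[str]] = {}
for control, requirements in control_map.items():
    for requirement in requirements:
        coverage.setdefault(requirement, []).append(control)

print(coverage["CCPA 1798.120"])  # ['CTRL-02: Offer opt-out of sale']
```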

Step 3: Identifying Roles & Responsibilities

Once all the controls have been identified and mapped, the next step is to map ownership. Some of the controls may sit with the privacy team, but many of the activities will actually be performed by the business.

Identifying roles and responsibilities helps promote privacy by design and ensures that individuals understand their responsibilities as they relate to privacy. Here, it is important to highlight the criticality of strong executive – and middle management – sponsorship.

—Lindsay Hohler

This, as many know, is one of those activities that is much easier said than done; it takes quite a bit of work to actually embed these activities within the underlying organization.

Step 4: Identify risks

The next step is to identify what is in place today and to work towards mitigating risk going forward. The high-risk functions need to be identified and communicated to the business.

Also, risk levels should be assigned to each underlying control to feed reporting and escalations and to determine the remediation efforts that are needed. This also provides insights into privacy operations to help understand issues that may be impacting risks. This should map to regulator expectations as well.

Properly documented controls and roles and responsibilities will allow you to shift priorities very quickly.

Step 5: Supporting risk mitigation

The last step in developing a robust controls framework is to support risk mitigation – a key outcome. This involves monitoring control implementation, performing ongoing assessments and reviews, as well as using KRIs to help monitor and measure privacy operations.

Having defined the controls and identified the control owners, the privacy team is better prepared to collaborate with, and support, the business in risk mitigation activities.

Once a risk is identified, the control owner can speak to the cause, and from there the privacy team can work to understand that risk, develop a plan, and help leadership understand the issues and needed remediations (or determine that the business is comfortable accepting the risk).

Ultimately, a privacy compliance framework will help build a stronger understanding of the organization’s overall privacy obligations and define the accountability needed to operationalize these activities.

—Lindsay Hohler

Privacy control frameworks in practice: a case study

The Challenge: A financial company had many privacy operations owned and managed outside the privacy team by business units across the organization. Adding to the complexity, multiple regulations were industry-specific and geo-specific.

The Goal: Assess compliance across a wide range of data privacy regulations (15+) and identify and remediate gaps in compliance across services, businesses, and operations.

One of the challenges Grant Thornton faced was identifying control owners. Early on it became clear that individuals identified as control owners were not operationally responsible. This required identifying both who was performing these activities and who owned those controls.

With control activities and owners identified, it was possible to implement a recurring process so that the control owners could regularly validate the activities, note any changes to the process, and leverage the framework to support audit committee and board reporting.

Identify the framework to leverage, the domains/subdomains to align with requirements, and map controls to each requirement.

This resulted in improved privacy operations with an enhanced ability to monitor compliance:

  • the ability to guide control owners where risk is identified;
  • greater visibility to the DPO through KRI evaluation;
  • the ability to define short-, mid-, and long-term goals to improve compliance;
  • the ability to identify high-risk data processing and need for data inventory revalidation;
  • the ability to more quickly map new regulations and incorporate those into privacy operations and controls; and
  • the ability to establish guidance at the business unit level.

Privacy assurance outcomes: reinforced roles, compliance monitoring, strong privacy operations, and reduced legal/compliance risk.

Given the complexity of managing privacy for regulatory compliance, let alone attaining privacy-by-design, it is not a matter of dispute that robust operational capabilities are required. This is as true for smaller brands and publishers as it is for the largest.

Effective operations are simply not feasible without establishment of privacy principles, carefully considered frameworks to implement them, and investment in the technologies and processes to support them. Privacy cannot simply be bolted on.

To learn more about operationalizing privacy from some of the world’s leading practitioners, we suggest:

  • Privacy Law Update

Privacy Law Update: January 18, 2022

Stay up to date with this weekly release covering key developments on data privacy laws, technology, and other hot privacy topics!


Newsworthy Updates

Privacy Impact Assessments – Practical Considerations

This is the first of a multi-article series focused on privacy impact assessments. It provides an overview of privacy impact assessments, the existing and pending privacy laws that require them, and how they are used in practice from a proactive perspective. The second article will focus on data protection impact assessments pursuant to Article 35 of the European Union’s General Data Protection Regulation (GDPR). The third will focus on similar assessments required under U.S. state laws set to go live in 2023, including the California Privacy Rights Act (CPRA), Virginia’s Consumer Data Protection Act (VCDPA), and the Colorado Privacy Act (CPA). The fourth and final article will provide best practices on building a global privacy impact assessment process.

CNIL’s ePrivacy Fines Reveal Potential Enforcement Trend

The new year for EU data protection enforcement has rung in with an early bang courtesy of France’s data protection authority, the Commission nationale de l’informatique et des libertés. The CNIL fined Google and Facebook up to a combined 210 million euros for alleged cookie violations under the ePrivacy Directive. Allegations against the companies focus on French users’ inability to easily decline tracking via cookies. Google’s U.S. and Irish operations received penalties of up to 90 and 60 million euros, respectively, while Facebook Ireland will pay up to 60 million euros. Additional daily penalties of 100,000 euros are possible if users are not given sufficient means to opt out of tracking within three months.

Israel Privacy Protection Bill Includes Steep Sanctions – and a DPO

On January 6, 2022, the Israeli government released a long-anticipated bill amending and updating Israel’s 1981 Privacy Protection Act (PPA) (the Bill). If passed, the Bill would constitute the most comprehensive update of the PPA in more than two decades. Primarily, the Bill greatly enhances the enforcement and investigation powers of the privacy regulator, the Israel Privacy Protection Authority (IPPA). While relaxing certain bureaucratic burdens on Israeli companies, most notably the dated obligation to register databases, it would tighten substantive obligations and impose steep sanctions for violations, including severe criminal penalties. For the first time under Israeli law, it would require certain companies to appoint a data protection officer (DPO).

How to Read Your iOS 15 App Privacy Report

It’s broadly safe to download a mainstream app from the iOS App Store or Android’s Google Play. But thanks to increasingly invasive tracking by Facebook and others, Apple and Google have both recently introduced transparency features into iOS and Android that give you more insight into how often apps access data and sensors, from your camera and microphone to your location and contacts. If you’re an iOS user, the App Privacy Report tool likely hit your phone a few weeks ago. Here’s how to get the most out of it.

India’s Draft Data Protection Bill Moves Closer to Passage

Stephen Mathias from Kochhar & Co. reports that on December 16, 2021, the Indian Joint Parliamentary Committee (the “JPC”) submitted its report on India’s draft Data Protection Bill (the “Bill”). The Bill is now likely to be passed by Parliament in its next session, beginning in February 2022, and likely will enter into force in the first half of 2022. In its report, the JPC recommended a phased approach to implementing the law, beginning with the appointment of various government officers, such as the Data Protection Authority (“DPA”), with full implementation of the law to be completed within 24 months. The JPC’s report also contained a revised draft of the Bill. Certain key aspects of the revised Bill are summarized below.

What to Anticipate for EU Digital Policy in 2022

Digitalization has been a top priority for the European Commission, including an extensive legislative package, which ranges from platform regulation to artificial intelligence, from data sharing to cybersecurity. With the commission’s mandate set to expire in spring 2024, this year is its last chance to present proposals that could complete the legislative process in time. Journalist Luca Bertuzzi looks at what 2022 has in store for digital matters in the EU.

Privacy Legislation

Alaska: The Consumer Data Privacy Act (HB 159 / SB 116), a CCPA-style bill, will roll over into this session. The Act was introduced at the request of Governor Dunleavy. In December 2021 the House Labor and Commerce Committee presented changes to the bill, summarized here.

Furthermore, the “Personal Information Protection Act” (HB 222) was prefiled on January 7 by Representative Rauscher (R). The Act contains CPRA-style “do not sell” and “limit the use of sensitive personal information” rights. The Act creates a private cause of action limited to data breaches and delegates rulemaking authority to the Department of Commerce, Community, and Economic Development.

District of Columbia: Council Chairman Mendelson introduced B24-0451. The bill was introduced at the request of the Uniform Law Commission (ULC) and is based on the Uniform Personal Data Protection Act drafted by the ULC.

Florida: Senator Jennifer Bradley filed the Florida Privacy Protection Act (SB 1864) on January 7, 2022. Senator Bradley sponsored SB 1734 last year. It is expected that Representative McFarland will introduce a bill in the Florida House in the coming days.

Rep. McFarland (R) introduced HB 969 on January 11, which provides a consumer right to opt out of the sale or sharing of data to third parties and a private right of action limited to (1) failures to delete/correct, (2) continuing to sell or share data following an opt-out, and (3) selling or sharing the information of people under 17 without consent. As a reminder, Rep. McFarland’s 2021 “Consumer Privacy Act” (HB 969) included a broad private right of action and passed the state house by a 118-1 vote.

Indiana: HB 1261 was introduced on January 10 by Rep. Carey Hamilton (D). The bill creates CPRA-style rights to opt out of the sale or sharing of personal information and to restrict the use of sensitive personal information. Enforcement authority is allocated to the Indiana Attorney General’s Consumer Protection Division.

Maryland: The “Maryland Online Consumer Protection and Child Safety Act” (SB 11) was pre-filed on Oct. 15, 2021 by Senator Lee (D). The bill creates a right to opt out of the third-party disclosure of personal information and provides for broad AG rulemaking.

New York: A platoon of comprehensive data privacy bills originally introduced in 2021 rolled over on January 5. These include the “Online Consumer Protection Act” (A 405); “Consumer Control of Personal Information” (S 557); “New York Data Protection Act” (S 1570); “Digital Fairness Act” (A 6042); and, perhaps most prominently, Senator Thomas’ “New York Privacy Act” (S 6701), which would provide for a “duty of loyalty.”

Ohio: The “Ohio Personal Privacy Act” (HB 376) supported by Lt. Governor Jon Husted will carry over. The bill resembles elements of the CCPA and includes a safe harbor against AG enforcement for adherence to the NIST privacy framework. At a House Government Oversight Committee Hearing on December 9th, sponsors suggested that numerous changes to the bill are under consideration; however, an amended version has not yet been formally released. News reports suggest that sponsors have put the bill on hold, but intend to push it again this year.

Oklahoma: The “Oklahoma Computer Data Privacy Act” (HB 2969) was prefiled by Reps. Walke (D) and West (R) on Sept. 9, 2021. The Act provides that a business shall collect or share data only if “reasonably necessary to provide a good or service to a consumer who has requested the same or is reasonably necessary for security purposes or fraud detection.” In 2021, an earlier version of the Act passed the Oklahoma State House by a vote of 85-11.

Vermont: H 570, a placeholder bill (no substantive text) relating to enhancing data privacy protections for consumers, was submitted on January 10 by Reps. Marcotte (R) and Kimbell (D).

Virginia: Virginia legislators have started to propose amendments to the “Virginia Consumer Data Protection Act” scheduled to take effect in 2023. HB 381 introduced on Jan 11 by Del. Davis (R) would allow controllers that collect consumer data indirectly to treat deletion requests as opt out requests. HB 552 introduced on Jan 11 by Del. O’Quinn (R) would add 501(c)(4) organizations to the nonprofit exemption. HB 714 introduced on Jan 11 by original VCDPA sponsor Del. Hayes would add “political organizations” to the nonprofit exemption; allow an opportunity to cure only where “deemed possible” by the AG; permit the AG to recover “actual damages” sustained by consumers; and replace the “Consumer Privacy Fund” with the existing “Revolving Trust Fund.”

Washington: Representatives Vandana Slatter and April Berg pre-filed the Washington Foundational Data Privacy Act (HB 1850) on January 7, 2022. The bill is similar to the Colorado and Virginia laws, but it contains an annual registration requirement, would create the Washington State Consumer Data Privacy Commission (similar to the California Privacy Protection Agency), and contains a private right of action.

It remains to be seen whether Senator Carlyle will amend his Washington Privacy Act (SB 5062) when the legislature opens on January 10, 2022.

On January 10, Sen. Carlyle introduced a limited-in-scope privacy bill, SB 5813, which would impose broad privacy obligations for children and adolescent data, regulate data brokers, and provide for the recognition of a “Do Not Track” mechanism. A first hearing is scheduled for January 20th. Sen. Carlyle’s “Washington Privacy Act” has also been carried over.

Future proof your privacy program with WireWheel’s Trust Access and Consent Center and WireWheel’s Privacy Operations Manager.

Request Demo
  • Privacy

Redefining the Marketer’s Relationship with Data in the Privacy Age

During the December 2021 Spokes Technology Privacy Conference, which brings together leading evangelists, practitioners, technologists, legal experts, and policymakers, a long-simmering (and impossible to ignore) question was asked: What now is marketing’s relationship with data?

Digital ad spend in the U.S. is $153B ($455B worldwide), and U.S. spend alone is projected to reach $460B by 2024. Todd Ruback, Director of U.S. Privacy at Promontory (an IBM Company), notes that more than 20M people work in digital marketing worldwide. For the adtech/martech ecosystem, this is no small question.

Ruback, Anne Cheung, the Senior Global Privacy Counsel at Interpublic Group (IPG), and WireWheel Senior Product Manager Jeff Wheeler discussed redefining the marketer’s relationship with data in the privacy age. The session was moderated by WireWheel CMO, Camille Landau.

We really need to have a privacy reset

We’re at a point where our former view, which was a strict compliance lens to privacy, is going to have to change. It’s not just what the law says and what we have to do.

It’s a combination of that with what should we do.

—Anne Cheung, IPG

Noting a 20-year career in privacy, Ruback says he has never before “seen the coming together of so many elements creating a perfect storm.”

“From a privacy professional’s perspective this has always been important, but now, within the context of digital marketing and adtech at the intersection of privacy, we are seeing an absolute flurry of regulatory activity; enforcement actions; guidance sending industry signals; class-action suits; and a whole body of emerging data protection laws. None of which are prescriptive, but all of which say, if you are going to engage in online tracking for the purposes of digital advertising, then you just need to do it right…it needs to be thought through.”

Ultimately, says Wheeler, despite the many failed attempts at state privacy laws we have seen, “I think it shows that we’ve moved, especially in the adtech industry, away from self-regulation.”

Given this, what should digital marketers be thinking about? “Do I apply different rules in California and Colorado and Virginia versus New York, or do I apply a principles-based [approach]: do what the law requires but also have [a set of principles] that drive the way I operate?” asks Cheung.

As these laws aren’t prescriptive and give rise “to very different interpretations of the same exact provision of the law…” she suggests focusing on the fundamentals. “Those principles that guide how you operate. We’re at a point where we really need to sort of have a privacy reset.”

Top-of-mind concerns for digital marketers

Cheung notes that some IPG clients are “very privacy-focused, have a first-party relationship with their customers, and want to preserve trust. There are also organizations that, whether they’re regulated or non-regulated entities, are moving away from just a privacy compliance approach.

“The types of questions they have are ‘what are your agencies doing to get ready for what’s going to come in 2023? How do I reconcile that?’”

Ruback notes that the rising number of privacy laws, crackdowns, and policy shifts by big platforms – like Google’s Privacy Sandbox initiative and Apple’s move to an opt-in approach – hits directly at the way the adtech/martech ecosystem is architected and monetizes.

CPOs and GCs are coming to us all the time and asking what are our regulatory compliance obligations so we can comply….And by the way, we don’t want to do anything to endanger the golden goose. Our company derives a lot of money from digital advertising: specifically behavioral advertising. ‘So how do we do it the right way and not only preserve that revenue but grow the revenue?’

—Todd Ruback, Promontory

“In Europe, they’re looking beyond cookie banners or what’s the bare minimum,” says Wheeler. They’re focused on  “the better experiences they can achieve. It’s blatantly obvious to most that the end consumer really dislikes the cookie banner. It’s a bad experience and UX designers hate it.”

“So, especially in Europe, we get a lot of questions about the better choices and decisions that can be made with those types of experiences….To show my user that I’m trying to be more thoughtful.”

Citing the need to shift away from the opaque adtech architecture towards good privacy governance to both preserve and grow the revenue from digital advertising, Ruback says “It all starts with rethinking data strategy.”

“Part of that is looking at the digital supply chain to get clarity as to what’s going on in this opaque ecosystem. How do we know what third parties are on our websites and what they’re collecting? Why are they using it? Maybe they’re authorized, but a vast number of them are unknown to the website operator: How do we control that?”

“This shift is taking place within organizations on a cross-functional basis with CMOs, tech, tech OPS, privacy, legal, and a few other functions. And what we will be seeing is the emergence of best governance practices in this space.”

Positioning for the future

“So many are trying to apply privacy requirements to the existing infrastructure. Understanding what that infrastructure is doing and identifying potential partners that can help elevate the way you govern [is essential],” highlights Cheung. “Making sure that their systems can do that is going to be very important for organizations. And that requires funding and time.”

The business side has to do that. The marketing operations, engineering, and IT teams have to be involved in those discussions. There needs to be a sponsor on the business side that’s championing change within organizations. That’s going to drive the processes and operational procedures that are necessary to carry out the compliance: and not just what I have to do, but what else should I do.

You have to have some sort of framework around how you’re going to apply this.

—Anne Cheung, IPG

The critical starting point to all this, says Cheung, is data inventorying: “How did the data come into your organization? How do you use it? Do you use it? When will it be deleted? Who is it shared with?”

Importantly, “It’s not the legal team that needs to know those answers. It’s our marketers that need to know” and be able to  “articulate to consumers how that information is used.”
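Cheung’s questions map naturally onto a data inventory record. A minimal sketch follows, with invented field names; a real inventory would live in dedicated tooling, but the shape of the record is the point.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class InventoryRecord:
    data_element: str              # e.g., "email_address"
    source: str                    # how the data came into the organization
    purposes: List[str]            # how it is used
    retention_days: Optional[int]  # when it will be deleted (None = undefined)
    shared_with: List[str]         # third parties receiving it

record = InventoryRecord(
    data_element="email_address",
    source="newsletter signup form",
    purposes=["email marketing"],
    retention_days=365,
    shared_with=["email service provider"],
)

# Flags a marketer should be able to answer for, per Cheung's questions.
if not record.purposes:
    print(f"{record.data_element}: collected but unused -- why keep it?")
if record.retention_days is None:
    print(f"{record.data_element}: no retention period defined")
```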

What questions should organizations ask themselves?

“It goes back to this idea of digital governance and knowing who you’re working with,” says Wheeler. More often than not, we find that organizations do not know everyone that’s on their site.

When you take enterprise customers that have websites that are being touched by multiple organizations and various groups within their own company – adding and removing tags and scripts – more often than not, something gets forgotten, added, or piggybacked.

And this creates a lot of uncertainty. For example, the CCPA introduced “do not sell.” Do you also know that you’re sharing this data with these third parties, or with others within the same company? Many didn’t.

—Jeff Wheeler

“Plan for what you don’t know,” offers Wheeler.
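One practical way to plan for what you don’t know is to periodically enumerate the hosts a page loads scripts from and diff them against an approved list. A rough sketch using only the Python standard library (the URL and allow-list are placeholders; a production check would also crawl rendered pages, since tags are often injected at runtime and can piggyback on one another):

```python
import re
from urllib.parse import urlparse
from urllib.request import urlopen

APPROVED_HOSTS = {"www.example.com", "cdn.example.com"}  # hypothetical allow-list

def script_hosts(page_url):
    """Return the set of hosts referenced by <script src=...> tags in raw HTML."""
    html = urlopen(page_url).read().decode("utf-8", errors="replace")
    srcs = re.findall(r'<script[^>]+src=["\']([^"\']+)', html, flags=re.I)
    return {urlparse(src).netloc for src in srcs if urlparse(src).netloc}

unreviewed = script_hosts("https://www.example.com") - APPROVED_HOSTS
for host in sorted(unreviewed):
    print("Unreviewed third party loading on the page:", host)
```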

Ruback lays out some critical requirements:

  1. The ability to articulate [the requirements from different laws] and translate that into what you need from a budget and resource perspective in order to operationalize compliance;
  2. A shift towards more rigorous data governance;
  3. Understanding that data is also risk and the consequent requirement to deploy controls; and
  4. Recognizing that consumers, regardless of the data protection laws, are being empowered with control over their data.

“Technology is a huge piece of this,” says Ruback, “whether it’s through WireWheel’s platform or other providers. But we’re also seeing marketing and other groups starting to look at other data types that they can use to preserve and grow their digital advertising revenue, such as zero-party data.

“But again, it starts with a recognition that we have a regulatory obligation to empower the individual with control over their data [and asking] how do we do that?”

How close are we to achieving all this?

What began as a reaction to regulatory edict has for many become a proactive approach to privacy. Mere compliance is being replaced by efforts to introduce privacy principles, frameworks, engineering, operations, and governance. How close are we to that desired state?

“I’m very bullish on the industry,” opines Ruback. “The industry has always shown that it is resilient and innovative. It’s too powerful an economic engine. And it’s too important to not get right.”

What I see peering over the horizon is an evolution where companies will evolve their data strategy and optimize the data that they collect to optimize their digital advertising spend while at the same time reaching out to their customers and saying we want to do it the right way….You’re the boss. You exercise control.

—Todd Ruback, Promontory

So how close are we?

“It really depends on what the organization puts into this,” offers Cheung. It requires “investment in technology that can scale to make it happen.”

“The question is, are the technology solutions coming out going to actually address that? Are we going to be able to push that preference through all the downstream partners that data currently travels to? How can we, in an easy way, talk about this ecosystem without scaring consumers, be fair to them, and talk to them about how this actually works?

I think it’s a long road ahead, but I also think that the conversation is robust at the moment. Some organizations may not [technologically] be ready for that…[but] should look at how to be incrementally better and then continue to progress on that path.

There needs to be work that goes into this and the digital advertising ecosystem is way too fast to be able to say that, yes, this technology, this one solution, is going to solve for that.

I don’t think that’s a reality we’re at yet.

—Anne Cheung, IPG

Learn from marketing and privacy experts how changing privacy laws in CA, VA and CO are causing marketers to rethink how they get permission to use personal data and how they manage it.

  • Privacy

What Behavioral Economics Can Teach Privacy Officers

A common refrain heard at WireWheel from its customers is the challenge privacy officers have socializing privacy risk and embedding it into organizational processes. Privacy risks, while opaque and obscure to many people, are quite tangible to an organization handling consumer data in terms of regulatory compliance, information security, product design, and consumer trust.

Behavioral economics has studied how people assess risks, and why they sometimes simply ignore them. Perhaps privacy officers can look to behavioral economics to uncover ways to more effectively nudge people in the organization to take the steps necessary to avoid potential privacy threats and gain that often-elusive buy-in.

WireWheel’s VP, Product Marketing, Judy Gordon met with behavioral sciences expert Michael Hallsworth, Ph.D. to elicit his insights at the Fall Spokes Privacy Technology Conference. Hallsworth is Managing Director, Americas, at The Behavioural Insights Team. He previously held positions at Columbia University and Imperial College London and is the author of Behavioral Insights (MIT Press, 2020). You can view their conversation “What Behavioral Economics Can Teach Privacy Officers” here.[1]

Mental shortcuts

The core insight here is that our behavior tends to be more unthinking, responding to our immediate context. We tend to assume that what we’re doing to make decisions is weighing costs and benefits, paying attention to all the available information, and then making a considered decision on the basis of that information.

While people will weigh cost and benefit, Hallsworth notes that this is effortful. It takes time and we often simply don’t like doing it. And importantly, in many circumstances, that process is simply too slow to be useful, so we deploy mental shortcuts which “operate quickly, intuitively, and without our awareness.”

One example: Do what other people are doing. “And this mental shortcut [or heuristic] in response to our environment is necessary to navigate the world. We couldn’t be weighing up all the decisions we make all the time. So, we have habits. We have a usual way of dealing with situations.” But sometimes those mental shortcuts can also create problems for us.

Ultimately, behavioral science attempts to “understand how we really make decisions in context and try to design products, policies, or services to take that into account.”

What information gets through our filters?

“One thing we know is really important is this idea of salience,” says Hallsworth. “What information do people pay attention to? And the way we allocate our attention follows some predictable paths.”

We need to understand how we filter information. We’re bombarded by information all the time in our day-to-day lives and certain things get through those barriers because they attract our attention.

And those things are generally quite salient. They’re easy to understand and they create a vivid picture in our minds.

Consider, for example, how much television news time is allocated to different kinds of natural disasters.

What was found is that “things like famines and droughts, which are based on things not happening, require thousands of times more deaths to get the same attention as one death from a volcano,” which is spectacular and draws attention. It gets through the kinds of filters we put up.

“The same thing happens with risk perceptions.” Think here of our response to – the salience of – the idea of plane crashes versus road travel, which is far riskier. And “salience can also be about how we see what others are doing.”

The salience of risk and information aversion

We talk about social norms quite a lot as we often infer what the right thing to do is from others. Consequently, one very obvious thing to look at is what other people are doing. And if no one else has any kind of risk mitigations that are visible, well, we should be okay.

Although we know in theory there’s a risk, if no one else is doing anything I’m probably okay.

“Information aversion” is another factor. Even though we understand that there is a risk, it makes us uncomfortable, so we “set it aside.” While “one way of resolving that discomfort is to actually take mitigation procedures, an easier way might be to explain it away. To rationalize it and avoid information that brings it to mind.”

“What’s the point just getting worried about it? You know, it’s safer, easier, and more pleasant, not to know, really.” These are just a couple of the ways people think about risk without taking the mitigation steps that lead to practical good outcomes.

Privacy risk and the challenge of salience

But companies need to be practical. “Information aversion” will likely not fly as a rationale in response to regulators in the event of compliance breaches. And Gordon pointedly asks, “what it is that most leaders and organizations may be doing wrong when they try to get people or their companies to protect themselves against privacy risk?”

“It’s a really difficult challenge,” acknowledges Hallsworth.

Part of the problem is one of salience. Privacy online, or privacy issues in general, can be quite abstract [and] theoretical rather than having an emotional reaction that may lead to action.

The really difficult thing here is trying to get a balance. To get these kinds of ideas salient to prompt action – to make it real for people – without creating an unproductive kind of anxiety. Because what can happen is the effort goes into managing the anxiety – the information aversion – and fear becomes the result.

“Instead, what you really need to do is to create a need for action, but connected to practical, specific actions,” to manage the potential anxiety. “We’re not that good at connecting the general idea to the specific actions that people can take. Identifying those actions, putting them in the right order, and then communicating them at the right time in ways that people can work on.”

One underlying problem here might be, if you’re say a Chief Privacy Officer, then it is of course very real to you. You’re deep in the details. And we know from behavioral science that if you’re really familiar with something you tend to think that others are more familiar with it and see it as being as important as you do: the ‘illusion of similarity.’

Organizational privacy as heuristic

“Behavior is strongly shaped by environment and immediate context. This is the key insight from a lot of the behavioral science literature. This means that even if you know the right thing to do…it’s that practical constraint of what it’s like to work in an organization that will actually be what determines behavior more powerfully than beliefs or intentions.”

So, what you really have to do, then, is think about how to build the privacy actions into institutional processes.

The way that you can ensure a behavior endures is to make it not effortful. You want it to become habitual, and the way to make it habitual is to build it into institutional processes.

Privacy professionals seeking to systematize privacy would benefit from leveraging heuristics: “thinking about how to help people create their own mental shortcuts.” Perhaps, offers Hallsworth, rather than creating big lists of complex things people have to do that compete with their daily challenges, look at establishing a few general rules of action. This may be more effective than a complex list of things “that people have to bear in mind and consciously remember.”

Why consumer actions are at odds with their privacy values

While acknowledging “the jury’s still out,” Hallsworth offers that “intentions and views don’t always play out in practice because of the contextual factors in the moment.”

Again, the risk becomes abstract, and the immediate gain is in front of you. It’s just easier to go ahead. The mental barriers we have about why we shouldn’t do that? We are pretty good at finding ways of explaining those away in the moment if we want to. We also may be overly optimistic about what the consequences may be. These are the kind of ways we rationalize.

“The question then becomes, how much of a prompt would you need in the moment to [alter that behavior] and make people step away? And we know it seems that putting up prompts doesn’t really make much of a difference because people just ignore them. So, the real question? What would make a difference in that moment, because in that moment, the concerns are abstract, and the gain is quite concrete.”

Is the Data Subject Access Request a dark pattern?!

Gordon asks a very interesting and important question. “Is this idea that I have to go and make a request about my information counter to what a behavioral scientist would say is the best way to manage your privacy?”

If you were to design a process for people not to do, it would be something like that.

It’s like when companies don’t want you to cancel a subscription and you have to go through numerous [often difficult] steps.

“Also, I think the gain from [exercising your DSAR rights] is quite unclear, because you don’t actually know what the data is. If you knew that there was something important or meaningful in that data, you might be more motivated to find it out, but if it’s only the possibility of something important, that again takes away motivation. People are unlikely to go through many steps for unclear benefit.”

And nudging?

“A nudge is the idea that people have a set of choices, and the choices are presented in such a way that, given the mental shortcuts people use, it is more likely the choice is made for the option that benefits [the person making it].”

As an example, Hallsworth points to Apple’s “nudge”: the change in privacy default settings from sharing data by default to requiring the user to affirm – make an effort to share – their data. Something he says (and the adtech world knows) has a very big impact on behavior.

One of the cool things about nudges, although you set it up so that people are more likely to choose option ‘A’ as opposed to option ‘B’ or ‘C,’ is they still can choose. They still have freedom of choice, if they want to override your nudge to take option ‘A.’

And that’s why it’s different from regulation, where we have to do some things and preempt that freedom of choice.

“This is one subset of how you can apply behavioral science, but it’s a powerful way of doing it, and that’s why it has gotten a lot of attention.

“This is why you need to think carefully about regulation. It’s not just requiring people [and organizations] to do certain things, it’s how. I don’t want to say that you should go and regulate that [organizations] have to disclose information a particular way: it can also backfire.” Better, he suggests, is an incentive for “a market or system so companies are incentivized to do ‘the right thing’.”

  • Privacy

Practical Steps to Implementing Privacy Engineering

A central remit of privacy-by-design is to dive deeper into the tools, methodologies, and techniques that ensure that your company can integrate privacy into its design processes and workflows. It is a rapidly growing field that is often called privacy engineering. The challenge becomes defining privacy engineering precisely and then making sure that that function is properly supported within the organization.

The emergence of privacy engineering as a discipline signals that privacy-by-design has moved rather quickly from the conceptual to the concrete environment of product engineering.

To discuss the challenges attendant to implementing privacy engineering and dig into some practical advice, WireWheel Co-Founder and Chief Scientist Amol Deshpande moderated a panel at the Fall Spokes Privacy Technology Conference held this past December 7-8.

Deshpande was joined by former Facebook (now Meta) Director of Engineering Yi Huang; Anna Togia, Technical Product Manager, Privacy, Security & Compliance at dating app Hinge; and Uber’s Head of Privacy Engineering and Assurance Nishant Bhajaria to discuss Practical Steps to Implementing Privacy Engineering.

Privacy engineering is a multifaceted challenge

We heard yesterday [during the Spokes Privacy Technology Conference] that there are new frameworks coming down the pipeline and the privacy landscape is moving at a very rapid and aggressive pace.

It’s really understanding, what are our terms, how are we defining privacy engineering, and who really has a stake, that’s going to help us solve this problem.

—Anna Togia, Hinge

“A core problem is noise in the privacy space,” says Yi Huang. And companies really need to be laser-focused on the most important problems.

“At any given time, there are multiple privacy regulations that companies have to deal with – each with different timelines and guidance. So, we ask ourselves: Instead of just simply being compliant with each one as they come, how can we respond to the regulations more effectively? What technology can we put into the products we are building…But practically, it’s very difficult because you have to juggle multiple balls.”

For a small company like Hinge, a central challenge is “how do we define privacy engineering in an organization that doesn’t have the resources to field a privacy engineering team per se,” says Togia. “What then does privacy engineering look like for us? Are we looking at technical know-how, process management, or a combination of both? And if that’s the case, then where should privacy engineering sit?”

Bhajaria is concerned that what has served so well in the past, may not be well-suited to the challenges of privacy engineering:

“There was a scene in the movie Titanic,” relates Bhajaria. “They were looking back and saying, ‘how did this captain not see the iceberg coming? It was out there in the open.’ The response: ‘Everything he knew was wrong.’ I hope that Titanic analogy is limited when it comes to privacy because that didn’t end particularly well. But there is a larger point here.

“The things that have made us very successful as a tech industry may not automatically transfer to the privacy and security side,” he opines. “Having siloed teams, bespoke processes, independent tech stacks, allowing complete autonomy to engineers. Those are wonderful things. They led to the creation of great services.”

But with privacy, security, and data protection in general, you need to understand end-to-end how the data flows…across multiple automated systems without any clear visibility. We are literally sailing into the night and running towards an iceberg at times, not knowing that the iceberg is there.

The biggest challenge for all of us is going to be, how do we use the best of our tech impulses and how do we repurpose our instincts, our data, our processes, and our tooling, to ensure that we don’t double down on the things that will not work for privacy and security purposes.

—Nishant Bhajaria, Uber

Building on the theme of “talent profile,” Togia says Hinge is focused on “who should you hire into these roles when you’re thinking about the privacy engineering space.” As “a small fish in a big privacy ecosystem pond, it’s really been a concern. Who are the right people to solve some of these major privacy concerns?”

She notes that while it is really important to be an expert at your core role, the “technical know-how,” the engineer needs an understanding of the impact that privacy is going to have. “Those are the people that you want to go for because those are the people that can also educate the rest of the team…And then you start to really cultivate privacy by design.”

Bhajaria points to the problems of scale and the complexity of growing data. “My advice from a framework perspective would be to try and identify what you have at a very early stage,” when you can “harness actual privacy controls.” This too is “very important when it comes to staffing,” since staff will ultimately create the synergies necessary for privacy-by-design.

It’s critical to start with internal teams. Even if you ended up buying third-party tools, even if you hire consultants on the outside, it’s important that your engineers are able to invite those external interventions into their system. And they cannot do that unless they have end-to-end visibility.

So, I would strongly recommend training internal engineers…it is critical that you have the synergy between third-party tools, outside vendors, regulators, and internal engineers because that’s where the culture shift truly happens.

—Nishant Bhajaria, Uber

Integrating the privacy review process into engineering

“Integrating the privacy review process into the engineering release process is necessary, but not sufficient,” says Huang. Rather “we have multiple integration points involving privacy in the development process:

The heavy lifting really happens in the beginning. The earlier you involve the privacy discussions, the better. The first major discussion about privacy is in the product design phase. Before engineering actually puts in any code. We discuss how the specific product is being built, what data the product is using, and what are the privacy implications.

—Yi Huang, Facebook

“At the implementation stage…we focus on whether we’re doing what we said we were going to do… and put checkpoints in place. After it’s finished – before the release process – we have the privacy engineering review process [followed by checks] in the release cycle.”

“When it comes to policy reviews,” Bhajaria notes, the legal team is often put “in an impossible situation” of letting something go or being seen as a blocker. And that creates unnecessary friction.

We too “intervene at the inception stage before the PRD [product requirements document] becomes an ERD [engineering requirements document].”

Much of it can be automated:

If you find out there are teams collecting a lot of data, but only need that data for a couple of days, you can write a deletion service that will query some of their databases on the third or fourth day…without that data making it to the warehouse…You can fix the problem right off the bat…investing in automation, but in a very targeted fashion.

—Nishant Bhajaria, Uber
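A minimal sketch of the kind of targeted deletion job Bhajaria describes might look like the following; the table name, timestamp column, and three-day window are assumptions taken from his example, and a local SQLite database stands in for a production store.

```python
import sqlite3
from datetime import datetime, timedelta

RETENTION_DAYS = 3  # per the example: data only needed for a couple of days

def purge_expired_rows(conn, table="session_events", ts_column="created_at"):
    """Delete rows older than the retention window, before any warehouse export."""
    cutoff = (datetime.utcnow() - timedelta(days=RETENTION_DAYS)).isoformat()
    with conn:  # commits on success
        cur = conn.execute(f"DELETE FROM {table} WHERE {ts_column} < ?", (cutoff,))
    return cur.rowcount

conn = sqlite3.connect("ops.db")
conn.execute("CREATE TABLE IF NOT EXISTS session_events (id INTEGER, created_at TEXT)")
print("Purged rows:", purge_expired_rows(conn))
```

Run on a schedule (a daily cron, for instance), a job like this is the “very targeted” automation Bhajaria recommends: it fixes one known over-collection problem without requiring a platform rewrite.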

Typically, “you start with the first principles and then struggle to produce implementation details,” continues Bhajaria. “Let’s flip the pyramid a little bit. Let’s get the implementation details and use those learnings to inform the privacy principles and their evangelism across the company to unblock the privacy review process rather than having [it at] the tail end of the process.”

Justifying all this effort

“Figuring out trends over time really justifies the value of privacy,” opines Togia.

Cataloging things like the frequency and severity of privacy incidents is a great place to start: how you quantify those privacy incidents and whether they’re plausible or identifiable. Look at the types of privacy failures your system currently has and how much you have fixed over time.

These are baseline metrics that you can then take to decision-makers and leaders in your organization.

—Anna Togia, Hinge

“If we are using data in the right way and know how we are collecting and capturing it, and how it flows throughout its life cycle,” says Togia, it is really a benefit to anyone using data. “I think that you can often justify privacy as more than just what it is at face value, and really leverage it.”

“I want to get away from us, arguing from a defensive, punitive position where if you don’t do this, bad things will happen,” asserts Bhajaria.

Think of it this way. “If you have a lot of data in your warehouse (more data than you need) you are probably protecting it, right? But you are protecting data that you probably shouldn’t have. It’s not adding value to the business, but it’s a security risk. It’s a business risk. It’s an unnecessary expenditure.”

Importantly, Bhajaria also notes that organizations often fail to comprehend the cost of bad data that works its way through the organization and is then converted into decisions.

My advice to people would be, don’t just position it as a privacy expense…Make sure that this is seen as a sustained strategic partnership between you and the data science team, the marketing team, the security team, and the platform team.

I cannot say this enough: don’t make it just about privacy.

—Nishant Bhajaria, Uber

Some concluding thoughts

Yi Huang, formerly of Facebook

I put privacy tools into four different buckets:

  1. Introspection: understanding how your systems work;
  2. Detection: the ability to identify when violations happen;
  3. Enforcement: the ability to block issues from arising in the first place; and
  4. Verification: the ability to prove that you are compliant (e.g., ROPAs and PIAs).
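As a toy illustration of the “detection” bucket, the sketch below flags warehouse columns whose names suggest personal data, so that a violation (say, PII landing in an analytics table) can be caught early. The name patterns and schema are invented, and a real detector would classify values, not just column names.

```python
import re

# Illustrative name patterns that often indicate personal data.
PII_PATTERNS = [r"email", r"phone", r"ssn", r"name", r"address", r"ip_?addr"]

def detect_likely_pii(schema):
    """Detection: flag columns that look like personal data (name heuristic only)."""
    hits = []
    for table, columns in schema.items():
        for col in columns:
            if any(re.search(p, col, flags=re.I) for p in PII_PATTERNS):
                hits.append((table, col))
    return hits

schema = {  # hypothetical warehouse schema
    "analytics_events": ["event_id", "user_email", "ts"],
    "orders": ["order_id", "ship_address", "total"],
}
for table, col in detect_likely_pii(schema):
    print(f"Possible PII outside approved stores: {table}.{col}")
```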

“I think one of the reasons why it’s really hard for us to respond to privacy regulations is because we really don’t understand how data really flows [in many cases] through systems.”

Nishant Bhajaria, Uber

“I’m sure there’s an engineer somewhere out there right now, assuring their manager or attorney everything has been deleted, or there is no known risk. And the attorney is going to wake up tomorrow morning and realize that that wasn’t true. Not because the engineer was dissembling, but because they didn’t know somebody else was doing something.”

Anna Togia, Hinge

“The important thing to think about, especially for anyone that’s maybe on a smaller team, is that that doesn’t make privacy any less vital.

“It is a matter of ubiquity. This needs to be a pervasive concern and we need to be asking ourselves how do we organizationally affect that.”

  • Marketing
  • Regulations

Digital Advertising and the Global Privacy Patchwork

“The global privacy patchwork has created significant challenges for participants that work with the digital advertising industry,” says Michael Hahn, SVP and General Counsel from IAB Tech Lab. “We work with multiple partners, the technology is complicated, and the slightest variations in the different privacy laws can result in cascading legal and data management efforts for organizations.”

“Furthermore, the proliferation of different privacy laws and enforcement approaches…create a challenge for the digital advertising industry and how companies can best manage their compliance.”

Joining Hahn to discuss these challenges at the Fall Spokes Privacy Technology Conference were Bird & Bird Partner Sophie Dawson; Jessica B. Lee, Partner and Chair, Privacy, Security and Data Innovations at Loeb & Loeb; Mark Webber, Fieldfisher U.S. Managing Partner; and Jessica L. Rich, Of Counsel at Kelley Drye, who previously served as a director of the Federal Trade Commission’s Bureau of Consumer Protection. Hahn moderated the session, Digital Advertising and the Global Privacy Patchwork.

The Patchwork Challenge

“Like everything, the devil is in the details,” says Lee. “And specifically, for these laws, in the definitions.” While the bones are somewhat similar with respect to certain baseline consumer rights across California, Colorado, Virginia, and the GDPR, “when you get into the weeds – particularly in this space where data is so important – they don’t exactly overlap.”

There are so many different data streams, and parsing out each category of data, how it’s impacted by the law, and each data use is where some of these exemptions come into play. The exemptions (like deletion exemptions) aren’t exactly the same. We really have to go in line-by-line and understand when we do have an obligation to delete, and that impacts how you operationalize a deletion obligation.

—Jessica B. Lee, Loeb & Loeb

Lee notes that while companies are starting “to wrap their minds around it,” there will be much work to do in 2022 to measure the impact and develop processes for handling it.

The ability for consumers to limit the use of sensitive personal information – to opt out of having their PII used for any purpose other than that for which it was collected – is another complication in managing data. It also constrains adtech and the ability to serve targeted advertising based on, say, demographic and location data.

This means, says Lee, “figuring out first, what is the scope of personal information that I’m collecting and how am I using it. And then, what kind of mechanisms, am I going to put in place so that all the rights are applied to the data appropriately.”

“Number one,” advises Lee, “map your data. Companies really need to have a sense of what they have, where it sits, what they’re doing with it, and understand the business impact associated with that data. I think that foundation will help for any future state laws.”

The hurdles to FTC rulemaking authority

The FTC can do industry wide studies and reports. It’s done that for broadband providers, data brokers, and social media companies, for example. And it can issue new rules under the FTC Act, even if they’re not mandated by Congress, under its ‘Mag Moss’ authority.

These tools give the FTC a lot of latitude, but there are some real hurdles too.

—Jessica L. Rich, Kelley Drye

She notes that the “FTC has several tools it can use to ramp up enforcement in this area,” including laws against unfair and deceptive practices. But the hurdles are significant.

“Contrary to what is often thrown around by people discussing deception and unfairness, those standards require the FTC to meet certain elements or proofs. Deception has to be material. With unfairness, you have to show a likelihood of significant injury and satisfy a cost-benefit analysis. Also, Magnuson-Moss rulemaking is extremely cumbersome…Those rules take years to complete and it’s really not going to be feasible to do any kind of broad rule under Mag Moss.”

In October, Commissioner Slaughter provided remarks where she questioned whether the notice-and-consent framework that most companies use works, and suggested that a data minimization approach is better. And that it won’t “break the Internet” because there’s the alternative of contextual advertising. Is that a widely held view in the FTC?

—Michael Hahn, IAB Tech Lab

“The concept of data minimization isn’t new and criticizing notice and choice is not new,” reminds Rich. “This stuff has been talked about for years. Data minimization is best practice, and it can be an element of a data security violation under the FTC Act.”

“While it is a particularly difficult concept for the Adtech industry…the Ad industry can work on parallel protections as to use and sharing. I’ve helped work on those with people in the Ad industry, preventing the use of data for secondary and potentially harmful purposes as data travels through the chain. That’s something that your industry should be working on very hard.”

Unless they’re nuts, I really don’t expect the FTC to try to do a broad privacy rule with hundreds of mandates under its own Mag Moss authority. But I do think they’re going to bring aggressive enforcement and maybe launch some narrower rulemaking around what they’re calling surveillance.

—Jessica L. Rich, Kelley Drye

When Adtech woke up to a new world

“Adtech had grown fast. And not necessarily with privacy in mind,” says Webber. “And certainly not necessarily with European privacy or the GDPR in mind. There were a lot of businesses at the heart of adtech that were U.S.- or non-European-centric, and the prospect of large fines (2% or 4% of global turnover) began to make some think about it. The GDPR really asked everybody to look inward at what data they had and why they had it.”

As Webber notes, three significant things the GDPR did were to redefine personal data, redefine the role of a processor, and introduce extraterritorial controls.

“First of all, many adtech providers considered themselves processors acting on behalf of somebody else,” notes Webber. Furthermore, “Many of those organizations considered themselves outside the scope of the GDPR’s jurisdiction, but the GDPR had an extraterritorial effect. An adtech business in San Francisco was now conceivably within the bounds of European law.”

The GDPR also clarified what was considered “personal data,” going beyond then-current European legislation to include indirect identifiers (i.e., cookies, device IDs, IP addresses, and the like).

“So, the adtech industry lost that ‘we’re not processing personal data anyway’ argument. Suddenly that collection, use, disclosure, dissemination, and general use of data required that adtech processing was fair, lawful, and transparent. A new concept to think about.” When you’re thinking about adtech you’re thinking about two things:

  1. The collection of data, that is, how do you get it in the first place, and
  2. The use of that data.

“What we do know is we need transparency, and clear, open, and honest practices around the way data is collected, used, and shared, and that’s really what we’ve been struggling with in Europe ever since,” says Webber.

Since the introduction of the GDPR and subsequent legislation and enforcement, the adtech industry has learned quite a lot, opines Webber. “The problem is,” he says, “a lot of that has been local guidance. Although there’s a lot of commonalities,” the French, the UK, and Germany (late to the game), for example, have very different views around opt-out/opt-in.

Webber also notes that case law provided much-needed insight. “Three cases come to mind:”

  1. Fan Pages. Basically, an organization running a fan page on Facebook is now seen as a joint controller, jointly responsible alongside Facebook for the collection of data.
  2. Fashion ID, coming from the CJEU and also concerning joint controllership. It involved website publishers using social plugins (e.g., share on LinkedIn) and made clear that if you run a website as a publisher and use third-party plug-ins, you are jointly responsible for the collection.
  3. Planet 49. In this situation, cookie acceptance was pre-checked. The CJEU said this wasn’t going to be enough. You needed clear, affirmative action from the user.

All focus on transparency. “It’s important to look at what other technologies are alongside you, and that there might be some joint responsibility.”

Australia on its own

“We have our own privacy laws which are not exactly like anybody else’s,” says Dawson. “Our piece of the patchwork quilt is very much our own.”

Privacy is undergoing two reviews in Australia, says Dawson. The first is an exposure draft bill that would give the Privacy Commissioner power to put in place a code with specific rules applicable to social media, large digital platforms, and data brokerage services. The second is a broader privacy review process and submissions.

The Australian story picks up on the global themes we’ve heard today, which are, firstly, that it comes very much from a competition, as well as a privacy focus. The current reforms stem from a digital platforms inquiry report done by our competition regulator, which was instigated in 2017. So it’s been quite a long conversation in Australia.

—Sophie Dawson, Bird & Bird

“Flowing from that process, we see proposed privacy and consumer protection reforms in the adtech space focused on transferability and promoting competition,” suggests Dawson. “There’s a focus on notice and consent, but there’s also a concern that that may no longer be the appropriate approach. Our regulators are talking about [a] move to more focus on what’s fair and reasonable from the regulator’s perspective, rather than it being all about consumer control.”

The exposure draft bill would do three things:

  1. Increase the penalties for serious and repeated breaches of privacy, including raising the maximum penalty to AU$10M, 10% of annual revenue, or the benefit received, whichever is greater;
  2. Extend the extraterritorial operation of the Australian act so it will no longer be confined by reference to Australian sources of data; and
  3. Most importantly from an adtech perspective…empower the Privacy Commissioner to make a code and require that the code do things like introduce the specificity and informed nature of consent. So, it could have a real impact on current practice.

The contemplated code would also enable individuals to request that their data no longer be used or disclosed, which is not an existing right. There are “a lot of changes which are very relevant to adtech: changes to the personal information and technical information definitions sometimes relied on in the adtech world,” warns Dawson.

There are a lot of potential changes around notice and consent, but there is also recognition of the GDPR feedback around consent and notice fatigue.

So the review is also looking at the possibility of requiring participants in the adtech environment with large data sets to assess privacy risks and manage them themselves – a holistic duty-of-care type approach – and the possibility of a rule saying that they must ensure that collections, uses, and disclosures are fair and reasonable.

—Sophie Dawson, Bird & Bird

“It is really important when you’re dealing with an Australian notice or consent,” continues Dawson, to actually step back from the privacy requirements and think: does this make sense to a consumer? Is there a competition risk in addition to the privacy risk?

Watch the entire SPOKES session here.

  • Privacy
  • Regulations

“Explainability” for AI and ML in Data Privacy

How to Implement “Explainability” in Emerging Global AI/ML Regulations


Explainability is defined in various ways but is at its core about the translation of technical concepts and decision outputs into intelligible, comprehensible formats suitable for evaluation.

—Fjeld, et al., 2020 [1]

Artificial Intelligence (AI) and Machine Learning (ML) are opaque: even for the scientists creating and working with these systems. As award-winning AI researcher and IBM Watson team founder David Ferrucci opined, AI may be building intelligence that “may have little to do with how to engage with, and reason about, the world.” And critically, “How useful is that model when we need to probe it and understand it?”[2]

The need to understand automated decision-making is an increasing concern in privacy policy circles. If scientists struggle with it, how is it made “intelligible” to the average consumer? Furthermore, “explainability is not exactly the same thing as transparency,” informs the Future of Privacy Forum’s Lee Matheson. “And it is a concept that comes up a lot, specifically in connection with AI automated decision making and machine learning (ML) technologies.”

To discuss the challenge of AI explainability in the context of data subject rights, Matheson moderated a panel of privacy and AI experts at the Fall Spokes Privacy Technology Conference. Joining him were his FPF colleague Dr. Sara R. Jordan, Sr. Researcher, Artificial Intelligence & Ethics; Christina Montgomery, Vice President & CPO, IBM; and Renato Leite Monteiro, Co-Founder of DataPrivacy Brazil, who was closely involved in the drafting of Brazil’s General Data Protection Law. You can view the conversation, “Explaining AI: How to Implement ‘Explainability’ in Emerging Global AI/ML Regulations,” here.

Explaining Explainability

I key in on two words that they use to support the definition: intelligibility and comprehensibility. Making something intelligible really depends on the target audience that you’re trying to make it intelligible for. And comprehensibility depends upon the audience’s ability to take that information…on board and to do something with it.

—Dr. Sara R. Jordan, Future of Privacy Forum

“My idea of explainability is much more than providing an explanation of the technical concepts,” says Monteiro. “It is allowing data subjects to use the information provided to them to possibly challenge automated decisions to guarantee their fundamental rights.” He sees four limitations to this:

  1. The technical;
  2. Legal limitations involving commercial secrets and IP;
  3. How to translate technical concepts for the data subject; and
  4. Institutional: who is going to enforce the explanation?

Montgomery views explainability “as information related to the particular decision and why the decision was made….a way to enhance transparency about what went into the system, how it was trained, the data that was used, and how it’s deployed related to the decision itself.”

“[But] being able to provide meaningful explanations to an individual user or to the data scientist that’s developing the AI application, those are very different things.”

And context matters, she says. Some applications are very high stakes and the level of granularity required would be very different from what you’d expect in a low-stakes context like retail.

Explainability and existing privacy regulation

“When you look at regulation like the GDPR it’s addressing personal information in a technologically neutral way,” notes Montgomery. “It’s also covered from the perspective of the main underlying principles of the GDPR like the concepts of lawfulness, fairness, and transparency.

“The GDPR, with respect to AI specifically, applies only in the context of an automated decision without human oversight. As such, there’s a very limited subset of AI systems to which the GDPR applies.”

The existence of a right to explanation in the Brazilian General Data Protection Law (LGPD) is the core of Monteiro’s Ph.D. research.

“I go to Articles 13, 14, and 15, and Recital 71 of the GDPR and so forth,” says Monteiro, “where it is not enshrined in the regulation. If we are to advocate that there is a right to explanation in the Brazilian legislation, we need to derive it from other sources.”

I think we can say that there is a right to explanation under Brazilian law. But there’s one particular concept that is different from the GDPR and U.S. law. Personal data can be anything that allows for singling out the person, but there is one particular article under the Brazilian law that says even anonymized or non-personal data, if it is used for profiling purposes [e.g., demographic data], will be deemed personal data.

—Renato Leite Monteiro, DataPrivacy Brazil

Simulatability, decomposability, and interpretability

Jordan notes that there is really no “academic consensus yet in terms of what explainability is” and suggests as required reading the paper “Explainable Artificial Intelligence (XAI),” which provides a landscape of explainable AI. [3]

“What I think is useful about [the paper] is that it does try to decompose the idea of explainable AI into at least three parts: simulatability, decomposability, [and interpretability]:

  1. Simulatability is the ability to think as if you’re the computer. Can you reimagine what it means to try to make this decision? If you were given infinite time, space, and the ability to compute, what would you come up with?
  2. Decomposability is the ability to break down what a system does [see the sketch after this list]. What happens to the data about you? How is it brought together? How is it turned into features? How is it brought into systems and manipulated to get to some sort of output?
  3. Interpretability of AI. Is it something that you can take on board and that you can do something with in your life?”
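
Jordan’s middle category lends itself to a concrete illustration. Below is a minimal sketch, in Python, of what decomposability can look like in practice: an end-to-end scoring pipeline broken into named stages whose intermediate results can be inspected. The stage names, data fields, and weights are hypothetical, invented for illustration rather than drawn from any system discussed by the panel.

    # Hypothetical decomposable pipeline: each stage is separately
    # inspectable, rather than one opaque end-to-end function.

    def clean(record: dict) -> dict:
        """Stage 1: what happens to the data about you."""
        return {k: v for k, v in record.items() if v is not None}

    def featurize(record: dict) -> list:
        """Stage 2: how the data is turned into features."""
        return [float(record.get("age", 0)), float(record.get("purchases", 0))]

    def score(features: list) -> float:
        """Stage 3: how features are manipulated into an output."""
        return 0.02 * features[0] + 0.1 * features[1]

    result = {"age": 34, "purchases": 7, "nickname": None}
    for stage in (clean, featurize, score):
        result = stage(result)  # every intermediate value is observable
        print(stage.__name__, "->", result)

Because each stage is separately callable, a reviewer can ask what happened to the data at every step instead of confronting a single black box.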

“If there’s one area in which I think there is emerging academic consensus,” says Jordan, “it’s that interpretability and explainability are audience-specific…”

Explainability in practice

At IBM, “we’re focused on building these capabilities, through something we call AI governance, into our general-purpose AI systems,” says Montgomery. One example offered is the ability of IBM products to automatically produce “fact sheets” across the development and deployment process, creating a log of the data being used, the tests conducted, and the outputs.

[Fact sheets are] reflective of the model development lifecycle and also capture things like bias and privacy measurements. The goal is ultimately to produce something that will be able to provide transparency…and will be tailored and context-specific.

Say it’s in the context of a loan application that helps support decisions… the loan officer would get a different explanation – a different fact sheet – than the data scientists who created it.

—Christina Montgomery, IBM
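
To make the fact-sheet idea more concrete, here is a minimal sketch of what such a lifecycle record might look like. The class, field, and model names are hypothetical illustrations of the concept Montgomery describes, not IBM’s actual AI FactSheets tooling, and the audience-specific rendering mirrors her point that a loan officer and a data scientist need different views of the same record.

    # Hypothetical "fact sheet": a lifecycle log of training data, tests,
    # and measurements, rendered differently for different audiences.
    from dataclasses import dataclass, field

    @dataclass
    class FactSheet:
        model_name: str
        training_data: list = field(default_factory=list)
        tests_run: list = field(default_factory=list)
        bias_metrics: dict = field(default_factory=dict)

        def view_for(self, audience: str) -> str:
            """Render an audience-specific summary of the same record."""
            if audience == "data_scientist":
                return (f"{self.model_name}: data={self.training_data}, "
                        f"tests={self.tests_run}, bias={self.bias_metrics}")
            return (f"{self.model_name} was trained on "
                    f"{len(self.training_data)} dataset(s) and passed "
                    f"{len(self.tests_run)} test(s).")

    sheet = FactSheet("loan-approval-v2",
                      training_data=["applications_2021"],
                      tests_run=["holdout_accuracy", "demographic_parity"],
                      bias_metrics={"demographic_parity_gap": 0.03})
    print(sheet.view_for("loan_officer"))
    print(sheet.view_for("data_scientist"))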

But it’s important to remember that “the AI ecosystem involves many players throughout the lifecycle and it’s very complex,” cautions Montgomery, “so you can’t look at this as a simple supplier environment as you would in a typical product.”

As a company and a leader in AI, IBM “thinks about guardrails all the time and whether they can be built in technically” as part of its AI governance model, she continues. “There are technical abilities to do that, but we don’t think about just the technical. It is thought about holistically.”

AI explainability for regulators

To talk to a regulator, you have to remember what their purpose is. We’re not talking to a general-purpose regulator. We are talking to regulators within particular verticals that have a particular remit [and require the AI to provide] a sufficient explanation for them to be able to fulfill their remit.

This addresses the challenge of explainability within use contexts, but also for the individual, if you look at the difference between what regulators need and what individual people need.

—Sara R. Jordan, Future of Privacy Forum

And, as Jordan rightfully notes, there is sometimes a massive gulf between what regulators and the general public require in terms of explainability.

“This is not to say that there is an absolute divide, but rather, keeping in mind that they have different purposes. Individuals need to protect themselves, their families, their communities. And regulators need to protect all of those and within a particular vertical.

“So, it’s going to be very narrowly tailored. Trying to do something extraordinarily general on explainable AI – it’s theoretically necessary – but it may not be practically useful just yet.”

The black box problem

Echoing Ferrucci’s concerns, Monteiro notes that “some technologies will, by design, not be explainable. And that’s one of the things that we need to discuss if we want explainability-by-design embedded in them.

“This is useful for understanding the difference between transparency (ex-ante) and explainability (ex-post) obligations.

“When we are talking about ex-ante obligations,” explains Monteiro, “we are talking about transparency: saying what data is used, what inputs create the output, and so forth. When we are talking about ex-post obligations, we are talking about explanations and accountability measures.

Explainability – even for non-explainable systems – can work not only for the data subject to challenge a particular decision, but also to provide the regulatory enforcement authority enough comprehensibility of the system to check whether it follows its compliance obligations.

One of the ways to do this is to state the purpose and the impact that a particular automated system can have on people. Also, if you cannot properly explain its technical concepts, can you demonstrate the impact if you change some of the variables?

—Renato Leite Monteiro, DataPrivacy Brazil
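
Monteiro’s “change some of the variables” test can be sketched very simply: hold an input fixed, change one variable, and report how the output moves. The scoring function below is a toy stand-in for an opaque model (in practice you would query the deployed system), and all field names and weights are invented for illustration.

    # Toy sensitivity probe: demonstrate a system's impact by changing
    # one input variable and observing how the output moves.

    def loan_score(applicant: dict) -> float:
        """Stand-in for an opaque scoring model."""
        return (0.4 * applicant["income"] / 100_000
                + 0.4 * applicant["credit_years"] / 20
                - 0.2 * applicant["open_debts"] / 5)

    def probe(applicant: dict, variable: str, new_value) -> float:
        """Score change when a single input variable is altered."""
        changed = {**applicant, variable: new_value}
        return loan_score(changed) - loan_score(applicant)

    applicant = {"income": 55_000, "credit_years": 4, "open_debts": 3}
    for variable, new_value in [("income", 70_000), ("open_debts", 1)]:
        delta = probe(applicant, variable, new_value)
        print(f"{variable} -> {new_value}: score moves {delta:+.3f}")

Even when the model’s internals stay hidden, this kind of probe gives a data subject or a regulator a handle on which inputs actually drive the decision.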

Matheson raises an important point: “It’s one thing to provide a counterfactual explanation for a data subject to illustrate how a given system would produce outputs given certain inputs, but how does that work for someone who wants to exercise, for example, a right to challenge a particular decision? How do you insert a human into a process?”

Montgomery cautions that “there should always be a human in the loop. And I think that’s how you, in part, get around this idea that maybe you won’t be able to fully explain the output of every decision in a sufficiently satisfactory way.” Particularly, she adds, if the output is a recommendation.

Perhaps, but as Jordan highlights, “one of the key things to remember here is most of the systems are just not explainable as yet.”

Watch the entire SPOKES session here.

  • Privacy

Privacy Operations in Practice

Of particular interest to me right now – in a program with an early level of maturity – is the challenge of socializing privacy substantively among business counterparts who are engaging with the data more directly, as well as the socialization of privacy at the leadership level, where you need buy-in for expanding the budget or collaborating with other business units.

– Nada Bseikri,
Apple Bank Vice President and Senior Counsel (Privacy)

Irrespective of company size or a privacy program maturity level, operational challenges present themselves on a near-daily basis. This is hardly surprising given the organizational and process complexities of meeting even baseline compliance objectives in a constantly changing regulatory environment.

Not all the challenges are privacy-specific. Business fundamentals apply: project management; developing core principles and supporting frameworks; and, of course, the need for buy-in from the C-suite and across the organization. That said, as WireWheel privacy operations expert Virginia Bartlett notes, “organizational cultures of companies are very different in the way they achieve projects and run programs.” Solving privacy operations challenges will not be a one-size-fits-all affair.

Importantly, says Rebecca Shore, VP and Chief Privacy Officer at Albertsons®: “How do we find a way to have a strategic position operationally, so that you’re less reactive to every little change that pops up and you’re really thinking more proactively about the principles of privacy and how you operationalize them more uniformly?”

Shopify’s Associate Legal Counsel, Regulatory Affairs and Enforcement, Rachelle Bastarache, joined Bseikri and Shore at the Fall Spokes Privacy Technology Conference to discuss these challenges and how their respective organizations address them. The session, Privacy Operations in Practice, was moderated by WireWheel’s Bartlett.

Partnership is more important than reporting structure

It’s interesting to see the strategies that organizations take. There are rationales for all of these models and why they sit under certain umbrellas.

Part of the process and building out this program has been creating a bit of an identity; a bit of a presence.

– Nada Bseikri, Apple Bank

Reporting structures for privacy teams are varied. At Shopify, the regulatory affairs team, government affairs team and the litigation team all fall under the privacy umbrella. At Apple Bank, the privacy function reports up to the GC. At Albertsons, privacy sits within digital technology and innovation which is part of legal.

All note that throughout their privacy careers their reporting structures have varied: within risk management, legal, cybersecurity, and also as an independent function.

Bseikri notes that, reporting structure notwithstanding, “We now have a standalone privacy policy, whereas previously, commitments and obligations were spread across different policies and areas.” And crucially, “governance, policymaking, and committee involvement across the bank have privacy subject matter expertise representation.”

What I have found really successful, is when I was in risk management, I had my legal partner that was hand-in-hand, side-by-side with everything that I did. When I was within legal, my cybersecurity team was hand-in-hand with everything I did.

It was really about the partnership. How I reported in didn’t necessarily impact the value that I provided to the organization. It was the partnerships that I was able to create throughout my tenure.

– Rebecca Shore, Albertsons

“I absolutely agree,” says Bseikri. “Partnership is more important than the organizational structure. And privacy advocates in your organization can come from unexpected places.”

What’s the glue? How do people stay engaged?

The joke in tech is that compliance is this dirty word. You don’t want to be the one to bring up compliance because it’s seen as the nemesis of innovation. So how can we ensure that [privacy] principles are being taken into account in a way that feels inviting rather than closing off innovation?

– Rachelle Bastarache, Shopify

“What keeps people together?” asks Bartlett. “What’s the glue? How do people stay engaged? How do you find the right people to be on teams?”

“There are two factors,” opines Shore. “One is speaking to people as individuals and not necessarily within their role in the company. For example, ‘You have children. Have you put a security freeze on their social security number?’ Getting them excited about what it is to be in the privacy space.”

She goes on to note that it also helps identify champions. “There are people out there who love privacy and just haven’t been made aware of the fact that what they love is privacy….That you’re not just putting out technology but also advocating for their experience and how they feel about something.”

One of the things that is interesting about the privacy nerd rhetoric is even when I was in law school [in 2017], we weren’t really talking about privacy the way that we [are now]. And I do think that the narrative and telling the story get people excited about privacy. You’re on this new rocket ship that didn’t exist before and it’s this invitation to become a bit of a pioneer in this area.

At Shopify, that narrative happens every day. And it works.

– Rachelle Bastarache, Shopify

“Beginning from a place where each individual has their own privacy considerations – that humanization or that individualization – makes it much more approachable when you’re thinking about a customer base,” suggests Bseikri. Tying in the personal keeps people engaged. It’s not just a compliance activity.

She goes on to note that the personal connection is a great way to find those “unknown privacy advocates” in the business who will want to align with the privacy team on particular initiatives.

Operationalizing the privacy glue

I had just integrated an international operation through an acquisition into a large company that was very U.S.-centric until that point. I did a series of monthly video clips in multiple languages where folks from other countries talked about what privacy meant in their country and why the culture around it was different. That personalization really helped integrate the practice. Before that, it was “we’ll never make this work!”

– Virginia Bartlett, WireWheel

Shore offers that “meeting people where they are (such as Slack channels)” is important. “One of the biggest moments for me was when there was a major breach, and I wasn’t the first one to write about it [on the channel]. People were asking, ‘What do I do?’ I was so excited by the fact that the conversation wasn’t just driven by the attorneys in the room.”

Bseikri says to engage with folks throughout the bank, she launched a monthly newsletter. She is quick to add she knows it “sounds like something that shows up in your inbox and you click delete. But folks were really engaging with the content. We had recurring sections, such as the privacy concept of the month, and it resulted in a lot of communication. It helped to create a privacy framework as more than just a compliance exercise.”

“We have a Security and Privacy Awareness Week,” offers Bastarache, “where we have… it’s funny to say…but prizes for different things. People love prizes, it doesn’t matter what the prize is, if there is a chance to win something, we engage.

For example, submit a DSAR request to see what information we have about you and delete it, access it, whatever it is you want to do, but interact with the tools we’ve built. Interacting with nice, sleek tools always gets developers interested. It has been a good way for us to have people engage with privacy in a way they otherwise wouldn’t have.

– Rachelle Bastarache, Shopify

When key people, and their knowledge, leave

Privacy is a booming industry, and privacy professionals are in high demand. This results in high turnover. While the phenomenon is not unique to privacy, it is particularly vexing and disruptive to a nascent program when knowledge walks out the door.

The solution? “It’s really making sure that you’re building out a program, being strategic, and documenting that program so there’s a clear understanding of what was done,” says Shore. “It’s maintaining the mindset that I may not be here in a year.”

“I’ve been on the other side of it, picking up the pieces when someone does leave. The number one thing the last few years have taught me is that continuation from the company perspective is crucial.”

Last week my boss, the CEO, left for a new role. What made it manageable was that the strategy was clear. We had been really involved hand-in-hand. So when I inherited the balance of the program portfolio, it wasn’t foreign to me.

Avoiding silos so that people remain informed will set you up for more success when somebody does leave.

– Nada Bseikri, Apple Bank


Key Takeaways

“Be strategic,” says Shore. “That’s going to provide the biggest benefit to how you move forward, identify relationships, and focus on trust. Build out your strategy, your playbook, and define who you want to be as a program and what message you’re sending out.”

For Bseikri, it’s the value of socializing privacy. “It’s very easy to underestimate the value of socializing privacy to shift culture,” she says, both at the business unit level and at the leadership level.

“Don’t take for granted that everyone has the same investment in privacy outcomes.”

“Build your narrative,” says Bastarache. “Having people engage with the narrative of trust rather than compliance has worked wonders for us at Shopify.”

Watch the entire SPOKES session here.

  • Marketing
  • Privacy

Privacy in the Metaverse

It’s inevitable that the metaverse will be the number one social network in the world.

— Michael Gord, Metaverse Group Co-Founder

“It wasn’t that long ago that the metaverse concept didn’t exist outside of science fiction books,” says WireWheel CPO Rick Buck. “The term is often associated with Neal Stephenson’s Snow Crash or Ernest Cline’s Ready Player One. But the metaverse is far from science fiction. It’s here today, and it has captured significant interest in the investment community.

“It’s a new channel, a new technology, a new industry, with enormous potential. But if we think of the metaverse as an extension of the Internet, then we also need to think about the problems and issues that we’re dealing with on the Internet: challenges that can be exacerbated in this new metaverse world.”

An expert panel gathered to discuss the metaverse from both a privacy and investment perspective during the Fall Spokes Privacy Technology Conference. Jennifer Vancini, General Partner, Mighty Capital; Christy Steele, Principal, Sands Capital; and Jeremy Greenberg, Privacy Counsel, Future of Privacy Forum joined WireWheel CPO, Rick Buck for Privacy in the Metaverse: What You Need to Know Today.

 

Is the metaverse big enough for new entrants?

It is really a matter of how many users you have on your platform. How much engagement do you have? While there are high costs to build this type of platform and retain users, the reverse is also true: there are very low switching costs for users, who will sign up very quickly for a new platform to give it a try.

— Christy Steele, Sands Capital

“It comes down to the company’s model,” says Mighty Capital’s Vancini. The fundamentals apply: “What’s unique about the team in terms of achieving what they want to achieve? Are they best of breed at something they’re doing that will contribute to this metaverse?”

“We look closely at dependencies on APIs. What is the likelihood that they’re very dependent on one particular giant, and the risk that…they decide to cut access or frequently change what that API looks like? What is that company’s ecosystem strategy? I want to see you’re covering 50 to 100 different things so that you have different areas to grow in, and different areas to exit and get acquired.”

“The user experience and the quality or the diversity of experiences the user can have is really important,” opines Steele. “We’ve seen platforms build incredible user bases very, very quickly, but that doesn’t mean it will last. We look for tools that work across a multitude of platforms – and any new platforms that emerge – empowering users and businesses.”

Conversely, “there may be advantages to the small nimble newcomer. Particularly those with a highly specific focus that draw users,” offers Steele. An associated risk, however, warns Vancini, is that “if you’re too specialized, the company looks more like a feature than a full product” and it is “way too easy for the big players to replicate it.”

“Privacy is one area where they can maybe have a leg up because it’s a very specific context for why they’re communicating with you. Continuing to offer additional value in an ongoing relationship.”

Privacy in the metaverse

“A lot of the concerns involving privacy relate to the scale of data being collected, a lot of it being sensitive,” says Greenberg, noting that “studies have shown that 20 minutes of VR can generate up to 2 million unique data elements related to an individual. And that’s just virtual reality. If you have a metaverse that’s integrating a number of technologies – VR, augmented reality (AR), mixed reality (MR) –  all of these different sensors are potentially collecting vast amounts of sensitive data relating to the human body and unique to the individual.”

The privacy risks are then magnified when decisions are made, based on the collection of this data. In the ‘2D’ web, information is collected to build a user profile that decides which content to serve or not serve a user. Which experiences they will have or not. This can cause things like filter bubbles and create all sorts of divisions politically or culturally.

But in a 3D environment, those decisions are not just online or on your mobile device. They’re potentially happening to the world around you that you’re walking through and experiencing.

— Jeremy Greenberg, Future of Privacy Forum

Importantly, as Greenberg warns, “a lot of this technology is going to be used in public, so there are all sorts of questions about transparency and control over data collection of bystanders who might come into contact with someone using this technology.” The implications of this are both complex and profound.

Does it feel sort of icky?

As Vancini rightly points out, “knowing past history, there’s a little bit of suspicion around the metaverse along with the excitement and curiosity. We know it’s about advertising dollars. However, it’s too early to know what the value of this data in this advertising will be.”

In terms of diligence – with every company that’s dealing with consumer data – the first test is simply, ‘How does this feel as a potential user or consumer of the product or platform?’

Does it feel sort of icky or inappropriate? You don’t want to be involved in that type of product or platform.

— Christy Steele, Sands Capital

“It’s also important with the founding team,” continues Steele, “to get a sense that they are aware of the privacy challenges and understand that this will be an evolving challenge. They need to be as transparent as possible with their users about what they’re collecting and how they’re using it.”

“I think the transparency and control element of this piece is really important,” says Greenberg. “Of course, the onus falls on the user. Granular controls can be great but can also be overwhelming. It is really important to think about some current privacy-enhancing technologies or techniques and mapping them to this 3D space.”

“Just having a privacy-first mindset is very helpful,” offers Vancini. “There’s certainly the opportunity for business models around privacy as part of the product value that you’re giving the consumer or the organization.”

What is government’s role?

The beginning of wisdom is the definition of terms.

— Socrates

As Greenberg notes, legislation right now is “an uphill battle.” But in terms of the metaverse – involving all these different technologies – “we don’t know what it’s going to exactly look like. We don’t know what tech it’s really going to entail.

“When we talk about things like virtual reality there’s more or less consensus over what that is, but we don’t have standardized definitions.

Just really thinking about the definitions we have today – biometric data is often defined based on the ability to identify a user – the privacy risks in the metaverse…also concern the inferences you can draw from the collection of this biometric data, [such as] my psychology based on eye-tracking data, for example.

Rethinking some of the definitions we have out there is really just a first step.

— Jeremy Greenberg, Future of Privacy Forum

Vancini calls out that “these are clearly difficult areas because what’s the balance between the government, the oligopolies and monopolies, and then the end-users? It’s impossible to say it’s really a free market in terms of the content being served up to us. Our metaverse has come down to a very tiny microverse, and there are risks inherent in that.”

Things to come…

As Steele notes, “we all have a vision in our minds, whether it’s from Snow Crash or Ready Player One of this virtual world that we’re living in all the time. I think, in reality, the way that we look at it as it stands, is that there’s this continuous overlapping of our physical world and digital world that gets a little bit more augmented where we’re continually uploading and downloading more information from the virtual world.”

It is human nature that we overestimate what we can do in the short term and underestimate how much can be done in the long term. Three years from now, maybe we’ll see a couple of breakout cases and disruptions that are super interesting. Ten years from now, we’ll be talking about a completely different set of companies.

So let’s just say some of the mega giants we know today could be the Yahoos of tomorrow.

— Jennifer Vancini

Greenberg predicts that in the short term the “metaverse is going to be really idiosyncratic.” And 10 years from now, “you’re going to have broad adoption…Folks who haven’t heard of any of this today are going to be participants…We are going to move between physical and digital worlds seamlessly. To the point where we might not even be cognizant that we’re doing so.”

WireWheel’s Buck closes plaintively: “Let’s hope that evolves into a utopian versus dystopian world.”

Watch the entire SPOKES session here.