
Measuring What Matters: How to measure privacy’s impact


In the last decade, the ability to collect, analyze, process, and monetize personal information has increased at an unprecedented rate. The growth of the marketing technology (“Martech”) sector, from 150 available solutions in 2011 to 8,000 at last count, is a testament to this. Most recently, Apple’s introduction of the “privacy nutrition label,” and the resulting press, put the state of the art in data collection in front of consumers, who have resoundingly been voting “no thanks” when asked to opt in.

Global privacy legislation and industry best practices are attempting to keep pace with continually evolving technology and consumer attitudes. Privacy rights that manifest as, for example, notice and choice, transparency, and third-party accountability are fundamental precepts at the heart of these laws and best practices.

The questions consumer-facing brands and publishers need to ask themselves, says Sentinel President Aaron Weller, are: Can we accommodate these obligations? Do we have insight into the progress we’re making on our privacy program’s maturation journey? Do we understand what’s important and how to measure it? Do we know which metrics best measure progress and effectiveness?

And, importantly, can we communicate those findings? As Weller notes, even if you have the best metrics, the ability to communicate the information (and insights) to the board, peer leadership, teammates, and regulators is essential to achieving the desired outcomes.

Aaron joined eBay’s Ana Zeiter, General Counsel and CPO, and WireWheel CPO Rick Buck to discuss how to use metrics to develop organizational goals and to build a cohesive story about your privacy program, one that enables you to measure, communicate, and understand the program’s current state and what needs to happen to get it where you want it to be. Their presentation was hosted by the IAPP and can be accessed here.

“What gets measured gets done.” —W. Edwards Deming

One of the real challenges, when we’re looking at metrics, is that if it’s easy to measure, it really doesn’t tell you very much, but if it’s really useful, it’s actually often fairly challenging to measure. We need to think about what we can measure today, but also those things we want to measure. Where do we want to be in the future and how can we use metrics to help us get there?

—Aaron Weller

Metrics are only as useful as the insights they provide. And those insights are only useful when they are actionable in such a way that they move the privacy team closer to its goals. Information for information’s sake has no value. As a general framework, the data you gather to guide your privacy program should follow the “SMART” paradigm: it should focus on specific, measurable, achievable, relevant, and timely information. And those findings should be actionable.

Weller cautions against what he calls “check engine lights”: an indicator that tells you something happened, but not what needs to be done about it or how the issue could have been prevented, except to send you to find someone who can. It is not a very SMART indicator.

As Lynn Bird, who leads Microsoft’s privacy program, said at the recent IAPP Global Privacy Summit, “We try really hard to use data to determine where there’s actual risk…If you don’t have discrete data points, you’re actually robbing your program of an opportunity to mine that data and determine where you’ve got risk….That’s where you have an opportunity.”

eBay’s Ana Zeiter concurs and points to the criticality of being able to interpret that data, particularly trend data, in a way that guides potential actions to mitigate any emerging risks.

She relates that when the CCPA came into effect, eBay, not unexpectedly, saw an increase in requests to the DPO inbox as people, prompted by media coverage, became very interested in and began asking about privacy. Zeiter used this information to ensure eBay had the resources to answer user requests in a timely fashion. But by 2020, reporting indicated that the team was getting dangerously close to capacity:

We saw, at the beginning of 2020, that the team almost reached capacity, which indicated that in the future we needed to be prepared for the CPRA coming into force when the peak would be even higher. We needed to either implement automated processes or hire additional people to make sure…we could flex our team capacity accordingly.

—Ana Zeiter

“There is nothing quite so useless as doing with great efficiency something that should not be done at all.” —Peter F. Drucker

The value of metrics to managing any program is significant. To realize that value, however, choosing the right metrics is crucial, as is discerning the processes and priorities that should command your attention and resources.

The things that you choose to measure really drive what gets done within a program. When you pick the things that you’re measuring, you are in many cases driving those things being done, which might not be what you actually intended…If we’re measuring the number of policies that one writes every quarter, you’re incentivizing writing policies, not necessarily the right policies…

—Aaron Weller

Today, companies have ready access to a significant amount of data that can support robust analytics. And there are a plethora of tools to support those analyses and create sophisticated and effective reporting and management dashboards.

Again, there is an overwhelming amount of information. But not all data is information and not all information is useful. Indeed, as Weller implies, it can even be harmful if the metrics you choose misdirect focus. Here the choice of an effective privacy operations management platform with native reporting capabilities will serve well to keep the focus on the critical privacy metrics and key performance indicators (KPIs).

In this sense, Aaron says, “measuring a program really is the management of the program.” And as such, “we can use measurement both to get the things done that we want to get done, but also to identify areas where maybe we don’t need to be doing something at all.” In other words, if a CPO is looking at metrics to increase the effectiveness of privacy operations and improve outcomes, the first question to ask is whether the team should be expending resources on a particular activity at all and, if so, whether it is a priority.

Of course, regulatory requirements often dictate those priorities.

As those who are subject to the CCPA will know, effective July 1, metrics concerning data subject rights must be recorded and disclosed, either within the business’s privacy policy or posted on its website. This includes metrics around not just the number of requests the business received, but also, for example, how many were complied with, how many were denied, and the average time it took to respond.[1]

While these are fairly straightforward activity-based metrics to collect and report, they can and should be used for internal analysis as well, particularly for getting at the root causes of any gaps in desired performance and, importantly, for rationalizing that performance against differences in the complexity of the tasks being analyzed.
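To make the mechanics concrete, here is a minimal Python sketch of how a team might compute these disclosure metrics from its own records. The Request layout and field names are hypothetical assumptions for illustration, not a schema prescribed by the CCPA or by any particular platform; the function simply derives the counts and the median response time enumerated in the footnote.

```python
from dataclasses import dataclass
from datetime import date
from statistics import median
from typing import Optional

@dataclass
class Request:
    """One data subject request (hypothetical record layout)."""
    kind: str                  # "know", "delete", or "opt-out"
    received: date             # date the request arrived
    responded: Optional[date]  # None if still open
    outcome: str               # "complied", "partial", or "denied"

def dsar_metrics(requests: list[Request], kind: str) -> dict:
    """Count received / complied / denied requests and the median days to respond."""
    subset = [r for r in requests if r.kind == kind]
    closed = [r for r in subset if r.responded is not None]
    return {
        "received": len(subset),
        "complied_in_whole_or_part": sum(r.outcome in ("complied", "partial") for r in closed),
        "denied": sum(r.outcome == "denied" for r in closed),
        "median_days_to_respond": (
            median((r.responded - r.received).days for r in closed) if closed else None
        ),
    }

# Illustrative data only: two deletion requests in the reporting year.
requests = [
    Request("delete", date(2021, 3, 1), date(2021, 3, 20), "complied"),
    Request("delete", date(2021, 4, 2), date(2021, 5, 1), "denied"),
]
print(dsar_metrics(requests, "delete"))
# {'received': 2, 'complied_in_whole_or_part': 1, 'denied': 1, 'median_days_to_respond': 24.0}
```

The same records can then feed the internal analysis described above, for example by segmenting response times by request type to find where the gaps actually are.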

Take, for example, tracking raw data like the number of privacy impact assessments (PIAs) completed per month: 50 PIAs completed in May, 60 in June, 40 in July. A top-level activity metric like this does not account for complexity. Some PIAs will be rather simple; others may be very complex. More critically, there may actually be thousands of changes going into production every month while the PIA process is operating haphazardly and well below the needed efficiency.
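One hedged way to add that missing context, sketched below, is to report the activity number alongside a coverage ratio: the share of production changes that actually went through a PIA. Both inputs and the change volumes used here are illustrative assumptions, not figures from the discussion.

```python
def pia_coverage(pias_completed: dict[str, int], production_changes: dict[str, int]) -> dict[str, float]:
    """Share of each month's production changes that went through a PIA."""
    return {
        month: round(pias_completed.get(month, 0) / changes, 3)
        for month, changes in production_changes.items()
        if changes  # skip months with no recorded changes to avoid division by zero
    }

# The raw activity metric from the example above looks steady enough...
pias = {"May": 50, "Jun": 60, "Jul": 40}
# ...but measured against an assumed volume of production changes, coverage is low and falling.
changes = {"May": 900, "Jun": 1100, "Jul": 1300}
print(pia_coverage(pias, changes))
# {'May': 0.056, 'Jun': 0.055, 'Jul': 0.031}
```

Seen this way, July’s “only 40 PIAs” is less alarming than the fact that fewer than six percent of changes are being assessed at all.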

“The single biggest problem in communication is the illusion that it has taken place.” —George Bernard Shaw

When we started our KPI program with GDPR at eBay, we started with very easy activity metrics. We started simply, counting data deletion requests, subject access requests, data protection addendums we negotiated. And over time when the KPI program got more mature, we went from activity metrics into trend metrics and then also [analyzing] outcomes.

—Ana Zeiter

“The pros are it’s easy to capture,” continues Ana. “It’s easy to understand and communicate, but the cons are it’s rarely insightful in and of itself. You cannot tell a story. You can only say we received 2,000 data deletion requests last month. Full stop. Nobody knows the context.” But it is an important start, and as your program evolves you will continue to rely on these basic measurements.
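As a rough illustration of the step from activity metrics to trend metrics that Zeiter describes, the sketch below uses made-up monthly counts and annotates each raw number with its month-over-month change, so the same figures start to show a direction rather than a single, context-free data point.

```python
def with_trend(monthly_counts: list[tuple[str, int]]) -> list[tuple[str, int, float]]:
    """Annotate each month's activity count with its percent change versus the prior month."""
    annotated = []
    for i, (month, count) in enumerate(monthly_counts):
        if i == 0:
            annotated.append((month, count, 0.0))  # no prior month to compare against
        else:
            prev = monthly_counts[i - 1][1]
            annotated.append((month, count, round(100 * (count - prev) / prev, 1)))
    return annotated

# Illustrative numbers only: "2,000 deletion requests" alone says little,
# but the trend shows demand climbing, e.g. ahead of a new law taking effect.
counts = [("Jan", 1500), ("Feb", 1700), ("Mar", 2000)]
for month, count, pct in with_trend(counts):
    print(f"{month}: {count} requests ({pct:+.1f}% vs. prior month)")
# Jan: 1500 requests (+0.0% vs. prior month)
# Feb: 1700 requests (+13.3% vs. prior month)
# Mar: 2000 requests (+17.6% vs. prior month)
```

Paired with a capacity threshold, this is the kind of signal Zeiter describes using to see the team approaching its limit ahead of the CPRA.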

As noted, straightforward quantifications like activity data are required by the regulators: they want to know the numbers. Activity data are also essential to rationalizing resource allocations, and quantifications of error rates are a vital KPI for stratifying risk and focusing attention.

Two things to keep in mind:

Firstly, privacy is a process, not an event. Whether you are a large, sophisticated company like Microsoft (“like most privacy programs we’re constantly evolving,” says Lynn Bird) or a small company without as many resources, you should allow that your privacy program will evolve over time as you master the art of privacy operations specific to your organization.

Secondly, and most importantly, it is critical to keenly focus not only on what metrics to collect, but why. That reason, as Zeiter notes, is to tell a story: to communicate the insights gained through thoughtful and targeted data analysis intended to improve processes and outcomes. This requires crafting a compelling story appropriate to the intended audience: the board, your leadership peers, members of your team, and customers. It will take time to master those stories, each one unique to its audience.

Metrics are the grammar of these stories.


[1] (g) A business that knows or reasonably should know that it, alone or in combination, buys, receives for the business’s commercial purposes, sells, or shares for commercial purposes the personal information of 10,000,000 or more consumers in a calendar year shall:

(1) Compile the following metrics for the previous calendar year:

a. The number of requests to know that the business received, complied with in whole or in part, and denied;

b. The number of requests to delete that the business received, complied with in whole or in part, and denied;

c. The number of requests to opt-out that the business received, complied with in whole or in part, and denied; and

d. The median or mean number of days within which the business substantively responded to requests to know, requests to delete, and requests to opt-out.

(2) Disclose, by July 1 of every calendar year, the information compiled in subsection (g)(1) within their privacy policy or posted on their website and accessible from a link included in their privacy policy.

a. In its disclosure pursuant to subsection (g)(2), a business may choose to disclose the number of requests that it denied in whole or in part because the request was not verifiable, was not made by a consumer, called for information exempt from disclosure, or was denied on other grounds.

Future-proof your privacy program with WireWheel’s Trust Access and Consent Center and WireWheel’s Privacy Operations Manager.
