• Privacy

Knowledge Creation and Data Protection

How an enterprise data strategy enables both

“When we talk about this idea of knowledge creation, we’re really focused on what it means to do research in a commercial or corporate setting, as distinct from scientific research typically conducted in academia,” says Barbara Lawler, COO and Senior Strategist of The Information Accountability Foundation (IAF).

Importantly, “when we think about it in a data protection context, very often data protection and privacy laws don’t specifically address, in an effective or practical way, what we mean by knowledge creation.”

Lawler moderated a discussion at the December 2021 Spokes Technology Privacy Conference concerning the tension between the importance of knowledge creation, or “thinking with data,” and the concepts of data privacy and security. The conversation, “Knowledge Creation and Data Protection: How an Enterprise Data Strategy Enables Both,” included JoAnn Stonier, Chief Data Officer of Mastercard, and Martin Abrams, Executive Director and Chief Strategist of the IAF.

Thinking with data vs. acting with data

Simply put, in the arena of data privacy and protection, knowledge creation refers to the process of using data, including data pertaining to people and their actions, to create new insights. However:

Commercial research isn’t usually considered an explicit legal basis or legitimate interest for data processing. And while scientific research has some support in the law, it is subject to differing interpretations. It tends to resolve down to a pretty narrow reading: what happens in an academic or university setting, conducted by specific individuals, with oversight by some type of review board process.

–Barbara Lawler, IAF

But as Lawler notes, “If knowledge creation is about thinking with data or discovering insights in a commercial or corporate setting, knowledge application is the next step: taking action.” For example: developing a scoring system is thinking with data. Applying that score – for lending or hiring decisions – is acting with data.

The risk profiles are different, and there is friction when policymakers and implementers are not clear on the differences. (Much of the debate we are seeing is about acting with data more than thinking with data.)

Data protection and data ethics

Mastercard’s Stonier observes that “regulations don’t come at these issues head-on” and as organizations innovate, they “are pretty much left on their own to figure out what is ethical, responsible data innovation. What does that look like from a practical application perspective?”

Data governance and data responsibility principles are really the backbone of our program at Mastercard. From there, we develop our principles and guidelines, and ultimately our practices and controls. Then, in the product development process, we apply those as we actually build out products and solutions. We think about it as part of the design process.

—JoAnn Stonier, Mastercard

Ultimately, “thinking about what constitutes responsible data practices – as opposed to just privacy or security – comes down to what you are trying to effectuate,” opines Stonier. And “at the center is the individual. Almost everything that you do in an organization, even in a B2B2C company like Mastercard, the choices made are going to be impactful to the individual.”

Responsible data strategy combines effective data governance, scalable tools, and knowledge.

Responsible data strategy first principles

Mastercard begins with what Stonier calls “a set of first principles”:

  1. Individuals own their data;
  2. Individuals should have the ability to control their data;
  3. Individuals should be able to understand how their data is being used; and
  4. Individuals should be able to understand the benefits they get from that data use.

“Individuals obtaining an understanding of the benefits of their data use is where I think it sometimes comes off the rails a little bit,” says Stonier. “I don’t know that all of us are transparent enough about how we’re using the data. And that’s the harder one.”

“We can try to have privacy notices, we can attempt real-time privacy notices, but that doesn’t always work. The question becomes, how do you incorporate transparency into a product design so individuals can have that understanding?”

Mastercard is known for privacy and security…but when you get beyond that, it is really about being accountable for how we’re going to use the data and being willing to talk about it. To be transparent. And this requires integrity in the process – data quality, lineage, accuracy, completeness, consistency – and it also needs to encompass innovation practices.

—JoAnn Stonier, Mastercard

“These principles really make a difference as you build practices for each different type of data set and each different context in which you use data. They must suffuse expertise, training, tools, and platforms, all the way through to the sales force. All of us have a role in innovation. We’re all at the table designing together. And when everybody’s part of a design team and you have diverse subject matter expertise, that makes a huge difference in getting to the right outcomes.”

The polarization challenge

IAF’s Abrams traces the genesis of this thinking and the current challenges. He notes that “a process took place beginning in 2009, with the essential elements of accountability being developed – and beginning to be embraced by organizations and regulators – through active discussion. This reached a point where the concept of knowledge creation – thinking with data and acting with data, not explicitly understood by individuals but essential for an innovative marketplace – was explicitly articulated.”

“Policymakers and regulators needed a sense that there was something that was beneficial to society and people that supports this concept of doing big data and the work in artificial intelligence.”

But the risks that come with thinking with data – such as data security, or understanding that there’s bias related to data – are very different from the risks of the activities related to applying the data.

There was this process to differentiate the risks of developing something that is not personally impactful from the risks of making decisions that are personally impactful.

—Martin Abrams, IAF

“Since 2018 we’ve seen a polarization in the way people think about data, which creates risks for organizations’ ability to develop and use data,” continues Abrams.

“Back in 2016 we would think about the individual’s place in this environment, and it was a question of knowledge and autonomy. It was about transparency and the ability to understand the decisions individuals were making. Today we see regulators who say: folks don’t know how to do legitimate interest balancing; we won’t trust the private sector to do research; you have to have complete, explicit consent in order to think with data.”

“It’s about the concept of individual sovereignty that is different from the concepts of autonomy. That I have an ownership right in the data that pertains to me.”

Individual sovereignty vs. knowledge creation

“If you’re putting incredible weight on individual sovereignty, based on the perception of ‘data extraction’ (the term used by the FTC),” says Abrams, “you have to shift the concept of what trustworthy processing is. Making the balance between sovereignty and data innovation work leads to a much more restrictive use of data.”

But, as Abrams notes, “If you’re a person in the EC who’s responsible for digital innovation, in the Canadian Government trying to push artificial intelligence as the new economic growth factor, or you’re in Hong Kong trying to create an innovative space, you say that knowledge creation and data innovation are much more important. And the ‘trustworthy path forward’ begins to shift.”

“If you’re thinking about what it means to do what Mastercard is doing, you have to think about it step by step. And the first step is that every organization has to engage in responsible knowledge creation. There is no way an organization can be successful today if it’s not using data to improve its products to serve consumers better, or to create societal value like new vaccines,” suggests Abrams.

“Even if you are the best at innovating in your own industry, the only way we will come up with the best innovations is through data partnerships and through methodologies where you can do trusted data sharing,” insists Stonier. And “let’s face it, data ecosystems are here to stay.”

Responsible data sharing to create the kind of knowledge we’re discussing requires responsible practices. It does not mean sharing raw data, and it does not mean sharing your sensitive data. It means coming up with practices that are responsible, accountable, transparent, and that have a lot of integrity in methodology. There are all sorts of methods that should be explored that do meet the spirit of the regulation.

—JoAnn Stonier, Mastercard
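Stonier does not name specific techniques, but two commonly explored methods fit her description of sharing insight without sharing raw or sensitive data: replacing direct identifiers with pseudonyms, and sharing only aggregate statistics with small groups suppressed. A minimal sketch of the idea – the function names, salt handling, and group-size threshold here are illustrative assumptions, not anything Mastercard describes:

```python
import hashlib
from collections import defaultdict

def pseudonymize(user_id: str, salt: str) -> str:
    """Replace a direct identifier with a salted hash (illustrative only)."""
    return hashlib.sha256((salt + user_id).encode()).hexdigest()[:12]

def share_aggregates(records, min_group_size=5):
    """Share per-category counts only, suppressing groups too small
    to share safely (a k-anonymity-style threshold)."""
    counts = defaultdict(int)
    for rec in records:
        counts[rec["category"]] += 1
    return {cat: n for cat, n in counts.items() if n >= min_group_size}
```

Production-grade data partnerships would go further – salts managed as secrets, and formal guarantees such as differential privacy – but the sketch captures the principle Stonier describes: partners receive derived, thresholded insight rather than the underlying records.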

Responsible, trusted, and answerable

Trust is earned in drops and lost in buckets.

—JoAnn Stonier, Mastercard

“To be trusted, organizations have to be responsible and answerable. This is explicitly part of almost every law passed since the GDPR,” says Abrams. “Organizations need to understand what they’re doing and the impact on all stakeholders affected by the use of data, including research.”

To be answerable, organizations have to be proactive in demonstrating their accountability, and stand ready to do so, as Mastercard’s principles articulate. Frameworks based on risk assessment and effective data governance will enable beneficial data-driven innovation while protecting individuals and society from potential harms.

—Martin Abrams, IAF

“If we’re asking organizations to be innovative with data and we’re asking the public to trust that innovation,” opines Abrams, “we have to have enforcement agencies that go beyond enforcement to be effective oversight agencies. This requires a different set of skills: the ability to determine that when organizations say they have integrity in their processes and the competency to execute on those processes, they in fact do.”

From Mastercard’s perspective, “we really need regulators to understand what good practice looks like and how we can actually demonstrate the steps responsible organizations are going through, and how we can level that playing field so expectations are similar regardless of context,” says Stonier.

“Only then can we talk about fairness in comparing different organizations trying to solve the same thing. Then you can get to ‘is this fair,’ ‘is this ethical,’ within the same context.”

“I think we have a ways to go. Regulators need to understand that. That’s why I believe in these principles, these practices.”