Culture Matters for Data Privacy
Written by Virginia Bartlett, Privacy Operations Expert, WireWheel
Understanding corporate culture is a critical first step toward building a data privacy program
During my tenure as a privacy officer in companies across many different industries, I’ve seen differing approaches to data privacy, some more successful than others. Some companies are highly regulated, with the entire business overseen by very strict regulations and huge regulatory compliance departments. Others have little to no compliance structure and an approach to data privacy that is, shall we say, a little more “relaxed.”
But one thing is for certain: what all of these companies have in common is a corporate mindset that influenced, informed, and ultimately drove how data privacy decisions were made and enforced.
Mindset
Broadly speaking, there are three types of corporate structures that have a direct impact on how data privacy is handled.
The first is a well-regulated, formal corporate structure, usually employing long-standing quality processes. Privacy programs here have a well-defined hierarchy: a lawyer or compliance officer sits inside the legal department, and multiple approvals are required to send out an assessment, for example.
On the other end of the spectrum you’ll find the Silicon Valley startup-type mindset, which tends to exemplify much of the tech sector. What these companies are doing with data and information is not really regulated, except through a smattering of consumer protection laws. When it comes to the structure and culture of the company, you won’t find an environment where they’re checking and double-checking how they’re touching data; in fact, you’ll find the reverse.
Sometimes in these companies, an engineer or an IT professional has been assigned to handle privacy, rather than a legal professional, such as you’d find in the more regulated corporate structure. Start-up tech companies talk about privacy in a very different way—they build their programs very much from a systems and application standpoint, leaning on automated tools to make data privacy processes more efficient. These companies do everything possible with data to maximize profits, so the culture of the company is more tech- and data-centric, rather than a culture oriented around data compliance.
Between these two, you’ll find something else—a combination of the more formal and the less formal. Here, companies may rely on a data broker, for example, who is responsible for processing information in a way that is compliant with the law, and selling it in a compliant way as well. Companies operating under this kind of arrangement generally do not have a highly structured compliance and regulations team, and the privacy program might reside within an administrative office, an operations office, or even within the data management office.
Reporting
Knowing your own corporate culture makes a big difference when setting up a privacy program, but how you report privacy issues may also depend on company culture—with varying degrees of success. A company with a more rigorous approach to compliance will handle reporting differently from the Silicon Valley start-up. Sometimes reporting might go all the way up to the board, or sometimes there might be no reporting at all.
For highly regulated companies, privacy and compliance may be reported all the way up to the board, as mitigating legal and litigation risk is important to this type of corporate culture, especially at a publicly traded company.
Apple, for example, treats privacy as a marketing advantage by touting its privacy controls. Apple is not a highly regulated company, but its corporate culture is such that it has voluntarily decided to safeguard customer data more strenuously—and in doing so, has tied its data privacy reporting to profits. In other words, data privacy funding and reporting at Apple are no doubt reviewed by senior-level people, since privacy is an inherent part of the culture.
Smaller companies, especially those where privacy is driven by the IT department, may struggle with privacy issues simply because their culture is oriented around risk as defined by standards and assessments, such as ISO. But in the privacy world there aren’t “standards,” per se, but rather laws—and there’s sometimes friction within the IT world over how to report compliance when a legal obligation doesn’t map neatly onto a risk.
What’s the Regulatory Environment?
Culture comes into play when it comes to data processing, and how it’s regulated within a company’s particular industry.
A small company, for example, might only be processing data within the United States, and that means something very different from, say, a large corporation operating on an international level that is subject to differing laws across multiple jurisdictions.
Those are all different regulatory jurisdictions that will have differing impacts on the corporate culture and how it views data privacy. A data privacy program must match the regulatory environment of the company—what works for a multinational will not work for a small company only doing business in the US.
No matter the size or scope of the company, it’s critical to know what privacy processes and programs are already in place—and where the data is located. Are there any data governance, data management, or information security programs in place? Has any work already been done for security assessments? Get a handle on what’s already in process and build efficiencies from there.
The culture within a smaller company will make it easier to add privacy assessments to existing processes, while a larger company would most likely need to stand up privacy assessments as a separate function.
The regulatory environment for the company might also have some privacy requirements built-in. Companies complying with the Sarbanes-Oxley Act might be conducting a security assessment already, or a financial services company might be required to conduct a security assessment before signing a contract.
In short, the regulatory environment of a company influences its cultural approach to data privacy—and there might be opportunities within these regulatory structures to incorporate additional data privacy measures.
Data Ethics and Scoping
Data ethics is, simply, using data in the right way, according to the permissions the company has been given or granted. Companies will have differing approaches to data ethics according to their culture. Some will, of course, take it more seriously than others. The important thing for a data privacy expert to keep in mind is that there may already be some data processes in place that can be built upon when creating a privacy program that’s a good fit for the company’s culture.
The security team, for example, might already have an incident response process set up that can be expanded upon or built into a new privacy program.
Scoping merely means scaling the data privacy program according to the categories of information the company handles. The larger the company, the larger the program. It’s important to build a program based on the type of data that’s handled, the priorities of that company, and the risks of mishandling that data from a regulatory perspective. Again, the culture of the company will define these parameters. A small company, for example, might elect to do the minimum necessary to comply, especially if they only have five employees and only deal with client data. A large corporation dealing with third-party vendors with offices in Europe (and subject to GDPR) will, naturally, require a data privacy program of a very different scope.
How You Talk About Privacy Matters, Too
How does your company communicate about privacy? Is it reported to a board of directors? Are privacy issues communicated during stand-up meetings for compliance? These are important considerations because how the information is communicated can have a direct impact on the efficacy of that program.
For example, if privacy issues are communicated in ways outside of “normal business operations,” and not inherent to the normal flow of the business process, it can feel like a disruption, and will face an uphill battle when it comes to widespread adoption within the organization.
Yet another important element under the communication umbrella is how a company discusses controversy internally. That is, does your company reward whistleblowers when they come forward with information—no matter how damaging—or do whistleblowers face retribution and condemnation? When it comes to identifying issues or audits within your privacy program, anyone who wishes to report a privacy issue should feel protected to do so. If you’re building a privacy program from scratch, it’s crucial to emphasize that those coming forward with important information will be rewarded, not punished. Some companies have established hotlines where whistleblowers can leave anonymous tips, and those tips go through a third-party for vetting and follow-through.
Keeping those discussions around data privacy open and judgment-free will make the work easier for the data privacy team, and help them to be more proactive.
Prioritizing
Over the last five or six years, we’ve seen that the approach to data privacy has been largely driven by GDPR—with an increase in the depth and breadth of data regulation. That means lawmakers are increasingly looking at the numerous ways data is collected, transferred, stored, and used, and the implications of that. It’s no longer adequate to put a “privacy policy” on a company’s website and be done with it.
In some of the latest laws, as well as those coming down the pike, you’ll find requirements for things like impact assessments and privacy-by-design reviews, or knowing where company data is located. That means that one of the challenges for every privacy program manager is figuring out what to prioritize.
Regardless, taking stock of the company’s culture is the one part of the process that can’t be ignored—and is the first step in building a data privacy program that’s going to be nimble enough to handle the new privacy regulations that are most certainly coming.
Different Cultures: Considerations for Program Scope
Categories of Information
- Regulated Records – Information defined in law as requiring specific protections. Examples: Financial records, Tax records, Customer purchases and activities….
- All Data
- Only Data About People – The definition of “Personal Information” varies by jurisdiction, but generally covers any information that can be used directly or indirectly to identify or contact a person.
- Only Data About Some People – Will my program cover consumers and potential consumers, website visitors, data belonging to my clients, or HR and applicant data?
- Sensitive Data About People – “Sensitive Personal Information” is Personal Information specifying the medical or health conditions, racial or ethnic origin, political opinions, or religious or philosophical beliefs of an individual.
Categories of Regulations
- Jurisdictional Scope – Will you have one common standard across all regions, or approaches that distinguish by region?
- Intersection with Security/Record Retention – Security and privacy objectives overlap: security assessments demonstrate that personal information is adequately protected. How will you coordinate in documenting adequate protection?
- Vendor Assessments – Typical privacy laws require that your vendors have contracts and manage their programs to your standards. Will your program be responsible for these types of decisions?
- Data Ethics/Responsible Use – Will your program cover AI algorithms and data use ethics? Will you do any kind of social responsibility transparency reporting?
- Incident Response – Will your program be involved with responding to incidents and breaches? Who will report these metrics?