
Data Privacy and Virtual Reality (VR)


“In a single recommended 20-minute session in VR, a headset can generate approximately two million data points and unique recordings of body language. Immersive tech can track huge quantities of highly sensitive information, creating both lucrative opportunities for businesses and a wide spectrum of privacy and reputational risks to users.”

—Jerome, 2021

As detailed in a recent report authored by Future of Privacy Forum (FPF) policy counsel Jeremy Greenberg and Common Sense Media’s Joseph Jerome, virtual reality devices “rely upon the collection…and the processing of sensitive data, including users’ biometric data, unique device identifiers, and location information about the interior of homes and businesses, and more. Without this data, XR[1] technologies cannot function safely and effectively” (Greenberg & Jerome, 2021).

This exhaustive array of information is gathered by cameras, motion and depth sensors, inward-facing sensors (to analyze eye gaze and pupil dilation), and microphones. Data processed includes geolocation information derived from Wi-Fi, Bluetooth, and IP address. Log files, device identifiers, and hardware and software information are also collected, as are data about applications purchased and used, time spent using specific features, and various other telemetry. Your interactions with other users are also recorded.

Surveilling Ourselves

As XR technologies become more popular, developers and XR platforms will gain access to rich new sources of information about individuals…Further, the social mores and norms for virtual worlds and augmented physical space are only developing. HMDs [head mounted displays] raise many questions about privacy, free expression, and evolving digital norms.

—Greenberg & Jerome (page 3)

The immense value of all these data points is not lost on the adtech industry. The opportunity to combine these data with purchase and financial information, contacts, content, search, and browsing history, along with other sensitive data, is rightly a concern of privacy advocates like FPF’s Jeremy Greenberg. Facebook’s acquisition of Oculus, maker of the Quest virtual reality gaming headset, highlights that concern.

Coining the term “biometric psychography,” Brittan Heller, an attorney specializing in technology and human rights at Foley Hoag, opines that capturing information like eye gaze and pupil dilation “is like a ‘Like’ button on steroids” (Kahn, 2021). And what emerges from FPF’s report is that absent significant safeguards, the potential for substantial harm at the individual and societal level looms large.

The Significant Benefits of Virtual Reality

The primary conclusion from the fourth annual Augmented and Virtual Reality Survey…found growing momentum for nearly every area of immersive technology’s use and, importantly, expanding avenues for monetization. Industry experts are increasingly bullish about the wide range of XR applications, from surgical training for medical students to simulated Mars Rover repairs for NASA engineers.

—Perkins Coie LLP, 2020

As the report notes, though “today gaming is the primary driver of app development, education and training applications also have seen significant uptick amongst developers” (page 10). “XR technologies are used in fields including gaming, military training, architectural design, education, social skills training, medical, simulations, and psychological treatment, among others” (page 2).

Retail has embraced VR for employee training. Big retailers like Walmart and Verizon are using this technology “to train staff on new equipment, or how to manage a difficult customer situation.” The simulation gives employees the chance “to practice in a virtual controlled environment before doing so on the job,” reports Greenberg.

The industrial sector and the military also use virtual reality for training. It is easy to comprehend the value of training someone on dangerous equipment like a nuclear reactor, handling hazardous chemicals, learning to fly, or navigating a military engagement scenario in a simulated environment, where inexperience and error do not result in tragedy.

K-12 and higher education are burgeoning sectors for virtual and mixed reality deployments. Children can be taken on field trips and travel through time to walk the streets of ancient Egypt. Dinosaurs can roam the classroom.

In higher education, universities are using virtual or augmented reality science labs to give students the opportunity to study in what would otherwise be a cost-prohibitive lab, or to conduct potentially dangerous experiments safely. “MR headset makers and medical schools promise to revolutionize medical training, diagnostics, and even surgery” (Greenberg & Jerome, page 13).

The Significant Risks to Privacy of Virtual Reality

What are the five most important aspects of VR technology? The punch line: Tracking. Tracking. Tracking. Tracking. Tracking.

—Jeremy Bailenson[2]

Zeroing in on the risk described by Heller, Greenberg opines that “a lot of the future-looking risks” center around the idea of a user encountering specific content and how their “pupil size changes in response to content or how long they stared at that content.” This can provide “insight into how interested the user is in that content and can potentially be used in a number of ways.” Eye-gaze data (the “like button on steroids”) could be used in an adtech context to enhance behavioral advertising. In fact, as the report notes, “eye tracking has been termed the holy grail of marketing” (page 17). But this is a relatively benign potential for abuse.

Eye tracking “can lead to conclusions – whether accurate or not – about a user’s age, gender, or race. Gaze can also be used to diagnose medical conditions; eye movements and pupil response can be used to diagnose ADHD, autism, and schizophrenia” and “used to ascertain conclusions about a user’s sexual attraction…With no real capacity to control this data stream, XR users will have to trust this information is used responsibly” (Greenberg & Jerome, page 17). Consider the adverse implications this portends in a work or educational setting.

Trusting that the information is used responsibly is a big ask considering the machinations of adtech and the data industrial complex writ large, to say nothing of the potential for data breaches. Importantly, as Greenberg implies, the interpretation of this data is not necessarily accurate and can assign erroneous behavior, even thoughts, to an individual. This raises long-standing and still-unresolved issues that harken back to Warren and Brandeis’s 1890 Harvard Law Review article “The Right to Privacy.”

Regulations and Guidance Do Not Comprehend Virtual Reality

Realistic avatars raise fundamental questions about an individual’s ownership and control over their digital identity. With lawmakers and regulators increasingly interested in (1) “deepfakes” and other digital manipulations of audiovisual representations and (2) how developers deploy nudges and other manipulative digital interfaces, it suggests the rules and norms around avatars could become a potential flashpoint…

—Greenberg & Jerome (page 19)

Interestingly, as recent as the passage of the GDPR, CCPA, CPRA, and VCDPA is, these “laws were written without contemplating this technology,” says Greenberg. Some biologically related data and biometrics may fall under the CCPA’s or GDPR’s broader definitions, but Illinois’ biometric legislation, for example, provides a much narrower definition, and much of this data might not fall under that particular law.

Importantly, what isn’t captured under these laws is biometrically derived data – Brittan Heller’s “biometric psychography” – warns Greenberg. “The inferences or conclusions that could be drawn based on something like my eye gaze and how I’m viewing or perceiving certain content in VR are really not covered by current law,” he says.

Furthermore, de-identified data does not fall under various requirements of these laws. But “in VR it’s starting to become clear that even if data like head and hand movements is de-identified, it is very easy to re-identify the individual using even rudimentary machine learning techniques like Random Forest.”[3]
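To make the re-identification concern concrete, here is a minimal, hypothetical sketch (not taken from the report): it generates synthetic per-session motion summary features for a set of users, strips the identifiers, and trains a scikit-learn Random Forest to match “de-identified” sessions back to the user who produced them. All parameters (number of users, sessions, features, noise levels) are illustrative assumptions.

```python
# Illustrative sketch: "de-identified" VR motion traces as a behavioral
# fingerprint. Each user has a characteristic motion signature; sessions
# are noisy samples around it, and a Random Forest re-identifies the user.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_users, sessions_per_user, n_features = 20, 30, 12

# Hypothetical per-user "motion signature" (e.g., mean head/hand kinematics).
signatures = rng.normal(0.0, 1.0, size=(n_users, n_features))

# Each session = the user's signature plus session-to-session noise.
X = (np.repeat(signatures, sessions_per_user, axis=0)
     + rng.normal(0.0, 0.5, size=(n_users * sessions_per_user, n_features)))
y = np.repeat(np.arange(n_users), sessions_per_user)  # true user labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

accuracy = clf.score(X_test, y_test)
print(f"re-identification accuracy: {accuracy:.0%} (chance: {1 / n_users:.0%})")
```

On this toy data the classifier identifies users far above the 5% chance rate, which is the point: removing names does not remove the behavioral signal in the movements themselves.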

Something else to think about is deletion and access rights. “It will be very interesting to see how this plays out with VR…How do you make sense of that for the user? How do you really provide access to all of this data?” ponders Greenberg. How would eye-tracking data be presented to a user pursuant to a DSAR request in a manner that is easily understood? How would a user correct an erroneous profile created from biometrically derived data? And what about data collected about bystanders when devices are used in public?

“I would definitely encourage companies and developers to think about the privacy implications, even though they might be in compliance with one of these legal regimes. There are definitely still a number of privacy and ethical concerns,” advises Greenberg.

If law is downstream from culture, technological advancement is a class VI whitewater rapid. Extended reality (VR, AR, and MR) is the latest in a long line of technologies that the law has struggled to comprehend since Michigan Supreme Court Judge Thomas M. Cooley opined in 1879 that “the right to one’s person may be said to be a right of complete immunity: to be let alone.”

From Kodak fiends to Glassholes. From Roberson v. Rochester Folding Box Co. et al. (171 N.Y. 538) to New York State’s November 2020 passage of legislation that explicitly extends an individual’s publicity rights to digital replicas. Questions regarding identity ownership, right to privacy, and what comprises the self – constantly evolving – remain stubbornly unanswered.
