Eye Tracking in VR Headsets: The Privacy Conversation Nobody's Having
Eye tracking has moved from premium feature to standard inclusion in mid-range and high-end VR headsets. The Meta Quest Pro shipped with it. The Apple Vision Pro relies on it as a primary input method. The PlayStation VR2 uses it for foveated rendering. By late 2026, it’s reasonable to expect eye tracking in most headsets above the entry-level price point.
The technical case for eye tracking in VR is strong. It enables foveated rendering, which dramatically reduces the computational load by rendering high detail only where the user is looking. It provides a natural input method — look at something and select it. It enables more realistic social interaction in virtual environments by tracking where participants are looking during conversations.
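The rendering saving comes from a simple rule: detail falls off with angular distance from the gaze point, mirroring the roughly two-degree high-acuity foveal region of human vision. A minimal sketch of that region selection, with illustrative thresholds and shading rates (the real pipelines are GPU-side and vendor-specific):

```python
import math

def shading_rate(angle_from_gaze_deg: float) -> float:
    """Fraction of full resolution to render at, given angular
    distance (degrees) from the current gaze point.
    Thresholds are illustrative, not from any shipping headset."""
    if angle_from_gaze_deg <= 2.0:    # foveal region: full detail
        return 1.0
    elif angle_from_gaze_deg <= 10.0: # parafoveal: reduced detail
        return 0.5
    else:                             # periphery: coarse detail
        return 0.25

def angular_distance(gaze, point) -> float:
    """Angle in degrees between two unit direction vectors."""
    dot = sum(g * p for g, p in zip(gaze, point))
    dot = max(-1.0, min(1.0, dot))  # clamp for floating-point safety
    return math.degrees(math.acos(dot))
```

Note that this scheme needs only a gaze direction vector per frame, a point that matters for the data-minimisation discussion below.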
But eye tracking data is extraordinarily intimate. Where you look, how long you look, how your pupils dilate, how your gaze patterns shift under different stimuli — this data reveals things about the user that go far beyond what they consciously intend to share. And the conversation about how this data is collected, stored, shared, and used has not kept pace with the technology.
What Eye Tracking Data Reveals
Research in eye tracking and cognitive science has established that gaze patterns are deeply informative about internal mental states.
Attention and interest. Where someone looks and for how long directly indicates what captures their attention. In a virtual store, eye tracking reveals which products the user actually looked at, not just which aisle they walked down. In a virtual meeting, it reveals who the user watched most closely and whose contributions they ignored.
Emotional response. Pupil dilation is an involuntary physiological response linked to emotional arousal and cognitive load. Studies published in PLOS ONE have demonstrated that pupil dilation patterns can indicate whether someone is experiencing positive or negative affect with reasonable reliability.
Health indicators. Eye movement patterns are biomarkers for neurological and neurodevelopmental conditions, including early-stage Parkinson’s disease and ADHD. Longitudinal eye tracking data collected through regular VR use could theoretically screen for conditions the user hasn’t been diagnosed with.

This isn’t speculative. These capabilities are documented in peer-reviewed research. The question isn’t whether eye tracking data is sensitive — it clearly is. The question is whether VR headset manufacturers and application developers are treating it with appropriate care.
Current Industry Practice
The short answer is: it varies, and the details are often unclear.
Meta collects eye tracking data from Quest Pro and Quest 3 headsets. Their privacy policy states that eye tracking data is processed on-device for features like foveated rendering, and that Meta doesn’t use eye tracking data from standard features for advertising. However, the policy language around research, product improvement, and third-party application access is broad enough to create ambiguity.
Apple has taken a stronger privacy stance with Vision Pro. Eye tracking data is processed on-device and is explicitly not shared with applications — apps receive only the final input selection, not the underlying gaze data. This is architecturally enforced, not just policy-enforced, which is a meaningful distinction. Applications literally cannot access raw eye tracking data through Apple’s APIs.
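The architectural distinction can be made concrete. The idea is that raw gaze data lives only inside a trusted system component, and the application-facing surface exposes nothing but the final selection. A hypothetical sketch (the class, method names, and alignment threshold are all invented for illustration, not Apple's actual API):

```python
class GazeInputMediator:
    """Illustrative sketch of architectural enforcement: raw gaze
    data stays private to this system-level component, and apps
    only ever receive the identifier of the selected target."""

    def __init__(self):
        self._raw_gaze = None  # never exposed outside this class

    def _update_gaze(self, direction):
        # Called by the trusted system compositor, not by apps.
        self._raw_gaze = direction

    def selected_target(self, targets):
        """Return the id of the target best aligned with the current
        gaze, or None. This is the only output apps can observe."""
        if self._raw_gaze is None:
            return None
        best_id, best_dot = None, 0.95  # require near-alignment (illustrative)
        for target_id, direction in targets.items():
            dot = sum(g * d for g, d in zip(self._raw_gaze, direction))
            if dot > best_dot:
                best_id, best_dot = target_id, dot
        return best_id
```

Under this design, an application learns that the user selected a button; it cannot reconstruct where the user looked before selecting it, which is exactly the data that reveals attention and interest.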
Sony’s PSVR2 uses eye tracking primarily for foveated rendering and game input. Sony’s data handling policies are game-specific rather than platform-level, which means the privacy treatment varies by application.
Third-party VR applications present the greatest concern. Applications built for PC VR platforms like SteamVR can access raw eye tracking data from compatible headsets. What the developer does with this data is governed by their own privacy policy, not the headset manufacturer’s.
What Should Happen
Several principles should guide the industry’s approach to eye tracking data.
Minimisation. Applications should collect only the eye tracking data they need for their stated functionality. A game that uses eye tracking for foveated rendering needs gaze direction. It doesn’t need pupil diameter, saccade velocity, or fixation duration. Collecting more than is needed violates basic data minimisation principles.
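Minimisation is straightforward to implement if the platform enforces it at the API boundary: filter each gaze sample down to the fields the declared purpose requires before any application code sees it. A minimal sketch, with hypothetical field names standing in for whatever a real sensor stack produces:

```python
from dataclasses import dataclass

@dataclass
class RawGazeSample:
    """Fields a headset sensor stack might produce (illustrative)."""
    direction: tuple          # unit gaze vector, needed for rendering
    pupil_diameter_mm: float  # sensitive: emotional/cognitive signal
    fixation_ms: float        # sensitive: attention signal

# Fields a foveated-rendering purpose actually requires.
ALLOWED_FOR_RENDERING = {"direction"}

def minimise(sample: RawGazeSample, allowed=ALLOWED_FOR_RENDERING) -> dict:
    """Pass through only the fields the declared purpose requires."""
    return {k: v for k, v in vars(sample).items() if k in allowed}
```

The point of putting the allow-list in platform code rather than application code is that the developer never has to be trusted to discard pupil diameter or fixation duration; they never receive it.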
On-device processing. Where possible, eye tracking data should be processed on the headset and never transmitted to external servers. Apple’s approach with Vision Pro demonstrates that this is technically feasible without compromising functionality.
Transparency. Users should understand what eye tracking data is being collected, how it’s being used, and who has access to it. Current privacy policies are inadequate — they’re too long, too vague, and too difficult for non-lawyers to interpret.
Purpose limitation. Eye tracking data collected for rendering optimisation should not be repurposed for advertising, behavioural analysis, or health screening without explicit, informed, and freely given consent.
The VR industry has an opportunity to get this right before the privacy harms materialise. The alternative — waiting for a scandal and then responding with damage control — is the pattern we’ve seen with social media data practices, and it hasn’t served anyone well.
Eye tracking makes VR better. The technology itself isn’t the problem. The problem is an industry that hasn’t yet developed adequate norms, standards, and safeguards for one of the most intimate data sources any consumer device has ever collected. That conversation needs to happen now, not after the first major breach or misuse.