Would Gaze-Contingent Rendering Improve Depth Perception in Virtual and Augmented Reality?

Published 24 May 2019 in cs.HC (arXiv:1905.10366v1)

Abstract: Near distances are overestimated in virtual reality, and far distances are underestimated, but an explanation for these distortions remains elusive. One potential concern is that whilst the eye rotates to look at the virtual scene, the virtual cameras remain static. Could using eye-tracking to change the perspective of the virtual cameras as the eye rotates improve depth perception in virtual reality? This paper identifies 14 distinct perspective distortions that could in theory occur from keeping the virtual cameras fixed whilst the eye rotates in the context of near-eye displays. However, the impact of eye movements on the displayed image depends on the optical, rather than physical, distance of the display. Since the optical distance of most head-mounted displays is over 1m, most of these distortions will have only a negligible effect. The exceptions are 'gaze-contingent disparities', which will leave near virtual objects looking displaced from physical objects that are meant to be at the same distance in augmented reality.
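The geometry behind the abstract's claim can be illustrated with a back-of-envelope model (this sketch is not taken from the paper; the rotation-centre offset, distances, and small-angle formula are assumptions for illustration). When the eye rotates about a centre roughly 10 mm behind the pupil, the pupil translates sideways; content on a display at optical distance d_opt then shifts in apparent direction by about t/d_opt, while a physical object at distance d_obj shifts by about t/d_obj, leaving a mismatch of t·|1/d_obj − 1/d_opt|:

```python
import math

def gaze_disparity_mismatch(pupil_shift_m, object_dist_m, optical_dist_m):
    """Small-angle estimate (radians) of the apparent offset between a
    physical object and virtual content meant to sit at the same distance,
    when the pupil translates by pupil_shift_m while the virtual camera
    stays fixed. Illustrative model only, not the paper's derivation."""
    return abs(pupil_shift_m * (1.0 / object_dist_m - 1.0 / optical_dist_m))

# Assumed: eye rotation centre ~10 mm behind the pupil, so a 30-degree
# gaze rotation translates the pupil by about r * sin(theta) ~ 5 mm.
pupil_shift = 0.010 * math.sin(math.radians(30))

# Display optical distance of 1 m (the abstract's threshold), with one
# near AR object (0.3 m) and one farther object (2 m).
near = gaze_disparity_mismatch(pupil_shift, 0.3, 1.0)
far = gaze_disparity_mismatch(pupil_shift, 2.0, 1.0)

print(f"near-object mismatch: {math.degrees(near):.2f} deg")
print(f"far-object mismatch:  {math.degrees(far):.2f} deg")
```

Under these assumed numbers the near object shows a mismatch several times larger than the far one, consistent with the abstract's point that gaze-contingent disparities matter mainly for near virtual objects in augmented reality.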
