ScanGAN360: A Generative Model of Realistic Scanpaths for 360$^{\circ}$ Images (2103.13922v1)
Abstract: Understanding and modeling the dynamics of human gaze behavior in 360$^{\circ}$ environments is a key challenge in computer vision and virtual reality. Generative adversarial approaches could alleviate this challenge by generating a large number of possible scanpaths for unseen images. Existing methods for scanpath generation, however, do not adequately predict realistic scanpaths for 360$^{\circ}$ images. We present ScanGAN360, a new generative adversarial approach to address this challenging problem. Our generator network is tailored to the specifics of 360$^{\circ}$ images representing immersive environments. Specifically, we accomplish this by leveraging a spherical adaptation of dynamic time warping as a loss function and by proposing a novel parameterization of 360$^{\circ}$ scanpaths. The quality of our scanpaths outperforms competing approaches by a large margin and is almost on par with the human baseline. ScanGAN360 thus allows fast simulation of large numbers of virtual observers, whose behavior mimics that of real users, enabling a better understanding of gaze behavior and novel applications in virtual scene design.
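To give intuition for the "spherical adaptation of dynamic time warping" mentioned in the abstract, below is a minimal sketch of how a DTW cost between two scanpaths can use great-circle distance as its local cost, so that alignment respects the geometry of the sphere rather than the distorted planar geometry of the equirectangular projection. This is an illustrative assumption, not the authors' implementation: the function names are hypothetical, scanpaths are assumed to be sequences of (latitude, longitude) gaze points, and the paper's training loss would additionally need a differentiable (soft-DTW-style) variant to serve as a GAN generator loss.

```python
import numpy as np

def spherical_distance(p, q):
    """Great-circle distance (radians) between two gaze points.

    Each point is (latitude, longitude) in radians. This parameterization is an
    assumption for illustration; the paper's scanpath parameterization may differ.
    """
    lat1, lon1 = p
    lat2, lon2 = q
    # Haversine formula, numerically stable for nearby points.
    a = (np.sin((lat2 - lat1) / 2.0) ** 2
         + np.cos(lat1) * np.cos(lat2) * np.sin((lon2 - lon1) / 2.0) ** 2)
    return 2.0 * np.arcsin(np.sqrt(np.clip(a, 0.0, 1.0)))

def spherical_dtw(path_a, path_b):
    """Dynamic time warping cost between two scanpaths on the sphere.

    path_a, path_b: arrays of shape (N, 2) and (M, 2) holding (lat, lon) in radians.
    The local cost is the great-circle distance, so the alignment cost reflects
    angular separation on the 360-degree sphere.
    """
    n, m = len(path_a), len(path_b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = spherical_distance(path_a[i - 1], path_b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Two toy scanpaths of 30 fixations each, sampled uniformly on the sphere's angles.
    a = np.stack([rng.uniform(-np.pi / 2, np.pi / 2, 30),
                  rng.uniform(-np.pi, np.pi, 30)], axis=1)
    b = np.stack([rng.uniform(-np.pi / 2, np.pi / 2, 30),
                  rng.uniform(-np.pi, np.pi, 30)], axis=1)
    print("Spherical DTW cost:", spherical_dtw(a, b))
```

In a training setting, the hard `min` in the recursion would typically be replaced by a soft-min so that gradients can flow back to the generator; the sketch above only illustrates why a spherical local cost is the natural choice for comparing 360$^{\circ}$ scanpaths.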
- Daniel Martin (21 papers)
- Ana Serrano (14 papers)
- Alexander W. Bergman (10 papers)
- Gordon Wetzstein (144 papers)
- Belen Masia (19 papers)