Make Interaction Situated: Designing User Acceptable Interaction for Situated Visualization in Public Environments (2402.14251v2)

Published 22 Feb 2024 in cs.HC

Abstract: Situated visualization blends data into the real world to fulfill individuals' contextual information needs. However, interacting with situated visualization in public environments faces challenges posed by user acceptance and contextual constraints. To explore appropriate interaction design, we first conduct a formative study to identify user needs for data and interaction. Informed by the findings, we summarize appropriate interaction modalities with eye-based, hand-based, and spatially-aware object interaction for situated visualization in public environments. Then, through an iterative design process with six users, we explore and implement interaction techniques for activating and analyzing situated visualizations. To assess the effectiveness and acceptance of these interactions, we integrate them into an AR prototype and conduct a within-subjects study in public scenarios using conventional hand-only interactions as the baseline. The results show that participants preferred our prototype over the baseline, attributing their preference to the interactions being more acceptable, flexible, and practical in public.
