Subjective visualization experiences: impact of visual design and experimental design (2310.13713v1)

Published 13 Oct 2023 in cs.HC

Abstract: In contrast to objectively measurable aspects (such as accuracy, reading speed, or memorability), the subjective experience of visualizations has only recently gained importance, and we have less experience with how to measure it. We explore how subjective experience is affected by chart design using multiple experimental methods. We measure the effects of changes in color, orientation, and source annotation on the perceived readability and trustworthiness of simple bar charts. Three different experimental designs (single image rating, forced-choice comparison, and semi-structured interviews) provide similar, but not identical, results. We find that these subjective experiences differ from what prior work on objective dimensions would predict. Seemingly inconsequential choices, like orientation, have large effects for some methods, indicating that study design alters decision-making strategies. In addition to insights into the effects of chart design, we provide methodological insights, such as a suggested need to carefully isolate individual elements in charts when studying subjective experiences.
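
The factorial manipulation described in the abstract (color × orientation × source annotation on simple bar charts) can be illustrated with a small script. This is a minimal sketch, not the authors' stimulus-generation code: the bar values, the two color conditions, and the annotation text are illustrative assumptions.

```python
# Minimal sketch: generate every combination of three assumed chart factors
# (color, orientation, presence of a source annotation) as bar-chart images.
# All concrete values below are illustrative assumptions, not the paper's stimuli.
import itertools
import matplotlib.pyplot as plt

values = [23, 17, 30, 12]                  # hypothetical bar heights
labels = ["A", "B", "C", "D"]
colors = ["tab:blue", "tab:gray"]          # assumed color conditions
orientations = ["vertical", "horizontal"]  # orientation conditions
source_notes = [None, "Source: Example Statistics Office, 2023"]  # assumed annotation

for i, (color, orient, note) in enumerate(
    itertools.product(colors, orientations, source_notes)
):
    fig, ax = plt.subplots(figsize=(4, 3))
    if orient == "vertical":
        ax.bar(labels, values, color=color)
    else:
        ax.barh(labels, values, color=color)
    if note:
        fig.text(0.01, 0.01, note, fontsize=7)  # source-annotation condition
    fig.tight_layout()
    fig.savefig(
        f"stimulus_{i:02d}_{color.split(':')[-1]}_{orient}"
        f"_{'src' if note else 'nosrc'}.png",
        dpi=150,
    )
    plt.close(fig)
```

Each resulting image differs in exactly one controlled way from its neighbors in the factorial grid, which is the kind of isolation of individual chart elements the abstract argues is needed when studying subjective experiences.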

Authors (4)
  1. Laura Koesten (21 papers)
  2. Drew Dimmery (13 papers)
  3. Michael Gleicher (44 papers)
  4. Torsten Möller (29 papers)
Citations (1)