Open Your Ears and Take a Look: A State-of-the-Art Report on the Integration of Sonification and Visualization (2402.16558v2)

Published 26 Feb 2024 in cs.HC, cs.GR, cs.SD, and eess.AS

Abstract: The research communities studying visualization and sonification for data display and analysis share exceptionally similar goals, essentially making data of any kind interpretable to humans. One community does so by using visual representations of data, and the other community employs auditory (non-speech) representations of data. While the two communities have a lot in common, they developed mostly in parallel over the course of the last few decades. With this STAR, we discuss a collection of work that bridges the borders of the two communities, hence a collection of work that aims to integrate the two techniques into one form of audiovisual display, which we argue to be "more than the sum of the two." We introduce and motivate a classification system applicable to such audiovisual displays and categorize a corpus of 57 academic publications that appeared between 2011 and 2023 in categories such as reading level, dataset type, or evaluation system, to mention a few. The corpus also enables a meta-analysis of the field, including regularly occurring design patterns such as type of visualization and sonification techniques, or the use of visual and auditory channels, showing an overall diverse field with different designs. An analysis of a co-author network of the field shows individual teams without many interconnections. The body of work covered in this STAR also relates to three adjacent topics: audiovisual monitoring, accessibility, and audiovisual data art. These three topics are discussed individually in addition to the systematically conducted part of this research. The findings of this report may be used by researchers from both fields to understand the potentials and challenges of such integrated designs while hopefully inspiring them to collaborate with experts from the respective other field.
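The abstract mentions a co-author network analysis showing "individual teams without many interconnections." As a minimal illustrative sketch (not the authors' actual pipeline, and using hypothetical publication records rather than the STAR corpus), the following Python snippet shows one common way to quantify such fragmentation: build a co-authorship graph and count its connected components.

```python
# Illustrative co-author network sketch; the corpus entries below are
# hypothetical placeholders, not the 57 publications analyzed in the STAR.
import itertools
import networkx as nx

corpus = [
    {"title": "Paper A", "authors": ["Smith", "Jones"]},
    {"title": "Paper B", "authors": ["Jones", "Lee"]},
    {"title": "Paper C", "authors": ["Garcia", "Chen"]},
]

G = nx.Graph()
for paper in corpus:
    # Link every pair of co-authors of the same publication.
    for a, b in itertools.combinations(paper["authors"], 2):
        if G.has_edge(a, b):
            G[a][b]["weight"] += 1
        else:
            G.add_edge(a, b, weight=1)

# Many small connected components suggest isolated teams with
# few cross-team collaborations.
components = sorted(nx.connected_components(G), key=len, reverse=True)
print(f"{G.number_of_nodes()} authors, {G.number_of_edges()} co-author links")
print(f"{len(components)} components; largest has {len(components[0])} authors")
```

Running this on the placeholder data yields two disconnected components, the kind of pattern the report describes for the sonification-plus-visualization literature.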
