SOMson -- Sonification of Multidimensional Data in Kohonen Maps (2404.00016v2)
Published 15 Mar 2024 in cs.HC and cs.LG
Abstract: Kohonen maps, also known as self-organizing maps (SOMs), are neural networks that visualize a high-dimensional feature space on a low-dimensional map. While SOMs are an excellent tool for data examination and exploration, they inherently cause a loss of detail. Visualizations of the underlying data do not integrate well and, therefore, fail to provide an overall picture. Consequently, we suggest SOMson, an interactive sonification of the underlying data, as a data augmentation technique. The sonification increases the amount of information provided simultaneously by the SOM. Instead of a user study, we present an interactive online example, so readers can explore SOMson themselves. Its strengths, weaknesses, and prospects are discussed.
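To make the abstract's pipeline concrete, here is a minimal sketch of the two ingredients it describes: training a SOM that projects high-dimensional features onto a 2-D grid, and mapping a grid node's underlying feature vector to a sound parameter. This is a generic illustration, not the paper's actual SOMson implementation; `node_to_pitch` and its frequency range are hypothetical choices for demonstration.

```python
import numpy as np

def train_som(data, grid_w=4, grid_h=4, epochs=50, lr0=0.5, sigma0=2.0, seed=0):
    """Train a toy self-organizing map: each grid node holds a weight
    vector that is pulled toward samples landing in its neighborhood."""
    rng = np.random.default_rng(seed)
    n_features = data.shape[1]
    weights = rng.random((grid_h, grid_w, n_features))
    # Grid coordinates, used for neighborhood distances on the map.
    ys, xs = np.mgrid[0:grid_h, 0:grid_w]
    for epoch in range(epochs):
        frac = epoch / epochs
        lr = lr0 * (1.0 - frac)              # learning rate decays linearly
        sigma = sigma0 * (1.0 - frac) + 0.5  # neighborhood radius shrinks
        for x in data:
            # Best-matching unit (BMU): node whose weights are closest to x.
            dists = np.linalg.norm(weights - x, axis=2)
            by, bx = np.unravel_index(np.argmin(dists), dists.shape)
            # Gaussian neighborhood around the BMU on the 2-D grid.
            grid_d2 = (ys - by) ** 2 + (xs - bx) ** 2
            h = np.exp(-grid_d2 / (2.0 * sigma ** 2))
            weights += lr * h[..., None] * (x - weights)
    return weights

def node_to_pitch(weights, y, x, f_min=220.0, f_max=880.0):
    """Hypothetical sonification mapping: scale a node's mean feature
    value (assumed to lie in [0, 1]) to a frequency in Hz."""
    v = float(np.clip(weights[y, x].mean(), 0.0, 1.0))
    return f_min + v * (f_max - f_min)
```

In an interactive setting, hovering over a map node would query its weight vector and drive one or more such parameter mappings (pitch, loudness, timbre) simultaneously, which is how sonification can convey more dimensions at once than the flat map visualization alone.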
- A. Ultsch, “Self-organizing neural networks for visualisation and classification,” in Information and Classification, O. Opitz, B. Lausen, and R. Klar, Eds. Berlin, Heidelberg: Springer, 1993, pp. 307–313. https://doi.org/10.1007/978-3-642-50974-2_31
- E. A. Fernandez and M. Balzarini, “Improving cluster visualization in self-organizing maps: Application in gene expression data analysis,” Computers in Biology and Medicine, vol. 37, no. 12, pp. 1677–1689, 2007. https://doi.org/10.1016/j.compbiomed.2007.04.003
- H. S. L. P. Qiu, “jSOM: Jointly-evolving self-organizing maps for alignment of biological datasets and identification of related clusters,” PLOS Computational Biology, vol. 17, no. 3, p. e1008804, Mar. 2021. https://doi.org/10.1371/journal.pcbi.1008804
- S.-B. Lee, Y. Choe, T.-S. Chon, and H. Y. Kang, “Analysis of zebrafish (Danio rerio) behavior in response to bacterial infection using a self-organizing map,” BMC Veterinary Research, vol. 11, no. 1, 2015. https://doi.org/10.1186/s12917-015-0579-2
- N. Xu, W. Zhu, R. Wang, Q. Li, Z. Wang, and R. B. Finkelman, “Application of self-organizing maps to coal elemental data,” International Journal of Coal Geology, vol. 277, p. 104358, 2023. https://doi.org/10.1016/j.coal.2023.104358
- R. Bader, A. Zielke, and J. Franke, “Timbre-based machine learning of clustering Chinese and Western hip hop music,” in Audio Engineering Society Convention 150, May 2021, p. 10473.
- T. Ziemer, “Goniometers are a powerful acoustic feature for music information retrieval tasks,” in DAGA 2023 – 49. Jahrestagung für Akustik, Hamburg, Germany, Mar. 2023, pp. 934–937. https://pub.dega-akustik.de/DAGA_2023/data/articles/000600.pdf
- Ž. D. Vlaović, B. L. Stepanov, A. S. Anđelković, V. M. Rajs, Z. M. Čepić, and M. A. Tomić, “Mapping energy sustainability using the Kohonen self-organizing maps - case study,” Journal of Cleaner Production, vol. 412, p. 137351, 2023. https://doi.org/10.1016/j.jclepro.2023.137351
- M. Blaß and R. Bader, “Content-based music retrieval and visualization system for ethnomusicological music archives,” in Computational Phonogram Archiving, R. Bader, Ed. Cham: Springer International Publishing, 2019, pp. 145–173. https://doi.org/10.1007/978-3-030-02695-0_7
- R. Bader, M. Blaß, and J. Franke, “Computational timbre and tonal system similarity analysis of the music of northern Myanmar-based Kachin compared to Xinjiang-based Uyghur ethnic groups,” arXiv, 2021. https://doi.org/10.48550/arXiv.2103.08203
- F. Aquistapace, D. Castillo-Castro, R. I. González, N. Amigo, G. García Vidable, D. R. Tramontina, F. J. Valencia, and E. M. Bringa, “Plasticity in diamond nanoparticles: dislocations and amorphization during loading and dislocation multiplication during unloading,” Journal of Materials Science, 2023. https://doi.org/10.1007/s10853-023-09223-7
- J. Qian, N. P. Nguyen, Y. Oya, G. Kikugawa, T. Okabe, Y. Huang, and F. S. Ohuchi, “Introducing self-organized maps (SOM) as a visualization tool for materials research and education,” Results in Materials, vol. 4, p. 100020, 2019.
- T. Ziemer, P. Kiattipadungkul, and T. Karuchit, “Acoustic features from the recording studio for music information retrieval tasks,” Proceedings of Meetings on Acoustics, vol. 42, no. 1, p. 035004, 2020. https://doi.org/10.1121/2.0001363
- T. Ziemer and H. Schultheis, “PAMPAS: A PsychoAcoustical Method for the Perceptual Analysis of multidimensional Sonification,” Frontiers in Neuroscience, vol. 16, 2022. https://doi.org/10.3389/fnins.2022.930944
- S. Ferguson, W. Martens, and D. Cabrera, “Statistical sonification for exploratory data analysis,” in The Sonification Handbook, T. Hermann, A. Hunt, and J. G. Neuhoff, Eds. Berlin: COST & Logos, 2011, pp. 175–196. https://sonification.de/handbook/chapters/chapter8/
- E. S. Yeung, “Pattern recognition by audio representation of multivariate analytical data,” Anal. Chem., vol. 52, pp. 1120–1123, 1980. https://doi.org/10.1021/ac50057a028
- S. Barrass and V. Best, “Stream-based sonification diagrams,” in 14th International Conference on Auditory Display (ICAD2008), Paris, Jun 2008. http://hdl.handle.net/1853/49945
- D. Black, J. A. Issawi, C. Hansen, C. Rieder, and H. Hahn, “Auditory support for navigated radiofrequency ablation,” in CURAC — 12. Jahrestagung der Deutschen Gesellschaft für Computer- und Roboter Assistierte Chirurgie, W. Freysinger, Ed., Innsbruck, Nov 2013, pp. 30–33. https://www.curac.org/images/stories/Jahrestagung2013/Tagungsband/Proceedings%20CURAC%202013.pdf
- K. Groß-Vogt and C. J. Rieder, “A-e-i-o-u — Tiltification demo for ICAD2023,” in Proceedings of the 28th International Conference on Auditory Display (ICAD2023), Norrköping, Sweden, June 2023.
- P. Kuchenbecker, “Voice balance — a spirit level based on vocal sounds,” in Proceedings of the 28th International Conference on Auditory Display (ICAD2023), Norrköping, Sweden, June 2023.
- J. Niestroj, “Level assistant — a sonification-based spirit level app,” in Proceedings of the 28th International Conference on Auditory Display (ICAD2023), Norrköping, Sweden, June 2023.
- S. Barrass, “Soniclevel-pobblebonk app for the ICAD 2023 sonic tilt competition,” in Proceedings of the 28th International Conference on Auditory Display (ICAD2023), Norrköping, Sweden, June 2023.
- T. Ziemer, N. Nuchprayoon, and H. Schultheis, “Psychoacoustic sonification as user interface for human-machine interaction,” International Journal of Informatics Society, vol. 12, no. 1, pp. 3–16, 2020. http://www.infsoc.org/journal/vol12/12-1
- T. Ziemer and H. Schultheis, “A psychoacoustic auditory display for navigation,” in 24th International Conference on Auditory Displays (ICAD2018), Houghton, MI, USA, June 2018, pp. 136–144. http://doi.org/10.21785/icad2018.007
- T. Ziemer and H. Schultheis, “Psychoacoustical signal processing for three-dimensional sonification,” in 25th International Conference on Auditory Displays (ICAD2019), Newcastle Upon Tyne, UK, June 2019, pp. 277–284. https://doi.org/10.21785/icad2019.018
- T. Ziemer and H. Schultheis, “Three orthogonal dimensions for psychoacoustic sonification,” arXiv, 2019. https://doi.org/10.48550/arXiv.1912.00766
- T. Ziemer, “Three-dimensional sonification as a surgical guidance tool,” J Multimodal User Interfaces, vol. 17, no. 4, pp. 253–262, 2023. https://doi.org/10.1007/s12193-023-00422-9
- M. Asendorf, M. Kienzle, R. Ringe, F. Ahmadi, D. Bhowmik, J. Chen, K. Huynh, S. Kleinert, J. Kruesilp, X. Wang, Y. Y. Lin, W. Luo, N. Mirzayousef Jadid, A. Awadin, V. Raval, E. E. S. Schade, H. Jaman, K. Sharma, C. Weber, H. Winkler, and T. Ziemer, “Tiltification/sonic-tilt: First release of sonic tilt,” GitHub repository: https://github.com/Tiltification/sonic-tilt, 2021. https://doi.org/10.5281/zenodo.5543983
- T. Ziemer and H. Schultheis, “The CURAT sonification game: Gamification for remote sonification evaluation,” in 26th International Conference on Auditory Display (ICAD2021), Virtual conference, June 2021, pp. 233–240. https://doi.org/10.21785/icad2021.026
- M. Asendorf, M. Kienzle, R. Ringe, F. Ahmadi, D. Bhowmik, J. Chen, K. Huynh, S. Kleinert, J. Kruesilp, Y. Lee, X. Wang, W. Luo, N. Jadid, A. Awadin, V. Raval, E. Schade, H. Jaman, K. Sharma, C. Weber, H. Winkler, and T. Ziemer, “Tiltification — an accessible app to popularize sonification,” in Proc. 26th International Conference on Auditory Display (ICAD2021), Virtual Conference, June 2021, pp. 184–191. https://doi.org/10.21785/icad2021.025
- T. Ziemer, “Visualization vs. sonification to level a table,” in Proceedings of ISon 2022, 7th Interactive Sonification Workshop, Delmenhorst, Germany, 2022.
- The Processing Foundation, “p5.js-sound,” GitHub repository: https://github.com/processing/p5.js-sound, 2024.
- N. Rönnberg, “Sonification supports perception of brightness contrast,” Journal on Multimodal User Interfaces, vol. 13, no. 4, pp. 373–381, 2019. https://doi.org/10.1007/s12193-019-00311-0
- R. Ponmalai and C. Kamath, “Self-organizing maps and their applications to data analysis,” Lawrence Livermore National Lab. (LLNL), Livermore, CA, USA, Tech. Rep., 2019.
- T. Ziemer, “Sound terminology in sonification,” AES: Journal of the Audio Engineering Society, vol. 72, no. 5, 2024. https://doi.org/10.17743/jaes.2022.0133
- H. Lindetorp and K. Falkenberg, “Sonification for everyone everywhere: Evaluating the webaudioxml sonification toolkit for browsers,” in 26th International Conference on Auditory Display (ICAD 2021), Virtual Conference, June 2021, pp. 15–21. https://doi.org/10.21785/icad2021.009
- D. Reinsch and T. Hermann, “Interacting with sonifications: The mesonic framework for interactive auditory data science,” in Proceedings of the 7th Interactive Sonification Workshop, N. Rönnberg, S. Lenzi, T. Ziemer, T. Hermann, and R. Bresin, Eds., Delmenhorst, Germany, Sept. 2022, pp. 65–74. https://doi.org/10.5281/zenodo.7552242
- D. Reinsch and T. Hermann, “sonecules: a python sonification architecture,” in 28th International Conference on Auditory Display (ICAD2023), Norrköping, Sweden, 2023, pp. 62–69. https://doi.org/10.21785/icad2023.5580
- J. W. Trayford and C. M. Harrison, “Introducing STRAUSS: A flexible sonification python package,” in 28th International Conference on Auditory Display (ICAD2023), Norrköping, Sweden, June 2023, pp. 249–256. https://doi.org/10.21785/icad2023.1978