Towards Multimodal Exploratory Data Analysis: SoniScope as a Prototypical Implementation
dc.contributor.author | Enge, Kajetan | en_US |
dc.contributor.author | Rind, Alexander | en_US |
dc.contributor.author | Iber, Michael | en_US |
dc.contributor.author | Höldrich, Robert | en_US |
dc.contributor.author | Aigner, Wolfgang | en_US |
dc.contributor.editor | Agus, Marco | en_US |
dc.contributor.editor | Aigner, Wolfgang | en_US |
dc.contributor.editor | Hoellt, Thomas | en_US |
dc.date.accessioned | 2022-06-02T15:50:45Z | |
dc.date.available | 2022-06-02T15:50:45Z | |
dc.date.issued | 2022 | |
dc.description.abstract | The metaphor of auscultating with a stethoscope can be an inspiration to combine visualization and sonification for exploratory data analysis. This paper presents SoniScope, a multimodal approach based on this metaphor, along with its prototypical implementation. It combines a scatterplot with an interactive parameter-mapping sonification, thereby conveying additional information about items that were selected with a visual lens. SoniScope explores several design options for the shape of its lens and for the sorting of the selected items for subsequent sonification. Furthermore, the open-source prototype serves as a blueprint for combining D3.js visualization with SuperCollider sonification in the Jupyter notebook environment. | en_US |
dc.description.sectionheaders | Visual Analysis and Machine Learning | |
dc.description.seriesinformation | EuroVis 2022 - Short Papers | |
dc.identifier.doi | 10.2312/evs.20221095 | |
dc.identifier.isbn | 978-3-03868-184-7 | |
dc.identifier.pages | 67-71 | |
dc.identifier.pages | 5 pages | |
dc.identifier.uri | https://doi.org/10.2312/evs.20221095 | |
dc.identifier.uri | https://diglib.eg.org:443/handle/10.2312/evs20221095 | |
dc.publisher | The Eurographics Association | en_US |
dc.rights | Attribution 4.0 International License | |
dc.rights.uri | https://creativecommons.org/licenses/by/4.0/ | |
dc.subject | CCS Concepts: Human-centered computing --> Visualization systems and tools; Auditory feedback; Sound-based input / output | |
dc.subject | Human-centered computing
dc.subject | Visualization systems and tools
dc.subject | Auditory feedback
dc.subject | Sound-based input / output
dc.title | Towards Multimodal Exploratory Data Analysis: SoniScope as a Prototypical Implementation | en_US |
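The abstract above describes a bridge between D3.js visualization and SuperCollider sonification inside a Jupyter notebook. The following is a minimal sketch of one plausible way to realize that pattern on the Python side, not the authors' actual code: selected scatterplot items are forwarded to SuperCollider as OSC messages for parameter-mapping sonification. The OSC address `/soniscope/item`, the field names, the sorting key, and the use of the python-osc package are assumptions for illustration; only the default sclang UDP port (57120) is standard SuperCollider behavior.

```python
# Hypothetical sketch: forward a visual-lens selection from the Jupyter/Python
# side to SuperCollider via OSC. Assumes python-osc is installed and an OSCdef
# listening on "/soniscope/item" exists on the SuperCollider side.
from pythonosc.udp_client import SimpleUDPClient

# sclang listens on UDP port 57120 by default (NetAddr.langPort).
client = SimpleUDPClient("127.0.0.1", 57120)

def sonify_selection(items):
    """Send each selected item to SuperCollider, one OSC message per item.

    `items` is a list of dicts such as {"x": 0.4, "y": 1.7}; on the
    SuperCollider side an OSCdef would map these values to synthesis
    parameters (e.g. pitch and loudness). Sorting before playback stands in
    for the item-ordering design options mentioned in the abstract.
    """
    for item in sorted(items, key=lambda d: d["x"]):
        client.send_message("/soniscope/item", [item["x"], item["y"]])

# Example: sonify two items picked by the lens.
sonify_selection([{"x": 0.2, "y": 3.1}, {"x": 0.8, "y": 1.4}])
```

In this sketch each selected item becomes one OSC message, so ordering and timing stay under the notebook's control while all sound synthesis remains in SuperCollider; the actual SoniScope prototype may partition these responsibilities differently.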