Browsing by Author "Yu, Lingyun"
Item: DARC: A Visual Analytics System for Multivariate Applicant Data Aggregation, Reasoning and Comparison (The Eurographics Association, 2022)
Authors: Hou, Yihan; Liu, Yu; Wang, He; Zhang, Zhichao; Li, Yue; Liang, Hai-Ning; Yu, Lingyun
Editors: Yang, Yin; Parakkat, Amal D.; Deng, Bailin; Noh, Seung-Tak
Abstract: People often make decisions based on a comprehensive understanding of various materials, judgement of reasons, and comparison among choices. For instance, when hiring committees review multivariate applicant data, they need to consider and compare different aspects of the applicants' materials. However, the volume and complexity of multivariate data make it difficult to analyze the data, extract the most salient information, and rapidly form opinions based on that information. A fast and comprehensive understanding of multivariate data sets is therefore a pressing need in many fields, such as business and education. In this work, we conducted in-depth interviews with stakeholders and characterized the user requirements involved in data-driven decision making when reviewing school applications. Based on these requirements, we propose DARC, a visual analytics system that facilitates decision making on multivariate applicant data. The system supports users in gaining insights into the multivariate data, forming an overview of all data cases, and retrieving the original data in a quick and intuitive manner. The effectiveness of DARC is validated through observational user evaluations and interviews.

Item: Hybrid Touch/Tangible Spatial 3D Data Selection (The Eurographics Association and John Wiley & Sons Ltd., 2019)
Authors: Besançon, Lonni; Sereno, Mickael; Yu, Lingyun; Ammi, Mehdi; Isenberg, Tobias
Editors: Gleicher, Michael; Viola, Ivan; Leitte, Heike
Abstract: We discuss spatial selection techniques for three-dimensional datasets. Such 3D spatial selection is fundamental to exploratory data analysis. While 2D selection is efficient for datasets with explicit shapes and structures, it is less efficient for data without such properties. We first propose a new taxonomy of 3D selection techniques, focusing on the amount of control the user has over the definition of the selection volume. We then describe the 3D spatial selection technique Tangible Brush, which gives manual control over the final selection volume. It combines 2D touch with 6-DOF 3D tangible input to allow users to perform 3D selections in volumetric data. We use touch input to draw a 2D lasso and extrude it into a 3D selection volume based on the motion of a tangible, spatially-aware tablet. We describe our approach and present a quantitative and qualitative comparison to state-of-the-art structure-dependent selection. Our results show that, in addition to being dataset-independent, Tangible Brush is more accurate than existing dataset-dependent techniques, thus providing a trade-off between precision and effort.
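To make the extrusion idea in the entry above concrete, here is a minimal, illustrative sketch, not the authors' implementation: a 2D lasso drawn on the tablet is swept along sampled 6-DOF tablet poses, and data points are tested against the resulting piecewise volume. The function names, the pose format (4x4 world-from-tablet matrices), and the slab approximation are all assumptions made for this example.

```python
# Illustrative sketch (not the published technique) of extruding a 2D touch
# lasso along a tracked tablet path to obtain a rough 3D selection volume.
# Assumed inputs: lasso_2d (N x 2 points in the tablet's local x/y plane) and
# poses (list of 4x4 world-from-tablet matrices sampled during the motion).
import numpy as np

def point_in_polygon(pt, poly):
    """Even-odd rule test for a 2D point against a closed polygon."""
    x, y = pt
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x_cross > x:
                inside = not inside
    return inside

def select_points(data_points, lasso_2d, poses, slab=0.02):
    """Mark data points that fall inside the swept lasso volume.

    For each consecutive pair of tablet poses, points are transformed into the
    tablet's local frame; a point is selected if it projects into the lasso
    polygon and lies within a thin slab along the local z axis spanning the
    motion between the two poses (a coarse, piecewise approximation of the
    extrusion).
    """
    selected = np.zeros(len(data_points), dtype=bool)
    for a, b in zip(poses[:-1], poses[1:]):
        inv_a = np.linalg.inv(a)
        depth = (inv_a @ b)[2, 3]        # z travelled between the two samples
        lo, hi = sorted((0.0, depth))
        homog = np.c_[data_points, np.ones(len(data_points))]
        local = (inv_a @ homog.T).T[:, :3]
        for i, p in enumerate(local):
            if selected[i]:
                continue
            if lo - slab <= p[2] <= hi + slab and point_in_polygon(p[:2], lasso_2d):
                selected[i] = True
    return selected
```

The published technique constructs the swept selection volume more carefully; this sketch only conveys the division of labour between the 2D touch lasso and the 6-DOF tangible motion.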
Item: The State of the Art of Spatial Interfaces for 3D Visualization (© 2021 Eurographics - The European Association for Computer Graphics and John Wiley & Sons Ltd, 2021)
Authors: Besançon, Lonni; Ynnerman, Anders; Keefe, Daniel F.; Yu, Lingyun; Isenberg, Tobias
Editors: Benes, Bedrich; Hauser, Helwig
Abstract: We survey the state of the art of spatial interfaces for 3D visualization. Interaction techniques are crucial to data visualization processes, and the visualization research community has been calling for more research on interaction for years. Yet, research papers focusing on interaction techniques, in particular for 3D visualization purposes, are not always published in visualization venues, which can make it challenging to synthesize the latest interaction and visualization results. We therefore introduce a taxonomy of interaction techniques for 3D visualization. The taxonomy is organized along two axes: the primary source of input on the one hand, and the visualization task the techniques support on the other. Surveying the state of the art allows us to highlight specific challenges and missed opportunities for research in 3D visualization. In particular, we call for additional research on (1) controlling 3D visualization widgets to help scientists better understand their data, (2) 3D interaction techniques for dissemination, which are under-explored yet show great promise for helping museums and science centers in their mission to share recent knowledge, and (3) new measures that move beyond traditional time and error metrics for evaluating visualizations that include spatial interaction.
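As a small illustration of how a two-axis taxonomy like the one described above can be used to organize techniques, the sketch below indexes techniques by (input source, visualization task). The category and technique names are placeholders chosen for this example, not the survey's actual categories.

```python
# Illustrative sketch: indexing interaction techniques along two taxonomy axes
# (input source x supported visualization task). Category names are placeholders.
from dataclasses import dataclass
from enum import Enum

class InputSource(Enum):
    TOUCH = "touch"
    TANGIBLE = "tangible"
    MID_AIR_GESTURE = "mid-air gesture"
    HYBRID = "hybrid"

class VisTask(Enum):
    SELECTION = "selection"
    NAVIGATION = "navigation"
    MANIPULATION = "manipulation"

@dataclass(frozen=True)
class Technique:
    name: str
    input_source: InputSource
    task: VisTask

techniques = [
    # Example entry: a hybrid touch/tangible selection technique.
    Technique("Tangible Brush", InputSource.HYBRID, VisTask.SELECTION),
]

# Group techniques by cell of the two-axis grid, e.g. to spot empty cells
# (missed research opportunities) in a survey corpus.
grid = {}
for t in techniques:
    grid.setdefault((t.input_source, t.task), []).append(t.name)
```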