Browsing by Author "Chen, Bing-Yu"
Now showing 1 - 2 of 2
Item
ClipFlip: Multi‐view Clipart Design
(© 2021 Eurographics ‐ The European Association for Computer Graphics and John Wiley & Sons Ltd, 2021) Shen, I‐Chao; Liu, Kuan‐Hung; Su, Li‐Wen; Wu, Yu‐Ting; Chen, Bing‐Yu; Benes, Bedrich and Hauser, Helwig

We present an assistive system for clipart design that provides visual scaffolds from unseen viewpoints. Inspired by artists' creation process, our system constructs the visual scaffold by first synthesizing a reference 3D shape of the input clipart and rendering it from the desired viewpoint. The critical challenge in constructing this visual scaffold is to generate a reference 3D shape that matches the user's expectations in terms of object sizing and positioning while preserving the geometric style of the input clipart. To address this challenge, we propose a user-assisted curve extrusion method to obtain the reference 3D shape. We render the synthesized reference 3D shape with a consistent style into the visual scaffold. By following the generated visual scaffold, users can efficiently design clipart from their desired viewpoints. A user study conducted with an intuitive user interface and our generated visual scaffolds suggests that our system is especially useful for estimating the ratio and scale between object parts, and saves 57% of drawing time on average.

Item
EvIcon: Designing High‐Usability Icon with Human‐in‐the‐loop Exploration and IconCLIP
(© 2023 Eurographics ‐ The European Association for Computer Graphics and John Wiley & Sons Ltd., 2023) Shen, I‐Chao; Cherng, Fu‐Yin; Igarashi, Takeo; Lin, Wen‐Chieh; Chen, Bing‐Yu; Hauser, Helwig and Alliez, Pierre

Interface icons are prevalent in various digital applications. Due to limited time and budgets, many designers rely on informal evaluation, which often results in icons with poor usability. In this paper, we propose a unique human-in-the-loop framework that allows our target users, that is, novice and professional user interface (UI) designers, to improve the usability of interface icons efficiently. We formulate several usability criteria into a perceptual usability function and enable users to iteratively revise an icon set with an interactive design tool, EvIcon. We take a large-scale pre-trained joint image-text embedding (CLIP) and fine-tune it to embed icon visuals with icon tags in the same embedding space (IconCLIP). During the revision process, our design tool provides two types of instant perceptual usability feedback. First, we provide perceptual usability feedback modelled by deep learning models trained on IconCLIP embeddings and crowdsourced perceptual ratings. Second, we use the embedding space of IconCLIP to assist users in improving the visual distinguishability of icons within the user-prepared icon set. To provide the perceptual prediction, we compiled the first large-scale dataset of perceptual usability ratings over 10,000 interface icons by conducting a crowdsourcing study. We demonstrated that our framework could benefit the interface icon revision process of UI designers with a wide range of professional experience. Moreover, the interface icons designed using our framework achieved better semantic distance and familiarity, as verified by an additional online user study.
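The ClipFlip entry above mentions obtaining a reference 3D shape by curve extrusion. The authors' method is user-assisted and style-preserving; the sketch below only illustrates the basic underlying idea of turning a closed 2D outline into an extruded 3D surface, with a hypothetical square outline as input.

```python
# Minimal sketch of plain curve extrusion, assuming a closed 2D outline given as an
# (N, 2) array of points. This is an illustration of the general idea only, not the
# user-assisted extrusion method described in the ClipFlip paper.
import numpy as np

def extrude_outline(outline_xy: np.ndarray, depth: float = 1.0):
    """Extrude a closed 2D outline along the z-axis into a side surface.

    Returns (vertices, faces): vertices is (2N, 3); faces is a list of triangle
    index triples covering the side walls (end caps are omitted for brevity).
    """
    n = len(outline_xy)
    front = np.hstack([outline_xy, np.zeros((n, 1))])        # outline at z = 0
    back = np.hstack([outline_xy, np.full((n, 1), depth)])   # outline at z = depth
    vertices = np.vstack([front, back])

    faces = []
    for i in range(n):
        j = (i + 1) % n  # wrap around the closed outline
        # Two triangles per quad between consecutive outline points.
        faces.append((i, j, n + i))
        faces.append((j, n + j, n + i))
    return vertices, faces

# Hypothetical usage: a unit square outline extruded to depth 0.5.
square = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
verts, tris = extrude_outline(square, depth=0.5)
print(verts.shape, len(tris))  # (8, 3) vertices and 8 side triangles
```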
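The EvIcon entry above describes fine-tuning a pre-trained joint image-text embedding (CLIP) on icon visuals and tags, and using the resulting embedding space to assess visual distinguishability within an icon set. The sketch below is not the authors' released IconCLIP code; it shows the general pattern with the Hugging Face CLIP model, its built-in contrastive loss, and a cosine-similarity check. The icon file names and tags are hypothetical placeholders.

```python
# Minimal sketch, under stated assumptions: fine-tune CLIP on icon image / tag pairs,
# then compare icon embeddings within a set. Not the authors' IconCLIP implementation.
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-5)

# Hypothetical icon batch: images paired with their tags.
icons = [Image.open(p).convert("RGB") for p in ["search.png", "settings.png"]]
tags = ["search", "settings"]

# One fine-tuning step using CLIP's symmetric image-text contrastive loss.
model.train()
inputs = processor(text=tags, images=icons, return_tensors="pt", padding=True)
outputs = model(**inputs, return_loss=True)
outputs.loss.backward()
optimizer.step()

# Visual distinguishability check: cosine similarity between icon embeddings;
# pairs with high similarity may be hard to tell apart within the same icon set.
model.eval()
with torch.no_grad():
    feats = model.get_image_features(pixel_values=inputs["pixel_values"])
    feats = feats / feats.norm(dim=-1, keepdim=True)
    similarity = feats @ feats.T
print(similarity)
```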