40-Issue 6
Browsing 40-Issue 6 by Subject "cloth modelling"
Now showing 1 - 2 of 2
Item
Estimating Garment Patterns from Static Scan Data
(© 2021 Eurographics ‐ The European Association for Computer Graphics and John Wiley & Sons Ltd, 2021) Bang, Seungbae; Korosteleva, Maria; Lee, Sung‐Hee; Benes, Bedrich and Hauser, Helwig
The acquisition of highly detailed static 3D scan data of people in clothing is becoming widely available. Since 3D scan data are given as a single mesh without semantic separation, animating the data requires modelling the shape and deformation behaviour of the individual body and garment parts. This paper presents a new method for generating simulation‐ready garment models from static 3D scan data of clothed humans. A key contribution of our method is a novel approach to segmenting garments by finding optimal boundaries between the skin and the garment. Our boundary‐based garment segmentation method allows for stable and smooth separation of garments by using an implicit representation of the boundary and an optimization strategy for it. In addition, we present a novel framework to construct a 2D pattern from the segmented garment and place it around the body for a draping simulation. The effectiveness of our method is validated by generating garment patterns for a number of scan datasets.

Item
Fashion Transfer: Dressing 3D Characters from Stylized Fashion Sketches
(© 2021 Eurographics ‐ The European Association for Computer Graphics and John Wiley & Sons Ltd, 2021) Fondevilla, Amelie; Rohmer, Damien; Hahmann, Stefanie; Bousseau, Adrien; Cani, Marie‐Paule; Benes, Bedrich and Hauser, Helwig
Fashion design often starts with hand‐drawn, expressive sketches that communicate the essence of a garment over idealized human bodies. We propose an approach to automatically dress virtual characters from such input, previously complemented with user annotations.
In contrast to prior work, which requires users to draw garments with accurate proportions over each virtual character to be dressed, our method follows a style transfer strategy: the information extracted from a single annotated fashion sketch can be used to inform the synthesis of one or many new garments with a similar style yet different proportions. In particular, we define the style of a loose garment by its silhouette and folds, which we extract from the drawing. Key to our method is our strategy for extracting both the shape and the repetitive fold patterns from the 2D input. As our results show, each input sketch can be used to dress a variety of characters of different morphologies, from virtual humans to cartoon‐style characters.