37-Issue 2
Browsing 37-Issue 2 by Subject "Animation"
Now showing 1 - 3 of 3
Item
Aura Mesh: Motion Retargeting to Preserve the Spatial Relationships between Skinned Characters
(The Eurographics Association and John Wiley & Sons Ltd., 2018) Jin, Taeil; Kim, Meekyoung; Lee, Sung-Hee; Gutierrez, Diego and Sheffer, Alla
Applying motion-capture data to multi-person interaction between virtual characters is challenging because one needs to preserve the interaction semantics while also satisfying the general requirements of motion retargeting, such as preventing penetration and preserving naturalness. An efficient means of representing interaction semantics is to define the spatial relationships between the characters' body parts. However, existing methods consider only the character skeleton and are therefore not suitable for capturing skin-level spatial relationships. This paper proposes a novel method for retargeting interaction motions with respect to character skins. Specifically, we introduce the aura mesh, a volumetric mesh that surrounds a character's skin. The spatial relationships between two characters are computed from the overlap between the skin mesh of one character and the aura mesh of the other, and the interaction motion is then retargeted by preserving these spatial relationships as much as possible while satisfying other constraints. We show the effectiveness of our method through a number of experiments.
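To make the aura-mesh representation concrete, the following minimal Python sketch encodes skin-to-aura spatial relationships as barycentric coordinates inside a tetrahedral aura mesh, together with an energy that penalizes deviation from them during retargeting. The function names, the brute-force containment test, and the least-squares energy are illustrative assumptions, not the authors' implementation.

    import numpy as np

    def barycentric_coords(p, tet):
        # tet: (4, 3) tetrahedron vertex positions; returns 4 weights summing to 1
        T = np.column_stack((tet[1] - tet[0], tet[2] - tet[0], tet[3] - tet[0]))
        w = np.linalg.solve(T, p - tet[0])
        return np.concatenate(([1.0 - w.sum()], w))

    def capture_relationships(skin_verts, aura_verts, aura_tets):
        # Record (skin vertex, aura tet, barycentric coords) for every skin
        # vertex of one character that lies inside the other's aura mesh.
        # (Brute-force search; illustrative, not the paper's implementation.)
        descriptors = []
        for vi, p in enumerate(skin_verts):
            for ti, tet in enumerate(aura_tets):
                w = barycentric_coords(p, aura_verts[tet])
                if np.all(w >= -1e-9):          # point lies inside this tetrahedron
                    descriptors.append((vi, ti, w))
                    break
        return descriptors

    def relationship_energy(descriptors, skin_verts, aura_verts, aura_tets):
        # Penalize deviation from the captured skin-to-aura relationships;
        # a retargeting solver would minimize this alongside other constraints.
        e = 0.0
        for vi, ti, w in descriptors:
            target = w @ aura_verts[aura_tets[ti]]  # recorded aura-space position
            e += np.sum((skin_verts[vi] - target) ** 2)
        return e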
Item
Motion Sickness Simulation Based on Sensorimotor Control
(The Eurographics Association and John Wiley & Sons Ltd., 2018) Hu, Chen-Hui; Lin, Wen-Chieh; Gutierrez, Diego and Sheffer, Alla
Sensorimotor control is an essential mechanism of human motion, from involuntary reflex actions to intentional motor skills such as walking, jumping, and swimming. Humans perform various motions according to task goals and physiological sensory perception, yet existing computational approaches to motion simulation and generation rarely consider the effects of human perception. Their assumption of perfect perception (i.e., no sensory errors) restricts the types of motion that can be generated and makes dynamic reactions less realistic. We propose a general framework for sensorimotor control, integrating a balance controller and a vestibular model, to generate perception-aware motions. By exploiting simulated perception, responses closer to natural human reactions can be generated. For example, motion sickness caused by impaired vestibular function induces postural instability and body sway. Because spatial-orientation estimation by the vestibular system is essential to preserving balance, our approach generates physically correct motions and plausible reactions to external stimuli. We evaluate our framework by demonstrating standing balance on a rotational platform at different angular speeds and durations. The generated motions show that faster angular speeds and longer rotation durations cause more severe motion sickness. Our results demonstrate that sensorimotor control, integrating human perception and physically based control, offers considerable potential for producing more human-like behaviors, especially for human perceptual illusions involving visual, proprioceptive, and tactile sensations.

Item
Real-time Locomotion Controller using an Inverted-Pendulum-based Abstract Model
(The Eurographics Association and John Wiley & Sons Ltd., 2018) Hwang, Jaepyung; Kim, Jongmin; Suh, Il Hong; Kwon, Taesoo; Gutierrez, Diego and Sheffer, Alla
In this paper, we propose a novel motion controller for the online generation of natural character locomotion that adapts to new situations, such as changes in user control or externally applied forces. The controller continuously estimates the next footstep during walking and running, and automatically switches the stepping strategy in response to situational changes. To build the controller, we devise a new physical model called the inverted-pendulum-based abstract model (IPAM). This abstract model represents high-dimensional character motion and inherits the naturalness of captured motion by estimating the appropriate footstep location, speed, and switching time at every frame. The estimation is performed by a deep-learning-based regressor that extracts important features from captured motions. To validate the proposed controller, we train the model on captured motions of a human stopping, walking, and running in a limited space. The motion controller then generates human-like locomotion in real time, with continuously varying speeds, transitions between walking and running, and collision-response strategies in a cluttered space.
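For the motion-sickness framework in the second item above, the toy simulation below couples a PD balance controller for a planar inverted pendulum to a simplified vestibular model: the semicircular canals are approximated as a first-order high-pass filter on angular velocity, and the controller acts on a tilt estimate integrated from that signal, so sustained platform rotation corrupts the estimate and induces sway. All gains, time constants, the canal filter, and the fusion term are illustrative assumptions; this is not the authors' controller.

    import numpy as np

    def simulate_sway(platform_rate=0.2, t_end=30.0, dt=0.001,
                      tau_canal=6.0, tau_fuse=0.5):
        # Planar inverted pendulum balanced by a PD controller that acts on a
        # vestibular tilt estimate rather than the true tilt (toy model).
        g, L = 9.81, 1.0               # gravity, pendulum length
        kp, kd = 20.0, 8.0             # PD gains (illustrative)
        theta, theta_dot = 0.01, 0.0   # true tilt and tilt rate
        lp = 0.0                       # low-pass state of the canal model
        theta_hat = theta              # internal tilt estimate
        peak = 0.0
        for _ in range(int(t_end / dt)):
            omega = theta_dot + platform_rate      # head angular velocity
            lp += dt * (omega - lp) / tau_canal
            sensed = omega - lp                    # canal afferent: high-pass of omega
            # Integrate the canal signal, slowly fused with other senses
            # (e.g., proprioception) that pull the estimate toward the true tilt.
            theta_hat += dt * (sensed + (theta - theta_hat) / tau_fuse)
            u = -(kp * theta_hat + kd * sensed)    # balance torque on the *estimate*
            theta_dot += dt * ((g / L) * np.sin(theta) + u)
            theta += dt * theta_dot
            peak = max(peak, abs(theta))
        return peak

    # Faster platform rotation corrupts the estimate more, producing larger sway.
    for rate in (0.0, 0.2, 0.4):
        print(f"platform rate {rate} rad/s -> peak tilt "
              f"{simulate_sway(platform_rate=rate):.3f} rad")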
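For the locomotion controller in the last item, footstep estimation on an inverted-pendulum abstraction is often built from the linear inverted pendulum; the sketch below predicts the next footstep from a capture-point rule plus a velocity-tracking offset. The paper instead trains a deep-learning regressor on captured motion for this estimate; the closed-form rule, the name predict_footstep, and the gain k_vel are hypothetical stand-ins.

    import numpy as np

    def predict_footstep(com, com_vel, desired_vel, z0=0.9, g=9.81, k_vel=0.25):
        # Linear-inverted-pendulum heuristic: step toward the instantaneous
        # capture point, offset to steer toward the desired velocity.
        # (Stand-in for the paper's learned regressor over motion features.)
        omega = np.sqrt(g / z0)                   # LIP natural frequency
        capture = com[:2] + com_vel[:2] / omega   # instantaneous capture point
        offset = k_vel * (com_vel[:2] - desired_vel[:2])
        return capture + offset

    # Example: center of mass at the origin moving forward at 1.2 m/s,
    # while the user command asks for 1.0 m/s.
    com = np.array([0.0, 0.0, 0.9])
    com_vel = np.array([1.2, 0.0, 0.0])
    print(predict_footstep(com, com_vel, desired_vel=np.array([1.0, 0.0])))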