CLIPE 2024
Item: Capture and Automatic Production of Digital Humans in Real Motion with a Temporal 3D Scanner (The Eurographics Association, 2024)
Parrilla, Eduardo; Ballester, Alfredo; Uriel, Jordi; Ruescas-Nicolau, Ana V.; Alemany, Sandra; Pelechano, Nuria; Pettré, Julien
The demand for virtual human characters in Extended Reality (XR) is growing across industries from entertainment to healthcare. Achieving natural behaviour in virtual environments requires digitizing real-world actions, a task that is typically laborious and requires specialized expertise. This paper presents an advanced approach for digitizing humans in motion, streamlining the process from capture to virtual character creation. By integrating the proposed hardware, algorithms, and data models, the approach automates the creation of high-resolution assets, reducing manual intervention and software dependencies. The resulting sequences of rigged and textured meshes ensure lifelike virtual characters with detailed facial expressions and hand gestures, surpassing the capabilities of static 3D scans animated via separate motion captures. Robust pose-dependent shape corrections and temporal consistency algorithms guarantee smooth, artifact-free body surfaces in motion, while export to standard formats enhances interoperability and further character development. Additionally, the method facilitates the efficient creation of large datasets for learning human models, representing a significant advancement in XR technologies and digital content creation across industries.

Item: Interacting with a Virtual Cyclist in Mixed Reality Affects Pedestrian Walking (The Eurographics Association, 2024)
Kamalasanan, Vinu; Krüger, Melanie; Sester, Monika; Pelechano, Nuria; Pettré, Julien
When walking in shared traffic spaces, the nearby presence and movement of other pedestrians and cyclists can prompt individuals to make speed and path adjustments to avoid potential collisions.
Studying such collision avoidance strategies in virtual settings allows controlled scaling of the environmental complexity present in real situations, while ensuring pedestrians' safety. Our pilot study makes an early effort towards understanding the influence of cyclist movements on human walking using mixed reality (MR). To this end, we examined the collision avoidance behaviour of pedestrians crossing the path of a moving virtual cyclist avatar, analyzing the temporal and spatial characteristics of the participants' walking trajectories using speed profiles and the Post Encroachment Time (PET) metric. The early results from our pilot study demonstrate that mixed-reality cyclist experiments can be used to study pedestrian-cyclist interactions. Furthermore, across all interactions noted in the study, a significant proportion of participants decided to cross in front of the virtual cyclist, while others preferred to give the right of way. We also discuss our current findings, insights, and the implications of studying pedestrian behaviours using virtual cyclists.

Item: CLIPE 2024: Frontmatter (The Eurographics Association, 2024)
Pelechano, Nuria; Pettré, Julien

Item: Embodied Augmented Reality for Lower Limb Rehabilitation (The Eurographics Association, 2024)
Sarri, Froso; Kasnesis, Panagiotis; Symeonidis, Spyridon; Paraskevopoulos, Ioannis Th.; Diplaris, Sotiris; Posteraro, Federico; Georgoudis, George; Mania, Katerina; Pelechano, Nuria; Pettré, Julien
Immersive platforms have emerged as valuable tools in rehabilitation, with the potential to enhance patient engagement and recovery outcomes. Addressing the limitations of traditional Virtual Reality (VR) setups that restrict physical movement, this paper presents the system architecture of a novel, head-worn Augmented Reality (AR) system for lower limb rehabilitation.
The rehabilitation experience is enhanced by embodied avatars that replicate patients' movements. The system integrates varied avatar perspectives, such as mirror and follow modes, built on an avatar-centred interface. The proposed architecture supports seated and standing exercises, expanding the scope of rehabilitation beyond gait alone. Computer-vision-based 3D pose estimation captures patients' movement and maps it onto the avatar in real time, accurately estimating the coordinates of 3D body landmarks. Wearable sensors evaluate patients' movements, using deep learning to discern movement patterns. Patients receive feedback through visual cues indicating which limb areas to adjust, improving exercise execution. Embodiment has the potential to improve exercise understanding and to assist patients' recovery.

Item: LexiCrowd: A Learning Paradigm towards Text to Behaviour Parameters for Crowds (The Eurographics Association, 2024)
Lemonari, Marilena; Andreou, Nefeli; Pelechano, Nuria; Charalambous, Panayiotis; Chrysanthou, Yiorgos; Pelechano, Nuria; Pettré, Julien
Creating believable virtual crowds, controllable by high-level prompts, is essential for creators to trade off authoring freedom and simulation quality. The flexibility and familiarity of natural language in particular motivate the use of text to guide the generation process. Capturing the essence of textually described crowd movements in the form of meaningful and usable parameters is challenging due to the lack of paired ground-truth data and the inherent ambiguity between the two modalities. In this work, we leverage a pre-trained Large Language Model (LLM) to create pseudo-pairs of text and behaviour labels. We train a variational auto-encoder (VAE) on the synthetic dataset, constraining the latent space to interpretable behaviour parameters by incorporating a latent label loss.
To showcase our model's capabilities, we deploy a survey in which humans provide textual descriptions of real crowd datasets. We demonstrate that our model can parameterise unseen sentences and produce novel behaviours that capture the essence of the given sentence; our behaviour space is compatible with simulator parameters, enabling the generation of plausible crowds (text-to-crowds). We also conduct feasibility experiments exhibiting the potential of the output text embeddings for full sentence generation from a behaviour profile.

Item: Overcoming Challenges of Cycling Motion Capturing and Building a Comprehensive Dataset (The Eurographics Association, 2024)
Kyriakou, Panayiotis; Kyriakou, Marios; Chrysanthou, Yiorgos; Pelechano, Nuria; Pettré, Julien
This article describes a methodology for capturing cyclist motion using motion capture (mocap) hardware and details the creation of a comprehensive dataset that will be publicly available. The methodology involves a modular system and an innovative marker placement. The resulting dataset is used to create 3D visualizations and diverse data representations, shared in an online library for public access and collaborative research.

Item: A CRITS Foray Into Cultural Heritage: Background Characters For The SHELeadersVR Project (The Eurographics Association, 2024)
Culié, Jean-Benoit; Mijatovic, Bojan; Panzoli, David; Nesimovic, Davud; Sanchez, Stéphane; Rizvic, Selma; Pelechano, Nuria; Pettré, Julien
This article presents CRITS, a software framework designed to enhance virtual environments, particularly in the context of cultural heritage and immersive learning simulations. CRITS enables the easy integration of autonomous, human-like characters into virtual settings, enriching the user's experience by simulating the dynamic activities and social presence of background characters.
The framework is showcased through its application in the SHELeaders VR project, which aims to recreate historical settings and narratives centred on medieval female leaders in the Balkans. The article discusses the technical implementation of CRITS and its benefits for creating lively, populated environments, and reflects on potential improvements and future research directions.
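For readers unfamiliar with the Post Encroachment Time (PET) metric used in the mixed-reality cyclist study above: PET measures the time gap between the first road user leaving a shared conflict area and the second entering it. The sketch below is a minimal illustration of that definition, not the authors' implementation; the axis-aligned conflict zone and the `(time, position)` sample format are assumptions made for the example.

```python
def in_zone(p, zone):
    """True if point p = (x, y) lies inside the axis-aligned conflict zone (xmin, ymin, xmax, ymax)."""
    x, y = p
    xmin, ymin, xmax, ymax = zone
    return xmin <= x <= xmax and ymin <= y <= ymax

def post_encroachment_time(traj_a, traj_b, zone):
    """Time between the first user leaving the conflict zone and the second entering it.

    traj_a, traj_b: lists of (t, (x, y)) samples; zone: (xmin, ymin, xmax, ymax).
    Returns None if either user never occupies the zone, and 0.0 if their
    occupancies overlap in time (an actual encroachment).
    """
    def occupancy(traj):
        times = [t for t, p in traj if in_zone(p, zone)]
        return (min(times), max(times)) if times else None

    occ_a, occ_b = occupancy(traj_a), occupancy(traj_b)
    if occ_a is None or occ_b is None:
        return None
    if occ_a[1] <= occ_b[0]:
        return occ_b[0] - occ_a[1]  # A exits first, then B enters
    if occ_b[1] <= occ_a[0]:
        return occ_a[0] - occ_b[1]  # B exits first, then A enters
    return 0.0  # simultaneous occupancy
```

A small PET value indicates a near-conflict; real studies compute it from densely sampled trajectories rather than the sparse samples shown here.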
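The LexiCrowd abstract above describes a VAE objective augmented with a latent label loss that ties part of the latent space to behaviour labels. The following is a hedged sketch of how such an objective can be composed; the mean-squared label term on the first `k` latent dimensions, the function name, and the `beta`/`gamma` weights are assumptions for illustration and may differ from the paper's actual formulation.

```python
import numpy as np

def vae_label_loss(x, x_hat, mu, log_var, z, labels, beta=1.0, gamma=1.0):
    """Illustrative VAE objective with a latent label loss.

    x, x_hat : input and its reconstruction
    mu, log_var : encoder outputs parameterising q(z|x)
    z : sampled latent; its first k dims are tied to behaviour labels
    labels : target behaviour parameters for the supervised latent dims
    """
    recon = np.mean((x - x_hat) ** 2)                            # reconstruction term
    kl = -0.5 * np.mean(1 + log_var - mu**2 - np.exp(log_var))   # KL(q(z|x) || N(0, I))
    k = labels.shape[-1]
    label = np.mean((z[..., :k] - labels) ** 2)  # latent label loss: makes the
                                                 # first k latent dims interpretable
    return recon + beta * kl + gamma * label
```

Constraining only a slice of the latent vector leaves the remaining dimensions free to encode variation not captured by the labels, which is one common way to obtain interpretable behaviour parameters from a VAE.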