Lifelike Motions for Robotic Characters

dc.contributor.author: Serifi, Agon
dc.date.accessioned: 2025-11-25T07:16:53Z
dc.date.available: 2025-11-25T07:16:53Z
dc.date.issued: 2025-09
dc.description.abstract: Humanoids have made significant advances in recent years. Nonetheless, the motions they perform often remain rigid and mechanical, lacking the diversity and expressiveness of human motion. This stands in stark contrast to physics-based simulated characters, which perform agile and lifelike motions in fully simulated environments. Such characters typically combine reinforcement learning with motion capture data to learn to move like humans. However, their success is closely tied to unrealistic modeling assumptions such as simplified dynamics, overpowered actuators, or noise-free sensing. While these assumptions enable efficient and stable training, they hinder transfer to the real world. In the real world, there are no shortcuts: achieving more dynamic motions for humanoids requires physically accurate simulation and robust learning methods. This means rethinking many components along the pipeline, from the simulators themselves and how to account for sim-to-real gaps, up to how motions are represented, tracked, and generated for humanoids. In this dissertation, we present several contributions in this direction and bring more lifelike motions to robotic characters. First, we present a learning-based, modular simulation augmentation that reduces the sim-to-real gap; it generalizes across robot configurations and helps to better estimate the state of the robot. In a second contribution, we propose a novel architecture for encoding motions as trajectories in a latent space. By overcoming the need for absolute positional encoding, the architecture achieves better reconstruction quality across various types of sequential data. In a third contribution, we show how a pretrained latent space can be leveraged to train more accurate and robust control policies using reinforcement learning. This two-stage method transfers to the real world and brings dynamic dancing motions to a humanoid robot. Our last contribution physically aligns kinematic motion generators with the capabilities of the character and its control policy, allowing a more successful transfer of generated motions to the real world. The methods and concepts introduced in this dissertation make robots move in a more lifelike manner and narrow the gap to simulated characters. We hope they will inspire future research and bring more believable robots into our world.
dc.identifier.uri: https://diglib.eg.org/handle/10.2312/3607270
dc.language.iso: en
dc.publisher: ETH Zurich
dc.relation.ispartofseries: Diss. ETH; No. 31327
dc.title: Lifelike Motions for Robotic Characters
dc.type: Thesis
Files
Original bundle
Name: Dissertation_AgonSerifi.pdf
Size: 55.52 MB
Format: Adobe Portable Document Format
License bundle
Name: license.txt
Size: 34 B
Description: Item-specific license agreed upon at submission