Show simple item record

dc.contributor.author	Zeng, Rui	en_US
dc.contributor.author	Dai, Ju	en_US
dc.contributor.author	Bai, Junxuan	en_US
dc.contributor.author	Pan, Junjun	en_US
dc.contributor.author	Qin, Hong	en_US
dc.contributor.editor	Lee, Sung-Hee and Zollmann, Stefanie and Okabe, Makoto and Wünsche, Burkhard	en_US
dc.date.accessioned	2021-10-14T10:05:37Z
dc.date.available	2021-10-14T10:05:37Z
dc.date.issued	2021
dc.identifier.isbn	978-3-03868-162-5
dc.identifier.uri	https://doi.org/10.2312/pg.20211383
dc.identifier.uri	https://diglib.eg.org:443/handle/10.2312/pg20211383
dc.description.abstract	Modeling motion dynamics for precise and rapid control with deterministic data-driven models is challenging due to the natural randomness of human motion. To address this, we propose a novel framework for continuous motion control based on probabilistic latent variable models. Control is implemented by recurrently querying between historical and target motion states rather than exact motion data. Our model takes a conditional encoder-decoder form in two stages. First, we use a Gaussian Process Latent Variable Model (GPLVM) to project motion poses onto a compact latent manifold. Motion states, such as walking phase and forward velocity, can be clearly identified on this manifold. Second, taking the manifold as a prior, a Recurrent Neural Network (RNN) encoder makes a temporal latent prediction from the previous and control states. An attention module then morphs the prediction by measuring latent similarities between the control and predicted states, dynamically preserving contextual consistency. Finally, the GP decoder reconstructs motion states back into motion frames. Experiments on walking datasets show that our model maintains motion states autoregressively while performing rapid and smooth transitions for control.	en_US
dc.publisher	The Eurographics Association	en_US
dc.subject	Computing methodologies
dc.subject	Motion processing
dc.subject	Motion capture
dc.subject	Motion path planning
dc.subject	Learning latent representations
dc.title	Human Motion Synthesis and Control via Contextual Manifold Embedding	en_US
dc.description.seriesinformation	Pacific Graphics Short Papers, Posters, and Work-in-Progress Papers
dc.description.sectionheaders	Fast Rendering and Movement
dc.identifier.doi	10.2312/pg.20211383
dc.identifier.pages	25-30
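The abstract's two-stage pipeline (GPLVM projection to a latent manifold, then recurrent prediction morphed toward a control state by attention, then GP decoding back to poses) can be sketched as follows. This is a minimal stand-in, not the authors' implementation: the linear encode/decode maps, the RNN weights, and the two-way similarity softmax are all simplified assumptions used only to illustrate the data flow.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: pose space (D), latent manifold (d), frames (T).
D, d, T = 30, 3, 8

# Stage 1 stand-in: a fixed linear map approximating the GPLVM projection
# from pose space onto a compact latent manifold (and its pseudo-inverse
# standing in for the GP decoder).
W = rng.standard_normal((D, d)) * 0.1

def encode_pose(pose):
    # pose -> latent (GPLVM stand-in)
    return pose @ W

def decode_latent(z):
    # latent -> pose (GP decoder stand-in)
    return z @ np.linalg.pinv(W)

# Stage 2 stand-in: a single-layer recurrent update predicting the next
# latent from the hidden state and the previous latent.
Wh = rng.standard_normal((d, d)) * 0.1
Wz = rng.standard_normal((d, d)) * 0.1

def rnn_step(h, z_prev):
    return np.tanh(h @ Wh + z_prev @ Wz)

def attention_morph(z_pred, z_ctrl, temperature=1.0):
    # Softmax over latent similarities of the prediction with itself and
    # with the control state; the weights blend the prediction toward the
    # control latent, mimicking the attention-based morphing step.
    sims = np.array([z_pred @ z_pred, z_pred @ z_ctrl]) / temperature
    w = np.exp(sims - sims.max())
    w /= w.sum()
    return w[0] * z_pred + w[1] * z_ctrl

# Autoregressive rollout: start from one latent state, repeatedly predict,
# morph toward the control latent, and decode a motion frame.
z = encode_pose(rng.standard_normal(D))       # current latent state
z_ctrl = encode_pose(rng.standard_normal(D))  # target (control) latent
h = np.zeros(d)

frames = []
for _ in range(T):
    h = rnn_step(h, z)
    z = attention_morph(h, z_ctrl)
    frames.append(decode_latent(z))

frames = np.stack(frames)  # (T, D) synthesized pose sequence
print(frames.shape)        # (8, 30)
```

The rollout queries only latent states (previous and control), not exact motion data, which is the control scheme the abstract describes.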

