dc.contributor.author | Hu, Lin-Chuan | en_US |
dc.contributor.author | Chang, Ming-Hsu | en_US |
dc.contributor.author | Chuang, Yung-Yu | en_US |
dc.contributor.editor | Luis Gonzaga Magalhaes and Rafal Mantiuk | en_US |
dc.date.accessioned | 2016-04-26T07:50:46Z | |
dc.date.available | 2016-04-26T07:50:46Z | |
dc.date.issued | 2016 | en_US |
dc.identifier.issn | 1017-4656 | en_US |
dc.identifier.uri | http://dx.doi.org/10.2312/egp.20161039 | en_US |
dc.description.abstract | This paper presents a framework for transferring rig parameters from a source animation to a target model, allowing artists to further refine and adjust the animation. Most previous methods transfer animations only to meshes or joint parameters; in industry, however, character animations are usually manipulated through rigs, so it is difficult for artists to work further on the retargeted animations. Our method first applies motion transfer to deform the target model so that it mimics the source motion. We then estimate rig parameters that satisfy two properties: (1) the resulting animation resembles the retargeted animation, and (2) the rig parameters match the artist's editing conventions. Artists can refine the produced rig parameters, and their edits are propagated throughout the whole animation. | en_US |
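The rig-parameter estimation outlined in the abstract can be read as a per-frame fitting problem: find parameters whose rigged mesh matches the retargeted mesh while staying close to how an artist would pose the rig. The sketch below is a minimal, hypothetical illustration of that idea, not the authors' formulation: it assumes a generic, differentiable stand-in rig (a linear blend of displacement bases) and uses plain nonlinear least squares, with a closeness-to-previous-frame term standing in for the "editing conventions" property.

import numpy as np
from scipy.optimize import least_squares

# Toy "rig": rig parameters -> mesh vertex positions.
# A linear blend of per-parameter displacement bases stands in for a real
# character rig, whose evaluation is production-specific.
def evaluate_rig(rig_params, rest_vertices, bases):
    # rest_vertices: (V, 3), bases: (P, V, 3), rig_params: (P,)
    return rest_vertices + np.tensordot(rig_params, bases, axes=1)

def estimate_rig_params(target_vertices, rest_vertices, bases,
                        prev_params, smoothness=0.1):
    """Fit rig parameters so the rigged mesh matches the retargeted mesh,
    regularized toward the previous frame's parameters (a crude stand-in
    for matching the artist's editing conventions)."""
    def residuals(p):
        fit = (evaluate_rig(p, rest_vertices, bases) - target_vertices).ravel()
        reg = smoothness * (p - prev_params)
        return np.concatenate([fit, reg])
    return least_squares(residuals, prev_params).x

# Tiny synthetic example: 4 vertices, 2 rig parameters.
rng = np.random.default_rng(0)
rest = rng.normal(size=(4, 3))
bases = rng.normal(size=(2, 4, 3))
true_params = np.array([0.5, -1.2])
target = evaluate_rig(true_params, rest, bases)   # one "retargeted" frame
fitted = estimate_rig_params(target, rest, bases, prev_params=np.zeros(2))
print(fitted)  # close to [0.5, -1.2]

Solving such a fit frame by frame, with the previous frame's result as the starting point, yields a parameter curve an artist can edit directly; any real system would replace the toy rig and regularizer with the studio's rig evaluation and its own notion of editing conventions.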
dc.publisher | The Eurographics Association | en_US |
dc.subject | I.3.7 [Computer Graphics] | en_US |
dc.subject | Three Dimensional Graphics and Realism | en_US |
dc.subject | Animation | en_US |
dc.title | Rig-Space Motion Retargeting | en_US |
dc.description.seriesinformation | EG 2016 - Posters | en_US |
dc.description.sectionheaders | Posters | en_US |
dc.identifier.doi | 10.2312/egp.20161039 | en_US |
dc.identifier.pages | 5-6 | en_US |