dc.contributor.author | Santesteban, Igor | en_US |
dc.contributor.author | Otaduy, Miguel A. | en_US |
dc.contributor.author | Casas, Dan | en_US |
dc.contributor.editor | Alliez, Pierre and Pellacini, Fabio | en_US |
dc.date.accessioned | 2019-05-05T17:41:25Z | |
dc.date.available | 2019-05-05T17:41:25Z | |
dc.date.issued | 2019 | |
dc.identifier.issn | 1467-8659 | |
dc.identifier.uri | https://doi.org/10.1111/cgf.13643 | |
dc.identifier.uri | https://diglib.eg.org:443/handle/10.1111/cgf13643 | |
dc.description.abstract | This paper presents a learning-based clothing animation method for highly efficient virtual try-on simulation. Given a garment, we preprocess a rich database of physically-based dressed character simulations, for multiple body shapes and animations. Then, using this database, we train a learning-based model of cloth drape and wrinkles, as a function of body shape and dynamics. We propose a model that separates global garment fit, due to body shape, from local garment wrinkles, due to both pose dynamics and body shape. We use a recurrent neural network to regress garment wrinkles, and we achieve highly plausible nonlinear effects, in contrast to the blending artifacts suffered by previous methods. At runtime, dynamic virtual try-on animations are produced in just a few milliseconds for garments with thousands of triangles. We present a qualitative and quantitative analysis of the results. | en_US |
dc.publisher | The Eurographics Association and John Wiley & Sons Ltd. | en_US |
dc.subject | Computing methodologies | |
dc.subject | Physical simulation | |
dc.subject | Neural networks | |
dc.title | Learning-Based Animation of Clothing for Virtual Try-On | en_US |
dc.description.seriesinformation | Computer Graphics Forum | |
dc.description.sectionheaders | Learning to Animate | |
dc.description.volume | 38 | |
dc.description.number | 2 | |
dc.identifier.doi | 10.1111/cgf.13643 | |
dc.identifier.pages | 355-366 | |