Simple item record

dc.contributor.author: Santesteban, Igor
dc.date.accessioned: 2023-01-15T17:19:15Z
dc.date.available: 2023-01-15T17:19:15Z
dc.date.issued: 2022-07
dc.identifier.uri: https://diglib.eg.org:443/handle/10.2312/2633267
dc.description.abstract: Clothing plays a fundamental role in our everyday lives. When we choose clothing to buy or wear, we guide our decisions based on a combination of fit and style. For this reason, the majority of clothing is purchased at brick-and-mortar retail stores, after physical try-on to test the fit and style of several garments on our own bodies. Computer graphics technology promises an opportunity to support online shopping through virtual try-on, but to date virtual try-on solutions lack the responsiveness of a physical try-on experience. This thesis works towards developing new virtual try-on solutions that meet the demanding requirements of accuracy, interactivity and scalability. To this end, we propose novel data-driven models for 3D avatars and clothing that produce highly realistic results at a fraction of the computational cost of physics-based approaches. Throughout the thesis we also address common limitations of data-driven methods by using self-supervision mechanisms to enforce physical constraints and reduce the dependency on ground-truth data. This allows us to build efficient and accurate models with minimal preprocessing times. [en_US]
dc.language.iso: en [en_US]
dc.subject: virtual try-on [en_US]
dc.subject: soft-tissue animation [en_US]
dc.subject: clothing animation [en_US]
dc.subject: machine learning [en_US]
dc.subject: human avatars [en_US]
dc.title: Data-driven models of 3D avatars and clothing for virtual try-on [en_US]
dc.type: Thesis [en_US]

