Global Texture Mapping for Dynamic Objects
Date
2019

Abstract
We propose a novel framework for generating a global texture atlas for a deforming geometry. Our approach differs from prior art in two aspects. First, instead of generating a texture map for each timestamp to color a dynamic scene, our framework reconstructs a global texture atlas that can be consistently mapped to a deforming object. Second, our approach is based on a single RGB-D camera, without the need for a multi-camera setup surrounding the scene. In our framework, the input is a 3D template model with an RGB-D image sequence, and geometric warping fields are computed using a state-of-the-art non-rigid registration method [GXW*15] to align the template mesh to the noisy and incomplete input depth images. With these warping fields, our multi-scale texture coordinate optimization generates a sharp and clear texture atlas that is consistent with the multiple color observations over time. Our approach is accelerated by graphics hardware and provides a handy configuration for capturing a dynamic geometry along with a clean texture atlas. We demonstrate our approach with practical scenarios, particularly human performance capture. We also show that our approach is resilient to misalignment caused by imperfect estimation of warping fields and inaccurate camera parameters.
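The core idea of a global atlas — fusing many per-frame color observations of the same surface point into one consistent texel — can be illustrated with a minimal sketch. This is not the paper's optimization (which solves for texture coordinates over warped geometry at multiple scales); it only shows the weighted-fusion step, with the per-frame samples and confidence weights (hypothetical inputs, e.g. from visibility or viewing-angle tests after projecting the warped template) assumed to be given.

```python
import numpy as np

def fuse_atlas(samples, weights):
    """Fuse per-frame color samples into one global atlas.

    samples: (F, T, 3) array -- colors of T texels observed in F frames.
    weights: (F, T) array   -- per-frame confidence of each observation
                               (0 where the texel is occluded or unseen).
    Returns a (T, 3) atlas as the weight-normalized average over frames.
    """
    w = weights[:, :, None]                  # broadcast over color channels
    total = np.maximum(w.sum(axis=0), 1e-8)  # guard against division by zero
    return (samples * w).sum(axis=0) / total

# Two frames observing three texels; frame 1 does not see texel 0 (weight 0),
# so texel 0 keeps its frame-0 color instead of being corrupted.
samples = np.array([[[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]],
                    [[0.5, 0.5, 0.5], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]])
weights = np.array([[1.0, 1.0, 1.0],
                    [0.0, 1.0, 1.0]])
atlas = fuse_atlas(samples, weights)
```

In the paper's setting the weights would additionally be modulated by how well the non-rigid warp aligns the template to each depth frame, which is what makes the result robust to registration error.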
BibTeX
@article{10.1111:cgf.13872,
journal = {Computer Graphics Forum},
title = {{Global Texture Mapping for Dynamic Objects}},
author = {Kim, Jungeon and Kim, Hyomin and Park, Jaesik and Lee, Seungyong},
year = {2019},
publisher = {The Eurographics Association and John Wiley & Sons Ltd.},
ISSN = {1467-8659},
DOI = {10.1111/cgf.13872}
}