Show simple item record

dc.contributor.authorSabbadin, Manueleen_US
dc.contributor.authorPalma, Gianpaoloen_US
dc.contributor.authorBanterle, Francescoen_US
dc.contributor.authorBoubekeur, Tamyen_US
dc.contributor.authorCignoni, Paoloen_US
dc.contributor.editorLee, Jehee and Theobalt, Christian and Wetzstein, Gordonen_US
dc.date.accessioned2019-10-14T05:09:31Z
dc.date.available2019-10-14T05:09:31Z
dc.date.issued2019
dc.identifier.issn1467-8659
dc.identifier.urihttps://doi.org/10.1111/cgf.13857
dc.identifier.urihttps://diglib.eg.org:443/handle/10.1111/cgf13857
dc.description.abstractAcquired 3D point clouds make it possible to quickly model virtual scenes from the real world. With modern 3D capture pipelines, each point sample often comes with additional attributes such as a normal vector and a color response. Although rendering and processing such data has been extensively studied, little attention has been devoted to using the light transport hidden in the recorded per-sample color response to relight virtual objects in visual effects (VFX) look-dev or augmented reality (AR) scenarios. Typically, a standard relighting environment exploits global environment maps together with a collection of local light probes to reflect the light mood of the real scene on the virtual object. We propose instead a unified spatial approximation of the radiance and visibility relationships present in the scene, in the form of a colored point cloud. To do so, our method relies on two core components: High Dynamic Range (HDR) expansion and real-time Point-Based Global Illumination (PBGI). First, since an acquired color point cloud typically comes in Low Dynamic Range (LDR) format, we boost it using a single HDR photo exemplar of the captured scene, which may cover only part of it. We perform this expansion efficiently by first expanding the dynamic range of a set of renderings of the point cloud and then projecting these renderings on the original cloud. At this stage, we propagate the expansion to the regions not covered by the renderings or with low-quality dynamic range by solving a Poisson system. Then, at rendering time, we use the resulting HDR point cloud to relight virtual objects, providing a diffuse model of the indirect illumination propagated by the environment. To do so, we design a PBGI algorithm that exploits the GPU's geometry shader stage as well as a new mipmapping operator, tailored for G-buffers, to achieve real-time performance. As a result, our method can effectively relight virtual objects exhibiting diffuse and glossy physically-based materials in real time. Furthermore, it accounts for the spatial embedding of the object within the 3D environment. We evaluate our approach on manufactured scenes to assess the error introduced at every step with respect to a perfect ground truth. We also report experiments with real captured data, covering a range of capture technologies, from active scanning to multiview stereo reconstruction.en_US
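Note on the Poisson-based propagation step: the abstract states that the HDR expansion computed on a set of renderings is propagated to point-cloud regions the renderings do not cover by solving a Poisson system. The sketch below is a minimal, hedged illustration of that kind of graph-Poisson propagation, not the authors' implementation; the inputs points and gain_obs, the neighbour count k, the data_weight parameter, and the function name propagate_gains are all illustrative assumptions.

    # Minimal sketch: diffuse observed per-point HDR expansion gains to
    # uncovered points via a screened Poisson (graph-Laplacian) system.
    import numpy as np
    from scipy.sparse import lil_matrix
    from scipy.sparse.linalg import spsolve
    from scipy.spatial import cKDTree

    def propagate_gains(points, gain_obs, k=8, data_weight=10.0):
        """points: (N, 3) array; gain_obs: per-point gain, NaN where the
        expanded renderings did not cover the point."""
        n = len(points)
        # k-nearest-neighbour graph over the point cloud (first hit is the point itself).
        _, nbrs = cKDTree(points).query(points, k=k + 1)
        L = lil_matrix((n, n))
        for i in range(n):
            for j in nbrs[i, 1:]:
                L[i, i] += 1.0      # combinatorial graph Laplacian
                L[i, j] -= 1.0
        covered = ~np.isnan(gain_obs)
        # Data term pins points whose gain was observed in the renderings.
        D = lil_matrix((n, n))
        D.setdiag(np.where(covered, data_weight, 0.0))
        b = np.where(covered, data_weight * gain_obs, 0.0)
        # Smoothness (Laplacian) plus data term: observed gains stay close to
        # their values while diffusing into uncovered regions.
        return spsolve((L + D).tocsr(), b)

    # Hypothetical usage: 3 of 5 points covered by the HDR renderings.
    pts = np.random.rand(5, 3)
    g = np.array([2.0, np.nan, 1.5, np.nan, 3.0])
    print(propagate_gains(pts, g, k=2))

The screened formulation (Laplacian plus a diagonal data term) is one common way to realize such a propagation: it keeps observed gains essentially fixed while filling uncovered regions smoothly.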
dc.publisherThe Eurographics Association and John Wiley & Sons Ltd.en_US
dc.subjectComputing methodologies
dc.subjectComputer graphics
dc.subjectRendering
dc.subjectRasterization
dc.subjectImage processing
dc.subjectPoint-based models
dc.subjectMixed / augmented reality
dc.titleHigh Dynamic Range Point Clouds for Real-Time Relightingen_US
dc.description.seriesinformationComputer Graphics Forum
dc.description.sectionheadersGlobal Illumination
dc.description.volume38
dc.description.number7
dc.identifier.doi10.1111/cgf.13857
dc.identifier.pages513-525



This item appears in the following Collection(s)

  • 38-Issue 7
    Pacific Graphics 2019 - Symposium Proceedings
