Progressively-Refined Reflectance Functions from Natural Illumination
Abstract
In this paper we present a simple, robust, and efficient algorithm for estimating reflectance fields (i.e., a description of the transport of light through a scene) for a fixed viewpoint using images of the scene under known natural illumination. Our algorithm treats the scene as a black-box linear system that transforms an input signal (the incident light) into an output signal (the reflected light). The algorithm is hierarchical: it progressively refines the approximation of the reflectance field with an increasing number of training samples until the required precision is reached. Our method relies on a new representation for reflectance fields. This representation is compact, can be progressively refined, and quickly computes the relighting of scenes with complex illumination. Our representation and the corresponding algorithm allow us to efficiently estimate the reflectance fields of scenes with specular, glossy, refractive, and diffuse elements. The method also handles soft and hard shadows, inter-reflections, caustics, and subsurface scattering. We verify our algorithm and representation using two measurement setups and several scenes, including an outdoor view of the city of Cambridge.
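The black-box linear-system view described above can be sketched as follows: the reflected image is a linear function of the incident illumination, so relighting reduces to a matrix-vector product with the (measured) transport matrix. This is a minimal illustration of that linearity, not the paper's estimation algorithm; the dimensions and names (`n_light`, `n_pixels`, `relight`) are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch (not the paper's method): model the scene as a linear
# system b = T x, where x is the incident-illumination vector, b is the
# reflected-light vector at the camera, and T is the transport matrix that
# a reflectance-field estimation procedure would approximate.
rng = np.random.default_rng(0)
n_light = 16    # number of incident-illumination basis elements (assumed)
n_pixels = 8    # number of camera pixels at the fixed viewpoint (assumed)

# Stand-in transport matrix; in practice this is what gets estimated.
T = rng.random((n_pixels, n_light))

def relight(T, illumination):
    """Relight the scene: reflected light = transport matrix @ incident light."""
    return T @ illumination

# Linearity means superposition holds: relighting with a sum of two
# illumination conditions equals the sum of the two relit images.
x1, x2 = rng.random(n_light), rng.random(n_light)
assert np.allclose(relight(T, x1 + x2), relight(T, x1) + relight(T, x2))
```

Superposition is what lets such methods relight the scene under arbitrary novel illumination once the transport matrix has been estimated from training samples.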
BibTeX
@inproceedings{10.2312:EGWR:EGSR04:299-308,
  booktitle = {Eurographics Workshop on Rendering},
  editor    = {Alexander Keller and Henrik Wann Jensen},
  title     = {{Progressively-Refined Reflectance Functions from Natural Illumination}},
  author    = {Matusik, Wojciech and Loper, Matthew and Pfister, Hanspeter},
  year      = {2004},
  publisher = {The Eurographics Association},
  ISSN      = {1727-3463},
  ISBN      = {3-905673-12-6},
  DOI       = {10.2312/EGWR/EGSR04/299-308}
}