Next Event Estimation++: Visibility Mapping for Efficient Light Transport Simulation
Abstract
In Monte-Carlo rendering, determining the visibility between scene points is the most common and compute-intensive operation needed to establish paths between the camera and a light source. Unfortunately, many of these tests reveal occlusions, and the corresponding paths do not contribute to the final image. In this work, we present next event estimation++ (NEE++): a visibility mapping technique that performs visibility tests in a more informed way by caching voxel-to-voxel visibility probabilities. We show two scenarios: Russian-roulette-style rejection of visibility tests and direct importance sampling of the visibility. We demonstrate applications to next event estimation and light sampling in a uni-directional path tracer, and to light-subpath sampling in a bi-directional path tracer. The technique is simple to implement, easy to add to existing rendering systems, and comes at almost no cost, as the required information can be extracted directly from the rendering process itself. It discards up to 80% of visibility tests on average, while reducing variance by about 20% compared to other state-of-the-art light sampling techniques with the same number of samples. It gracefully handles complex scenes with efficiency similar to Metropolis light transport techniques, but with more uniform convergence.
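To make the Russian-roulette scenario concrete, the following is a minimal C++ sketch of rejecting shadow rays based on a cached voxel-to-voxel visibility probability. It is not the paper's implementation: the VisibilityMap structure, its dense grid storage, the optimistic prior for unseen voxel pairs, and the names voxel_of-style helpers would use are all assumptions made here for illustration; the paper's actual data structure and update rules may differ.

#include <cstddef>
#include <random>
#include <vector>

// Hypothetical voxel-to-voxel visibility cache (illustrative, not the
// paper's API). Counts are accumulated from the shadow rays the renderer
// traces anyway. Keep res small (e.g. 8): dense storage grows as res^6.
struct VisibilityMap {
    int res;                     // voxels per axis
    std::vector<float> visible;  // per voxel pair: #unoccluded tests
    std::vector<float> total;    // per voxel pair: #tests

    explicit VisibilityMap(int r)
        : res(r),
          visible(std::size_t(r) * r * r * r * r * r, 0.f),
          total(std::size_t(r) * r * r * r * r * r, 0.f) {}

    // Estimated probability that voxel a sees voxel b (a, b are linear
    // voxel indices in [0, res^3)). Unobserved pairs get an optimistic
    // prior of 1 so that no region is starved of visibility tests.
    float probability(std::size_t a, std::size_t b) const {
        std::size_t i = a * std::size_t(res) * res * res + b;
        return total[i] > 0.f ? visible[i] / total[i] : 1.f;
    }

    void record(std::size_t a, std::size_t b, bool unoccluded) {
        std::size_t i = a * std::size_t(res) * res * res + b;
        total[i] += 1.f;
        if (unoccluded) visible[i] += 1.f;
    }
};

// Russian-roulette rejection of a shadow ray between a shading point and a
// light sample: skip the (expensive) ray cast with probability equal to the
// cached occlusion estimate, and reweight surviving tests by 1/p so the
// visibility estimator stays unbiased.
float visibility_rr(VisibilityMap& map,
                    std::size_t vx_shade, std::size_t vx_light,
                    std::mt19937& rng, bool (*trace_shadow_ray)()) {
    float p = map.probability(vx_shade, vx_light);
    std::uniform_real_distribution<float> u01(0.f, 1.f);
    if (u01(rng) >= p) return 0.f;         // rejected: treat as occluded
    bool unoccluded = trace_shadow_ray();  // the actual (costly) ray cast
    map.record(vx_shade, vx_light, unoccluded);
    return unoccluded ? 1.f / p : 0.f;     // reweight for unbiasedness
}

The same cached probabilities could also drive the second scenario, importance sampling of light sources proportional to their estimated visibility from the shading point's voxel, rather than rejecting tests after the fact.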
BibTeX
@article{10.1111:cgf.14138,
  journal   = {Computer Graphics Forum},
  title     = {{Next Event Estimation++: Visibility Mapping for Efficient Light Transport Simulation}},
  author    = {Guo, Jerry Jinfeng and Eisemann, Martin and Eisemann, Elmar},
  year      = {2020},
  publisher = {The Eurographics Association and John Wiley \& Sons Ltd.},
  ISSN      = {1467-8659},
  DOI       = {10.1111/cgf.14138}
}