Deep Screen Space for Indirect Lighting of Volumes
Abstract
We present a method to render approximate indirect light transport from surfaces to volumes that is fully dynamic with respect to geometry, the medium, and the main light sources, running at interactive speed. This is achieved in a three-step procedure. First, the scene is turned into a view-dependent level-of-detail surfel cloud using fast hardware tessellation. These surfels are lit and represent the senders of indirect light. Second, the current view of the volume is converted into a transmittance interval map, containing depth intervals in which the transmittance to the camera is reduced by the same fraction of the total extinction. These intervals will receive indirect illumination. Finally, surfels and intervals are linked by splatting the effect of the surfels into a hierarchical framebuffer. This linking delivers high precision between surfel-interval pairs that exchange much light and is coarser for pairs exchanging little, without constructing any explicit hierarchical data structure.
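The second step above partitions each view ray into depth intervals of equal optical-depth fraction, so that transmittance drops by the same factor across every interval. A minimal sketch of this idea is given below; the function name, the uniform sampling of the extinction field, and the linear interpolation inside a sample are illustrative assumptions, not the paper's actual GPU implementation.

```python
def transmittance_intervals(extinction, dz, n_intervals):
    """Split a view ray into depth intervals of equal optical-depth fraction.

    extinction  : per-sample extinction coefficients along the ray (front to back)
    dz          : constant step length between samples
    n_intervals : number of intervals to produce
    Returns the list of depth boundaries [z_0, ..., z_n].
    """
    # Total optical depth along the ray; transmittance is T(z) = exp(-tau(z)),
    # so equal steps in tau correspond to equal multiplicative drops in T.
    total_tau = sum(extinction) * dz
    target = total_tau / n_intervals

    boundaries = [0.0]
    accumulated = 0.0
    k = 1
    for i, sigma in enumerate(extinction):
        step_tau = sigma * dz
        # Emit a boundary each time the accumulated optical depth crosses
        # the next multiple of total_tau / n_intervals.
        while k < n_intervals and step_tau > 0.0 and accumulated + step_tau >= k * target:
            # Linear interpolation inside the current sample for the exact depth.
            f = (k * target - accumulated) / step_tau
            boundaries.append((i + f) * dz)
            k += 1
        accumulated += step_tau
    boundaries.append(len(extinction) * dz)
    return boundaries
```

For a homogeneous medium this reduces to uniform depth slices; in a heterogeneous medium the boundaries crowd together where extinction is high, which is what lets each interval receive indirect light at comparable accuracy.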
BibTeX
@inproceedings{10.2312:vmv.20141287,
booktitle = {Vision, Modeling \& Visualization},
editor = {Jan Bender and Arjan Kuijper and Tatiana von Landesberger and Holger Theisel and Philipp Urban},
title = {{Deep Screen Space for Indirect Lighting of Volumes}},
author = {Nalbach, Oliver and Ritschel, Tobias and Seidel, Hans-Peter},
year = {2014},
publisher = {The Eurographics Association},
ISBN = {978-3-905674-74-3},
DOI = {10.2312/vmv.20141287}
}