Real-time Depth of Field Rendering via Dynamic Light Field Generation and Filtering
Abstract
We present a new algorithm for efficient rendering of high-quality depth-of-field (DoF) effects. We start with a single rasterized view (reference view) of the scene, and sample the light field by warping the reference view to nearby views. We implement the algorithm using NVIDIA's CUDA to achieve parallel processing, and exploit atomic operations to resolve visibility when multiple pixels warp to the same image location. We then directly synthesize DoF effects from the sampled light field. To reduce aliasing artifacts, we propose an image-space filtering technique that compensates for spatial undersampling using MIP mapping. The main advantages of our algorithm are its simplicity and generality. We demonstrate interactive rendering of DoF effects in several complex scenes. Compared to existing methods, ours does not require ray tracing and hence scales well with scene complexity.
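A minimal CUDA sketch of the visibility-resolution step described in the abstract, written under our own assumptions (the authors' actual implementation may differ): each thread forward-warps one reference-view pixel to a nearby light-field view and uses a 64-bit atomicMin on a packed {depth, source index} key, so that the nearest fragment survives when several pixels land on the same target pixel. The kernel name warpToView, the depth-to-parallax mapping, and the packing scheme are illustrative, not taken from the paper; 64-bit atomicMin requires compute capability 3.5 or later.

#include <cuda_runtime.h>

// viewBuf entries are initialized to ~0ULL ("empty") before each launch.
__global__ void warpToView(const float* refDepth,        // reference-view depth map (w*h)
                           unsigned long long* viewBuf,  // per-pixel packed {depth, srcIndex} keys
                           int w, int h,
                           float dx, float dy)           // lens offset of this light-field view
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= w || y >= h) return;

    int   src = y * w + x;
    float z   = refDepth[src];

    // Forward warp: the image-space parallax of a lens sample is the lens
    // offset scaled by the pixel's defocus (calibration constants are folded
    // into dx, dy for brevity in this sketch).
    int tx = x + __float2int_rn(dx / z);
    int ty = y + __float2int_rn(dy / z);
    if (tx < 0 || tx >= w || ty < 0 || ty >= h) return;

    // Pack the depth bits (monotonic for positive IEEE floats) above the
    // source pixel index, so atomicMin orders fragments by depth and keeps
    // the nearest one per target pixel.
    unsigned long long key =
        (static_cast<unsigned long long>(__float_as_uint(z)) << 32)
        | static_cast<unsigned long long>(src);
    atomicMin(&viewBuf[ty * w + tx], key);  // resolve visibility atomically
}

After all views are warped, the low 32 bits of each surviving key index back into the reference image to fetch color, and the synthesized views are averaged (with the MIP-map filtering the abstract describes) to produce the final DoF image.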
BibTeX
@article{10.1111:j.1467-8659.2010.01797.x,
  journal   = {Computer Graphics Forum},
  title     = {{Real-time Depth of Field Rendering via Dynamic Light Field Generation and Filtering}},
  author    = {Yu, Xuan and Wang, Rui and Yu, Jingyi},
  year      = {2010},
  publisher = {The Eurographics Association and Blackwell Publishing Ltd},
  ISSN      = {1467-8659},
  DOI       = {10.1111/j.1467-8659.2010.01797.x}
}