Neural Temporal Adaptive Sampling and Denoising
Date
2020
Author
Hasselgren, Jon
Munkberg, Jacob
Salvi, Marco
Patney, Anjul
Lefohn, Aaron
Abstract
Despite recent advances in Monte Carlo path tracing at interactive rates, denoised image sequences generated with few samples per pixel often yield temporally unstable results and a loss of high-frequency detail. We present a novel adaptive rendering method that increases the temporal stability and image fidelity of low sample count path tracing by distributing samples via spatio-temporal joint optimization of sampling and denoising. Adding temporal optimization to the sample predictor enables it to learn spatio-temporal sampling strategies such as placing more samples in disoccluded regions, tracking specular highlights, etc.; adding temporal feedback to the denoiser boosts the effective input sample count and increases temporal stability. The temporal approach also allows us to remove the initial uniform sampling step typically present in adaptive sampling algorithms. The sample predictor and denoiser are deep neural networks that we co-train end-to-end over multiple consecutive frames. Our approach is scalable, allowing a trade-off between quality and performance, and runs at near real-time rates while achieving significantly better image quality and temporal stability than previous methods.
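The abstract describes the architecture only at a high level; below is a minimal PyTorch-style sketch of how such a spatio-temporal loop could look: a sample-map predictor and a denoiser co-trained end-to-end over consecutive frames, with the previous denoised frame fed back to both networks. The network sizes, feature channels, loss, and the render() stand-in are illustrative assumptions, not the paper's implementation.

# Minimal sketch, assuming PyTorch. Networks, channel counts, loss and the
# render() stand-in are illustrative assumptions, not the paper's implementation.
import torch
import torch.nn as nn

class SamplePredictor(nn.Module):
    """Predicts a per-pixel sample-density map from auxiliary features and the
    previous denoised frame (temporal feedback)."""
    def __init__(self, in_ch):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, 3, padding=1), nn.Softplus(),  # non-negative sample densities
        )

    def forward(self, x):
        return self.net(x)

class Denoiser(nn.Module):
    """Denoises the sparsely sampled frame, also conditioned on temporal feedback."""
    def __init__(self, in_ch):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1),
        )

    def forward(self, x):
        return self.net(x)

def render(reference, sample_map):
    # Stand-in for the path tracer: Monte Carlo noise whose variance falls
    # roughly as 1/N with the local sample count, which keeps a differentiable
    # path from the denoising loss back to the sample predictor.
    return reference + torch.randn_like(reference) / torch.sqrt(sample_map + 1.0)

feat_ch = 7  # e.g. albedo (3) + normal (3) + depth (1); assumed feature layout
sampler = SamplePredictor(feat_ch + 3)  # features + previous denoised RGB
denoiser = Denoiser(3 + feat_ch + 3)    # noisy RGB + features + previous denoised RGB
opt = torch.optim.Adam(list(sampler.parameters()) + list(denoiser.parameters()), lr=1e-4)

# Co-train end-to-end over a short sequence of consecutive frames so that
# gradients flow through the temporal feedback path.
B, H, W, T = 2, 64, 64, 4
prev_denoised = torch.zeros(B, 3, H, W)
loss = 0.0
for t in range(T):
    feats = torch.randn(B, feat_ch, H, W)   # stand-in for per-frame G-buffer features
    target = torch.randn(B, 3, H, W)        # stand-in for reference (high-sample) frames
    sample_map = sampler(torch.cat([feats, prev_denoised], dim=1))
    noisy = render(target, sample_map)
    denoised = denoiser(torch.cat([noisy, feats, prev_denoised], dim=1))
    loss = loss + nn.functional.l1_loss(denoised, target)
    prev_denoised = denoised                # temporal feedback into the next frame
opt.zero_grad()
loss.backward()
opt.step()

Because the loss is accumulated over the whole frame sequence before the backward pass, the sample predictor receives gradients not only from the current frame but also, through the feedback path, from later frames in the sequence; this is what lets it learn temporal strategies such as concentrating samples in disoccluded regions.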
BibTeX
@article {10.1111:cgf.13919,
journal = {Computer Graphics Forum},
title = {{Neural Temporal Adaptive Sampling and Denoising}},
author = {Hasselgren, Jon and Munkberg, Jacob and Salvi, Marco and Patney, Anjul and Lefohn, Aaron},
year = {2020},
publisher = {The Eurographics Association and John Wiley & Sons Ltd.},
ISSN = {1467-8659},
DOI = {10.1111/cgf.13919}
}