Adaptive Multi-view Path Tracing
Date
2019
Author
Fraboni, Basile
Iehl, Jean-Claude
Nivoliers, Vincent
Bouchard, Guillaume
Abstract
Rendering photo-realistic image sequences using path tracing and Monte Carlo integration often requires sampling a large number of paths to obtain converged results. In the context of rendering multiple views or animated sequences, such sampling can be highly redundant. Several methods have been developed to share sampled paths between spatially or temporally similar views. However, such sharing is challenging, since it can introduce bias in the final images. Our contribution is a Monte Carlo sampling technique that generates paths while taking several cameras into account. First, we sample the scene from all the cameras to generate hit points. Then, an importance sampling technique generates bouncing directions that are shared by a subset of cameras. This set of hit points and bouncing directions is then used within a regular path tracing solution. For animated scenes, paths remain valid only for a fixed time, but sharing can still occur between cameras as long as their exposure time intervals overlap. We show that our technique generates less noise than regular path tracing and does not introduce noticeable bias.
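Example
The following is a minimal, self-contained sketch of the general idea described in the abstract, not the authors' algorithm: one importance-sampled bounce direction is drawn and its radiance estimate is reused by every camera whose exposure interval covers the sample time. To keep the reuse trivially unbiased, the toy scene is a single Lambertian point with a fixed normal seen by all cameras; the names Camera, shade_shared, sample_cosine_hemisphere and sky_radiance are hypothetical.

import math
import random

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def sample_cosine_hemisphere(normal):
    # Cosine-weighted direction around a unit normal; pdf = cos(theta) / pi.
    u1, u2 = random.random(), random.random()
    r, phi = math.sqrt(u1), 2.0 * math.pi * u2
    local = (r * math.cos(phi), r * math.sin(phi), math.sqrt(max(0.0, 1.0 - u1)))
    axis = (1.0, 0.0, 0.0) if abs(normal[0]) < 0.9 else (0.0, 1.0, 0.0)
    t = normalize(cross(axis, normal))
    b = cross(normal, t)
    return tuple(local[0] * t[i] + local[1] * b[i] + local[2] * normal[i]
                 for i in range(3))

def sky_radiance(direction):
    # Toy environment light, brighter toward +z (hypothetical stand-in for Li).
    return 0.5 + 0.5 * max(0.0, direction[2])

class Camera:
    def __init__(self, name, shutter_open, shutter_close):
        self.name = name
        self.shutter = (shutter_open, shutter_close)
        self.accum, self.count = 0.0, 0

def shade_shared(cameras, normal, albedo, sample_time):
    # Draw ONE bounce direction and reuse its estimate for every camera
    # whose exposure interval contains sample_time.
    d = sample_cosine_hemisphere(normal)
    # Lambertian estimator: (albedo/pi) * Li * cos / pdf with pdf = cos/pi,
    # which simplifies to albedo * Li.
    estimate = albedo * sky_radiance(d)
    for cam in cameras:
        if cam.shutter[0] <= sample_time <= cam.shutter[1]:
            cam.accum += estimate
            cam.count += 1

if __name__ == "__main__":
    cams = [Camera("view0", 0.0, 0.5),
            Camera("view1", 0.25, 0.75),
            Camera("view2", 0.5, 1.0)]
    normal, albedo = (0.0, 0.0, 1.0), 0.7
    for _ in range(10000):
        shade_shared(cams, normal, albedo, sample_time=random.random())
    for cam in cams:
        print(cam.name, cam.accum / max(cam.count, 1))

In this toy setting each camera still averages estimates drawn from the correct cosine-weighted density, so sharing only amortizes the sampling cost; handling cameras with different hit points and normals without introducing bias is precisely the problem the paper addresses.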
BibTeX
@inproceedings {10.2312:sr.20191217,
booktitle = {Eurographics Symposium on Rendering - DL-only and Industry Track},
editor = {Boubekeur, Tamy and Sen, Pradeep},
title = {{Adaptive Multi-view Path Tracing}},
author = {Fraboni, Basile and Iehl, Jean-Claude and Nivoliers, Vincent and Bouchard, Guillaume},
year = {2019},
publisher = {The Eurographics Association},
ISSN = {1727-3463},
ISBN = {978-3-03868-095-6},
DOI = {10.2312/sr.20191217}
}