Improving NeRF Quality by Progressive Camera Placement for Free-Viewpoint Navigation
Abstract
Neural Radiance Fields, or NeRFs, have drastically improved novel view synthesis and 3D reconstruction for rendering. NeRFs achieve impressive results on object-centric reconstructions, but the quality of novel view synthesis with free-viewpoint navigation in complex environments (rooms, houses, etc.) is often problematic. While algorithmic improvements play an important role in the resulting quality of novel view synthesis, in this work we show that, because optimizing a NeRF is inherently a data-driven process, good-quality data play a fundamental role in the final quality of the reconstruction. As a consequence, it is critical to choose the data samples, in this case the cameras, in a way that eventually allows the optimization to converge to a solution enabling good-quality free-viewpoint navigation. Our main contribution is an algorithm that efficiently proposes new camera placements that improve visual quality with minimal assumptions. Our solution can be used with any NeRF model and outperforms baselines and similar work.
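The abstract does not spell out the selection criterion, so purely as an illustration of what a progressive camera-placement loop can look like, the sketch below alternates NeRF (re)training with greedy selection of the next camera from a candidate set. The scoring heuristic (distance to the nearest existing camera), the function names, and the candidate poses are all assumptions made for illustration; they are not the authors' method.

import numpy as np

def train_nerf(cameras):
    """Placeholder for training any NeRF model on the current camera set."""
    # In practice this would invoke an existing NeRF framework; here we only
    # keep the camera positions so the sketch stays self-contained.
    return {"cameras": list(cameras)}

def score_candidate(model, candidate_pose):
    """Assumed quality proxy: higher score = candidate expected to help more.

    This toy heuristic rewards covering poorly sampled regions by scoring the
    distance to the nearest existing camera; the paper's criterion differs.
    """
    existing = np.array(model["cameras"])
    return float(np.min(np.linalg.norm(existing - candidate_pose, axis=1)))

def progressive_placement(initial_cameras, candidate_poses, budget):
    """Greedily add `budget` cameras, retraining between additions."""
    cameras = list(initial_cameras)
    for _ in range(budget):
        model = train_nerf(cameras)                       # retrain / fine-tune
        scores = [score_candidate(model, c) for c in candidate_poses]
        best = int(np.argmax(scores))                     # pick most useful view
        cameras.append(candidate_poses.pop(best))         # capture it and add it
    return cameras

if __name__ == "__main__":
    init = [np.zeros(3), np.array([1.0, 0.0, 0.0])]
    candidates = [np.array([x, y, 0.0]) for x in range(3) for y in range(3)]
    final = progressive_placement(init, candidates, budget=4)
    print(f"final camera count: {len(final)}")

Any real deployment would replace the placeholder trainer and the distance-based score with the model and quality criterion of choice; the point of the sketch is only the train / score / add loop structure.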
BibTeX
@inproceedings{10.2312:vmv.20231222,
  booktitle = {Vision, Modeling, and Visualization},
  editor    = {Guthe, Michael and Grosch, Thorsten},
  title     = {{Improving NeRF Quality by Progressive Camera Placement for Free-Viewpoint Navigation}},
  author    = {Kopanas, Georgios and Drettakis, George},
  year      = {2023},
  publisher = {The Eurographics Association},
  ISBN      = {978-3-03868-232-5},
  DOI       = {10.2312/vmv.20231222}
}