Active Scene Understanding via Online Semantic Reconstruction
Date: 2019

Abstract
We propose a novel approach to robot-operated active understanding of unknown indoor scenes, based on online RGBD reconstruction with semantic segmentation. In our method, the exploratory robot scanning is both driven by and targeted at the recognition and segmentation of semantic objects in the scene. Our algorithm is built on top of a volumetric depth fusion framework and performs real-time voxel-based semantic labeling over the online reconstructed volume. The robot is guided by an online estimated discrete viewing score field (VSF), parameterized over the 3D space of 2D location and azimuth rotation. The VSF stores, for each grid cell, the score of the corresponding view, which measures how much that view reduces the uncertainty (entropy) of both the geometric reconstruction and the semantic labeling. Based on the VSF, we select the next best view (NBV) as the target for each time step. We then jointly optimize the traverse path and camera trajectory between two adjacent NBVs by maximizing the integral viewing score (information gain) along the path and trajectory. Through extensive evaluation, we show that our method achieves efficient and accurate online scene parsing during exploratory scanning.
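To make the NBV machinery in the abstract concrete, the following is a minimal Python sketch of scoring views by entropy reduction over a discrete VSF grid and picking the best one. It is an illustration under stated assumptions, not the paper's implementation: the grid resolution, the function names (view_score, next_best_view, path_gain), and the equal weighting of geometric and semantic entropy are all hypothetical.

import numpy as np

# Assumed discretization: views are parameterized over 2D location
# and azimuth rotation, as in the abstract; resolutions are made up.
NX, NY, NTHETA = 64, 64, 16


def entropy(p):
    """Shannon entropy of discrete distributions along the last axis."""
    p = np.clip(p, 1e-12, 1.0)
    return -np.sum(p * np.log(p), axis=-1)


def view_score(recon_prob, label_prob, visible_voxels):
    """Score a view by the uncertainty (entropy) of the voxels it sees,
    combining occupancy and semantic-label entropy. The 0.5/0.5
    weighting is an assumption; the paper may combine them differently."""
    geo = entropy(recon_prob[visible_voxels])   # geometric uncertainty
    sem = entropy(label_prob[visible_voxels])   # labeling uncertainty
    return 0.5 * geo.sum() + 0.5 * sem.sum()


def next_best_view(vsf):
    """Return the (ix, iy, itheta) grid index of the highest-scoring view."""
    return np.unravel_index(np.argmax(vsf), vsf.shape)


def path_gain(vsf, path):
    """Integral viewing score (information gain) accumulated along a
    candidate path, given as a sequence of (ix, iy, itheta) indices;
    a planner would maximize this between two adjacent NBVs."""
    return sum(vsf[ix, iy, it] for ix, iy, it in path)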
BibTeX
@article{10.1111:cgf.13820,
  journal = {Computer Graphics Forum},
  title = {{Active Scene Understanding via Online Semantic Reconstruction}},
  author = {Zheng, Lintao and Zhu, Chenyang and Zhang, Jiazhao and Zhao, Hang and Huang, Hui and Niessner, Matthias and Xu, Kai},
  year = {2019},
  publisher = {The Eurographics Association and John Wiley \& Sons Ltd.},
  ISSN = {1467-8659},
  DOI = {10.1111/cgf.13820}
}