A Unified Neural Network for Panoptic Segmentation
Abstract
In this paper, we propose a unified neural network for panoptic segmentation, a task that aims at more fine-grained segmentation by jointly labeling stuff and thing classes. Following existing methods that combine semantic and instance segmentation, our method relies on a triple-branch neural network to unify the two tasks. In the first stage, a ResNet50 with a feature pyramid network (FPN) serves as a shared backbone for feature extraction. Each branch then leverages the shared feature maps, serving as the stuff, things, or mask branch. Lastly, the branch outputs are fused following a well-designed strategy. Extensive experimental results on the MS-COCO dataset demonstrate that our approach achieves a Panoptic Quality (PQ) score competitive with the state of the art.
BibTeX
@article{10.1111:cgf.13852,
journal = {Computer Graphics Forum},
title = {{A Unified Neural Network for Panoptic Segmentation}},
author = {Yao, Li and Chyau, Ang},
year = {2019},
publisher = {The Eurographics Association and John Wiley \& Sons Ltd.},
ISSN = {1467-8659},
DOI = {10.1111/cgf.13852}
}