Towards Light‐Weight Portrait Matting via Parameter Sharing
Abstract
Traditional portrait matting methods typically consist of a trimap estimation network and a matting network. Here, we propose a new light‐weight portrait matting approach, termed parameter‐sharing portrait matting (PSPM). Unlike conventional portrait matting models, where the encoder and decoder networks of the two tasks are designed separately, PSPM employs a single shared encoder for both tasks, while each task retains its own task‐specific decoder. The encoder thus extracts semantic features, and the two decoders serve as bridges between the low‐resolution feature maps produced by the encoder and the high‐resolution feature maps needed for pixel‐wise classification/regression. In particular, we propose and investigate three variants that implement the parameter‐sharing portrait matting network. As demonstrated in our experiments, PSPM significantly reduces model capacity and computation cost, while matting accuracy deteriorates only slightly. In addition, qualitative and quantitative evaluations show that sharing the encoder is an effective way to achieve portrait matting under limited computational budgets, indicating a promising direction for real‐time portrait matting applications on mobile devices.
BibTeX
@article{10.1111:cgf.14179,
journal = {Computer Graphics Forum},
title = {{Towards Light‐Weight Portrait Matting via Parameter Sharing}},
author = {Dai, Yutong and Lu, Hao and Shen, Chunhua},
year = {2021},
publisher = {© 2021 Eurographics - The European Association for Computer Graphics and John Wiley \& Sons Ltd},
ISSN = {1467-8659},
DOI = {10.1111/cgf.14179}
}