Towards Self-Perception in Augmented Virtuality: Hand Segmentation with Fully Convolutional Networks
Date
2018

Authors
Gonzalez-Sosa, Ester
Perez, Pablo
Kachach, Redouane
Ruiz, Jaime Jesus
Villegas, Alvaro
Abstract
In this work, we propose the use of deep learning techniques to segment items of interest from the user's real surroundings in order to increase self-presence in Virtual Reality (VR) scenarios. Our goal is to segment hand images captured from the perspective of a user wearing a VR headset. We create the VR Hand Dataset, composed of more than 10,000 images, with variations in hand position, scenario, outfit, sleeve, and person. We also describe the procedure followed to automatically generate ground-truth images and to create synthetic images. Preliminary results look promising.
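As a rough illustration of the kind of pipeline the abstract describes (not the authors' actual code), the sketch below trains an off-the-shelf fully convolutional network for binary hand/background segmentation. The dataset tensors, image size, and training settings are placeholders standing in for the VR Hand Dataset.

```python
# Minimal sketch, assuming a torchvision FCN and dummy data in place of the
# VR Hand Dataset; this is an illustration, not the authors' implementation.
import torch
import torch.nn as nn
from torchvision.models.segmentation import fcn_resnet50

device = "cuda" if torch.cuda.is_available() else "cpu"

# Two classes: 0 = background, 1 = hand.
model = fcn_resnet50(num_classes=2).to(device)
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

# Dummy batch standing in for egocentric RGB frames and ground-truth masks.
images = torch.randn(4, 3, 256, 256, device=device)        # RGB frames
masks = torch.randint(0, 2, (4, 256, 256), device=device)  # per-pixel labels

# One training step: per-pixel cross-entropy on the FCN class scores.
model.train()
optimizer.zero_grad()
logits = model(images)["out"]   # (N, 2, H, W)
loss = criterion(logits, masks)
loss.backward()
optimizer.step()

# Inference: per-pixel argmax yields a binary hand mask that could be
# composited into the virtual scene.
model.eval()
with torch.no_grad():
    pred_mask = model(images)["out"].argmax(dim=1)  # (N, H, W), values in {0, 1}
```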
BibTeX
@inproceedings {10.2312:egp.20181012,
booktitle = {EG 2018 - Posters},
editor = {Jain, Eakta and Kosinka, Jiří},
title = {{Towards Self-Perception in Augmented Virtuality: Hand Segmentation with Fully Convolutional Networks}},
author = {Gonzalez-Sosa, Ester and Perez, Pablo and Kachach, Redouane and Ruiz, Jaime Jesus and Villegas, Alvaro},
year = {2018},
publisher = {The Eurographics Association},
ISSN = {1017-4656},
DOI = {10.2312/egp.20181012}
}