dc.contributor.author | Caputo, F. M. | en_US |
dc.contributor.author | Burato, S. | en_US |
dc.contributor.author | Pavan, G. | en_US |
dc.contributor.author | Voillemin, T. | en_US |
dc.contributor.author | Wannous, H. | en_US |
dc.contributor.author | Vandeborre, J. P. | en_US |
dc.contributor.author | Maghoumi, M. | en_US |
dc.contributor.author | Taranta II, E. M. | en_US |
dc.contributor.author | Razmjoo, A. | en_US |
dc.contributor.author | LaViola Jr., J. J. | en_US |
dc.contributor.author | Manganaro, F. | en_US |
dc.contributor.author | Pini, S. | en_US |
dc.contributor.author | Borghi, G. | en_US |
dc.contributor.author | Vezzani, R. | en_US |
dc.contributor.author | Cucchiara, R. | en_US |
dc.contributor.author | Nguyen, H. | en_US |
dc.contributor.author | Tran, M. T. | en_US |
dc.contributor.author | Giachetti, A. | en_US |
dc.contributor.editor | Biasotti, Silvia and Lavoué, Guillaume and Veltkamp, Remco | en_US |
dc.date.accessioned | 2019-05-04T14:06:04Z | |
dc.date.available | 2019-05-04T14:06:04Z | |
dc.date.issued | 2019 | |
dc.identifier.isbn | 978-3-03868-077-2 | |
dc.identifier.issn | 1997-0471 | |
dc.identifier.uri | https://doi.org/10.2312/3dor.20191067 | |
dc.identifier.uri | https://diglib.eg.org:443/handle/10.2312/3dor20191067 | |
dc.description.abstract | This paper presents the results of the Eurographics 2019 SHape Retrieval Contest track on online gesture recognition. The goal of this contest was to test state-of-the-art methods for the online detection of command gestures from tracked hand movements, on a basic benchmark where simple gestures are performed interleaved with other actions. Unlike previous contests and benchmarks on trajectory-based gesture recognition, we proposed an online gesture recognition task: instead of providing pre-segmented gestures, we asked the participants to find gestures within recorded trajectories. The results submitted by the participants show that online detection and recognition of sets of very simple gestures from 3D trajectories captured with a cheap sensor can be performed effectively. The best proposed methods could therefore be directly exploited to design effective gesture-based interfaces for different contexts, from Virtual and Mixed Reality applications to the remote control of home devices. | en_US |
dc.publisher | The Eurographics Association | en_US |
dc.subject | Human-centered computing | |
dc.subject | Gestural input | |
dc.title | Online Gesture Recognition | en_US |
dc.description.seriesinformation | Eurographics Workshop on 3D Object Retrieval | |
dc.description.sectionheaders | SHREC Session 2 | |
dc.identifier.doi | 10.2312/3dor.20191067 | |
dc.identifier.pages | 93-102 | |