AvatarGo: Plug and Play self-avatars for VR
Abstract
The use of self-avatars in a VR application can enhance presence and embodiment, which leads to a better user experience. In collaborative VR, it also facilitates non-verbal communication. Currently, it is possible to track a few body parts with inexpensive trackers and then apply IK methods to animate a character. However, the correspondence between trackers and avatar joints is typically fixed ad hoc, which is enough to animate the avatar but causes noticeable mismatches between the user's body pose and the avatar. In this paper we present a fast, easy-to-set-up system to compute exact offset values, unique for each user, which leads to improved avatar movement. Our user study shows that the Sense of Embodiment increased significantly when using exact offsets as opposed to fixed ones. We also allowed users to see a semitransparent avatar overlaid on their real body in order to objectively evaluate the quality of the avatar movement with our technique.
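The abstract does not detail the calibration itself, but the core idea of per-user tracker-to-joint offsets can be sketched roughly as follows. This is a minimal sketch, not the paper's implementation: it assumes world-space 4x4 homogeneous transforms, a fixed tracker-to-joint pairing, and a single calibration pose in which the user stands aligned with the avatar; the function names and data layout are hypothetical.

import numpy as np

def compute_offsets(tracker_poses, joint_poses, pairing):
    # During a single calibration pose, store for each (tracker, joint)
    # pair the rigid transform that maps the tracker frame onto the
    # joint frame; this transform is unique to the current user.
    offsets = {}
    for tracker_id, joint_id in pairing.items():
        T_tracker = tracker_poses[tracker_id]   # 4x4 world-space pose
        T_joint = joint_poses[joint_id]         # 4x4 world-space pose
        offsets[tracker_id] = np.linalg.inv(T_tracker) @ T_joint
    return offsets

def drive_joint(tracker_pose, offset):
    # At runtime, re-apply the stored per-user offset to the live
    # tracker pose to obtain the IK target for the avatar joint.
    return tracker_pose @ offset

At runtime, the output of drive_joint would be fed to the IK solver as the target for the corresponding joint; a fixed-offset approach would instead use a constant, user-independent offset at this step.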
BibTeX
@inproceedings {10.2312:egs.20221037,
booktitle = {Eurographics 2022 - Short Papers},
editor = {Pelechano, Nuria and Vanderhaeghe, David},
title = {{AvatarGo: Plug and Play self-avatars for VR}},
author = {Ponton, Jose Luis and Monclús, Eva and Pelechano, Nuria},
year = {2022},
publisher = {The Eurographics Association},
ISSN = {1017-4656},
ISBN = {978-3-03868-169-4},
DOI = {10.2312/egs.20221037}
}