Online Adaptive PCA for Inverse Kinematics Hand Tracking
Abstract
Recent approaches to real-time bare hand tracking estimate the hand's pose and posture by fitting a virtual hand model to RGBD sensor data using inverse kinematics. It has been shown that exploiting natural hand synergies can improve the efficiency and quality of the tracking, by performing the optimization in a reduced parameter space consisting of realistic hand postures [SMRB14]. The downside, however, is that only postures within this subspace can be tracked reliably, thereby trading off flexibility and accuracy for performance and robustness. In this paper we extend the previous method by introducing an adaptive synergistic model that is automatically adjusted to observed hand articulations that are not covered by the initial subspace. Our adaptive model combines the robustness of tracking in a reduced parameter space with the flexibility of optimizing for the full articulation of the hand, which we demonstrate in several synthetic and real-world experiments.
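To make the idea of tracking in a reduced, adaptively updated posture space more concrete, the sketch below shows one possible reading of it in Python: a PCA basis computed from sampled joint-angle vectors parameterizes a full posture as mean + basis * alpha, and the basis is recomputed whenever an observed posture has a large reconstruction residual. All names here (AdaptivePoseSubspace, residual_threshold) and the refit-from-scratch update are illustrative assumptions for this sketch, not the adaptation scheme described in the paper, which performs the PCA update online during tracking.

import numpy as np

class AdaptivePoseSubspace:
    """Illustrative linear (PCA) subspace over hand joint angles.

    A full posture theta (a vector of joint angles) is approximated as
    theta ~= mean + basis @ alpha, where alpha has far fewer entries
    than theta. When an observed posture is poorly represented by the
    current subspace (large residual), it is added to the sample set
    and the basis is recomputed.
    """

    def __init__(self, postures, n_components=5, residual_threshold=0.2):
        self.n_components = n_components
        self.residual_threshold = residual_threshold
        self.samples = np.asarray(postures, dtype=float)  # shape (N, D)
        self._refit()

    def _refit(self):
        # PCA via SVD of the centered sample matrix.
        self.mean = self.samples.mean(axis=0)
        _, _, vt = np.linalg.svd(self.samples - self.mean, full_matrices=False)
        self.basis = vt[: self.n_components].T  # shape (D, k)

    def project(self, theta):
        """Low-dimensional coordinates alpha of a full posture theta."""
        return self.basis.T @ (np.asarray(theta, dtype=float) - self.mean)

    def reconstruct(self, alpha):
        """Full posture corresponding to subspace coordinates alpha."""
        return self.mean + self.basis @ alpha

    def observe(self, theta):
        """Adapt the subspace if theta is not well covered by it."""
        theta = np.asarray(theta, dtype=float)
        residual = np.linalg.norm(theta - self.reconstruct(self.project(theta)))
        if residual > self.residual_threshold:
            self.samples = np.vstack([self.samples, theta])
            self._refit()
        return residual

In an IK setting of this kind, the fitting objective would be optimized over the low-dimensional coordinates alpha rather than over all joint angles, and observe() would feed the converged full posture back in so that unseen articulations gradually extend the subspace.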
BibTeX
@inproceedings{10.2312:vmv.20141283,
  booktitle = {Vision, Modeling {\&} Visualization},
  editor = {Jan Bender and Arjan Kuijper and Tatiana von Landesberger and Holger Theisel and Philipp Urban},
  title = {{Online Adaptive PCA for Inverse Kinematics Hand Tracking}},
  author = {Schröder, Matthias and Botsch, Mario},
  year = {2014},
  publisher = {The Eurographics Association},
  ISBN = {978-3-905674-74-3},
  DOI = {10.2312/vmv.20141283}
}