
dc.contributor.author: Oshita, Masaki (en_US)
dc.contributor.editor: Joaquim Armando Pires Jorge, Eric Galin, and John F. Hughes (en_US)
dc.date.accessioned: 2014-01-27T18:22:27Z
dc.date.available: 2014-01-27T18:22:27Z
dc.date.issued: 2004 (en_US)
dc.identifier.isbn: 3-905673-16-9 (en_US)
dc.identifier.issn: 1812-3503 (en_US)
dc.identifier.uri: http://dx.doi.org/10.2312/SBM/SBM04/043-052 (en_US)
dc.description.abstract: This paper presents an intuitive pen-based interface for controlling a virtual human figure interactively. Recent commercial pen devices can detect not only the pen position but also the pressure and tilt of the pen. We utilize this information to make a human figure perform various types of motions in response to the user's pen movements. The figure walks, runs, turns, and steps according to the trajectory and speed of the pen. It also bends, stretches, and tilts in response to the tilt of the pen, and ducks and jumps in response to the pen pressure. Using this interface, the user controls a virtual human figure intuitively, as if he or she were holding a virtual puppet and playing with it. In addition to the interface design, this paper describes a motion generation engine that produces various motions from the parameters given by the pen interface. We take a motion blending approach and construct motion blending modules from a small set of motion capture data for each type of motion. Finally, we discuss the effectiveness and limitations of the interface based on preliminary experiments. (en_US)
dc.publisher: The Eurographics Association (en_US)
dc.title: Pen-to-mime: A Pen-Based Interface for Interactive Control of a Human Figure (en_US)
dc.description.seriesinformation: Sketch Based Interfaces and Modeling (en_US)
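
The abstract above describes a mapping from pen input (trajectory, speed, tilt, pressure) to motion parameters that drive motion blending modules built from a small set of motion capture clips. As an illustration only, the Python sketch below shows what such a mapping and a simple speed-based blend-weight computation could look like; all names, thresholds, and the linear interpolation scheme are assumptions and are not taken from the paper.

```python
# Hypothetical sketch: pen input -> motion parameters -> blend weights.
# Field names, thresholds, and the linear blending scheme are assumptions
# for illustration; they are not taken from the paper.
from dataclasses import dataclass

@dataclass
class PenSample:
    x: float          # pen position on the tablet (normalized 0..1)
    y: float
    speed: float      # pen-tip speed (normalized 0..1)
    pressure: float   # 0 (light touch) .. 1 (full pressure)
    tilt_x: float     # pen tilt in degrees, roughly -60..60
    tilt_y: float

@dataclass
class MotionParams:
    gait: str         # "step", "walk", or "run", selected from pen speed
    speed: float      # forward speed passed to a motion blending module
    bend: float       # forward/backward bend derived from pen tilt, -1..1
    lean: float       # sideways lean derived from pen tilt, -1..1
    duck: bool        # triggered by strong pen pressure

WALK_SPEED = 0.2      # assumed thresholds; the abstract gives no values
RUN_SPEED = 0.6
DUCK_PRESSURE = 0.8

def pen_to_motion(s: PenSample) -> MotionParams:
    """Map one pen sample to parameters a motion generation engine could consume."""
    if s.speed < WALK_SPEED:
        gait = "step"
    elif s.speed < RUN_SPEED:
        gait = "walk"
    else:
        gait = "run"
    clamp = lambda v: max(-1.0, min(1.0, v))
    return MotionParams(
        gait=gait,
        speed=s.speed,
        bend=clamp(s.tilt_y / 60.0),
        lean=clamp(s.tilt_x / 60.0),
        duck=s.pressure > DUCK_PRESSURE,
    )

def blend_weights(target_speed: float, example_speeds: list[float]) -> list[float]:
    """Weights over example clips sorted by nominal speed: linear interpolation
    between the two clips that bracket the target (one simple blending scheme)."""
    w = [0.0] * len(example_speeds)
    if target_speed <= example_speeds[0]:
        w[0] = 1.0
    elif target_speed >= example_speeds[-1]:
        w[-1] = 1.0
    else:
        for i in range(len(example_speeds) - 1):
            lo, hi = example_speeds[i], example_speeds[i + 1]
            if lo <= target_speed <= hi:
                t = (target_speed - lo) / (hi - lo)
                w[i], w[i + 1] = 1.0 - t, t
                break
    return w

if __name__ == "__main__":
    sample = PenSample(x=0.5, y=0.5, speed=0.45, pressure=0.3, tilt_x=10.0, tilt_y=-20.0)
    params = pen_to_motion(sample)
    print(params)
    # e.g. example clips recorded at four nominal speeds (stand/step/walk/run)
    print(blend_weights(params.speed, [0.0, 0.3, 0.7, 1.0]))
```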

