
dc.contributor.author: Duke, D.J. [en_US]
dc.date.accessioned: 2014-10-21T07:37:54Z
dc.date.available: 2014-10-21T07:37:54Z
dc.date.issued: 1995 [en_US]
dc.identifier.issn: 1467-8659 [en_US]
dc.identifier.uri: http://dx.doi.org/10.1111/j.1467-8659.1995.cgf143-0055.x [en_US]
dc.description.abstract: Many of the reported developments in the design of virtual spaces or visualisation systems are based on improvements in technology, either physical devices or algorithms for achieving realistic renderings within real-time constraints. While this experimental approach produces a wealth of empirical results, it operates largely without a sound underlying theory that can be used to design systems that will effectively support users in real-world domains. One of the main problems is that these sophisticated technologies rely on, but rarely assess, the cognitive abilities of the user. This paper introduces a new approach to modelling human-system interaction. A syndetic model combines a formal expression of system behaviour with an approximate representation of cognitive resources to allow reasoning about the flow and utilisation of information within the combined system. The power of the approach to provide insight into novel interaction techniques is illustrated by developing a syndetic model of a gesture-driven user interface. [en_US]
dc.publisher: Blackwell Science Ltd and the Eurographics Association [en_US]
dc.title: Reasoning About Gestural Interaction [en_US]
dc.description.seriesinformation: Computer Graphics Forum [en_US]
dc.description.volume: 14 [en_US]
dc.description.number: 3 [en_US]
dc.identifier.doi: 10.1111/j.1467-8659.1995.cgf143-0055.x [en_US]
dc.identifier.pages: 55-66 [en_US]

