
dc.contributor.author: Duchowski, Andrew
dc.contributor.author: Marmitt, Gerd
dc.date.accessioned: 2015-11-12T07:16:50Z
dc.date.available: 2015-11-12T07:16:50Z
dc.date.issued: 2002
dc.identifier.issn: 1017-4656
dc.identifier.uri: http://dx.doi.org/10.2312/egs.20021022
dc.description.abstract: Dynamic human vision is an important contributing factor to the design of perceptually-based Virtual Reality. A common strategy relies on either an implicit assumption or explicit measurement of gaze direction. Given the spatial location of foveal vision, computational resources are directed at enhancing the foveated region in real time. To obtain an explicit gaze measurement, an eye tracker may be used. In the absence of an eye tracker, a computational model of visual attention may be substituted to predict visually salient features. The fidelity of the resultant real-time system hinges on the agreement between predicted and actual regions foveated by the human. The contributions of this paper are the development and evaluation of a novel method for the comparison of human and artificial scanpaths recorded in VR. The novelty of the present approach is the application of previous accuracy measures to scanpath comparison in VR, where analysis is complicated by head movements and dynamic imagery. An attentional model previously used for view-dependent enhancement of Virtual Reality is evaluated. Analysis shows that the correlation between human and artificial scanpaths is much lower than expected. Recommendations are made for improvements to the model to foster closer correspondence to human attentional patterns in VR.
dc.publisher: Eurographics Association
dc.title: Modeling Visual Attention in VR: Measuring the Accuracy of Predicted Scanpaths
dc.description.seriesinformation: Eurographics 2002 - Short Presentations
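
The abstract above is about measuring how closely model-predicted scanpaths agree with human scanpaths recorded in VR. As a rough illustration of the general idea only (not the paper's actual accuracy measure), the sketch below quantizes gaze points into labelled regions and scores two scanpaths with a normalized string-edit similarity. The grid size, region labelling, and flat 2D treatment of gaze are simplifying assumptions, and head movement and dynamic imagery, which the abstract identifies as the complicating factors in VR, are ignored here.

```python
"""
Minimal sketch of one common scanpath-comparison idea: quantize each gaze
sample (human or model-predicted) to a labelled region, then score the
agreement of the two label sequences with a normalized string-edit
similarity. Illustrative assumption only; not the accuracy measure used
in the paper.
"""

from typing import List, Tuple


def quantize(scanpath: List[Tuple[float, float]], grid: int = 5) -> str:
    """Map normalized (x, y) gaze points in [0, 1]^2 to a string of region labels."""
    labels = []
    for x, y in scanpath:
        col = min(int(x * grid), grid - 1)
        row = min(int(y * grid), grid - 1)
        labels.append(chr(ord("A") + row * grid + col))
    return "".join(labels)


def edit_distance(a: str, b: str) -> int:
    """Levenshtein distance via two-row dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            cost = 0 if ca == cb else 1
            curr.append(min(prev[j] + 1,          # deletion
                            curr[j - 1] + 1,      # insertion
                            prev[j - 1] + cost))  # substitution
        prev = curr
    return prev[-1]


def scanpath_similarity(human, predicted, grid: int = 5) -> float:
    """Return a similarity in [0, 1]; 1 means identical region sequences."""
    s1, s2 = quantize(human, grid), quantize(predicted, grid)
    if not s1 and not s2:
        return 1.0
    return 1.0 - edit_distance(s1, s2) / max(len(s1), len(s2))


if __name__ == "__main__":
    # Hypothetical normalized gaze samples for a human viewer and an attention model.
    human_path = [(0.10, 0.12), (0.52, 0.48), (0.80, 0.75)]
    model_path = [(0.11, 0.15), (0.55, 0.50), (0.20, 0.90)]
    print(f"scanpath similarity: {scanpath_similarity(human_path, model_path):.2f}")
```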

