Show simple item record

dc.contributor.author: Kán, Peter
dc.contributor.author: Kaufmann, Hannes
dc.contributor.editor: Carlos Andujar and Enrico Puppo
dc.date.accessioned: 2013-11-08T10:28:50Z
dc.date.available: 2013-11-08T10:28:50Z
dc.date.issued: 2012
dc.identifier.issn: 1017-4656
dc.identifier.uri: http://dx.doi.org/10.2312/conf/EG2012/short/089-092
dc.description.abstract: We present a novel method for rendering and compositing video in augmented reality. We focus on calculating the physically correct depth of field caused by a lens with a finite-sized aperture. To simulate light transport correctly, ray tracing is used and combined in a single pass with differential rendering to compose the final augmented video. The image is fully rendered on GPUs, so an augmented video can be produced at interactive frame rates in high quality. Our method runs on the fly; no video post-processing is needed. In addition, we evaluated user experiences with our rendering system under the hypothesis that a depth-of-field effect in augmented reality increases the realistic look of the composited video. Results with 30 users show that 90% perceive videos with depth of field as considerably more realistic.
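The depth-of-field effect the abstract refers to is commonly modeled with a thin-lens camera: instead of tracing every primary ray through a pinhole, the ray origin is jittered over a disk-shaped aperture and aimed at the point where the pinhole ray crosses the focal plane, so objects off that plane blur. The sketch below illustrates that general thin-lens sampling idea only; it is not the paper's GPU implementation, and all names (`thin_lens_ray`, the tuple-based vectors) are illustrative assumptions.

```python
import math
import random


def thin_lens_ray(pixel_dir, focal_distance, aperture_radius, rng=random):
    """Generate one depth-of-field primary ray for a thin-lens camera.

    Illustrative sketch only (not the paper's implementation).
    pixel_dir:      camera-space direction through the pixel, z > 0.
    focal_distance: distance from the lens to the focal plane along +z.
    aperture_radius: lens radius; 0 degenerates to a pinhole camera.
    Returns (origin, direction) with direction normalized.
    """
    # Point on the focal plane that the pinhole ray would hit:
    # every ray through the lens aimed at this point stays in focus.
    t = focal_distance / pixel_dir[2]
    focus = tuple(t * c for c in pixel_dir)

    # Uniform sample on the lens disk (polar method).
    r = aperture_radius * math.sqrt(rng.random())
    phi = 2.0 * math.pi * rng.random()
    origin = (r * math.cos(phi), r * math.sin(phi), 0.0)

    # New direction: from the lens sample toward the in-focus point.
    d = tuple(f - o for f, o in zip(focus, origin))
    n = math.sqrt(sum(c * c for c in d))
    return origin, tuple(c / n for c in d)
```

Averaging many such rays per pixel produces the blur of an out-of-focus lens; with `aperture_radius = 0` the origin collapses to the lens center and the sharp pinhole image is recovered.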
dc.publisher: The Eurographics Association
dc.subject: Categories and Subject Descriptors (according to ACM CCS): I.3.7 [Computer Graphics]: Three-Dimensional Graphics and Realism - Raytracing; I.3.8 [Computer Graphics]: Applications; H.5.1 [Information Interfaces and Presentation]: Multimedia Information Systems - Artificial, augmented, and virtual realities
dc.title: Physically-Based Depth of Field in Augmented Reality
dc.description.seriesinformation: Eurographics 2012 - Short Papers

