dc.contributor.author | Kán, Peter | en_US |
dc.contributor.author | Kaufmann, Hannes | en_US |
dc.contributor.editor | Carlos Andujar and Enrico Puppo | en_US |
dc.date.accessioned | 2013-11-08T10:28:50Z | |
dc.date.available | 2013-11-08T10:28:50Z | |
dc.date.issued | 2012 | en_US |
dc.identifier.issn | 1017-4656 | en_US |
dc.identifier.uri | http://dx.doi.org/10.2312/conf/EG2012/short/089-092 | en_US |
dc.description.abstract | We present a novel method for rendering and compositing video in augmented reality. We focus on calculating the physically correct result of the depth of field caused by a lens with a finite-sized aperture. In order to correctly simulate light transport, ray tracing is used and, in a single pass, combined with differential rendering to compose the final augmented video. The image is fully rendered on the GPU; therefore an augmented video can be produced in high quality at interactive frame rates. Our method runs on the fly; no video post-processing is needed. In addition, we evaluated the user experience with our rendering system under the hypothesis that a depth of field effect in augmented reality increases the realistic look of the composited video. Results from 30 users show that 90% perceive videos with depth of field as considerably more realistic. | en_US |
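The abstract names two ingredients: ray tracing through a finite-sized (thin-lens) aperture to obtain physically based depth of field, and differential rendering to composite the result with the camera video. The C++ sketch below illustrates both ideas on the CPU under simple assumptions (camera at the origin looking down -z, focal distance measured along the optical axis); all names such as Vec3, thinLensRay and compositePixel are hypothetical, and this is a minimal illustration rather than the authors' single-pass GPU implementation.

#include <cmath>
#include <random>

struct Vec3 { float x, y, z; };
static Vec3 operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 operator*(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }

struct Ray { Vec3 origin, dir; };

// Uniformly sample a point on a circular aperture of radius 'apertureRadius'
// (simple rejection sampling; the paper does not prescribe a particular sampler).
static void sampleAperture(std::mt19937& rng, float apertureRadius, float& u, float& v) {
    std::uniform_real_distribution<float> uni(-1.0f, 1.0f);
    do { u = uni(rng); v = uni(rng); } while (u * u + v * v > 1.0f);
    u *= apertureRadius;
    v *= apertureRadius;
}

// Thin-lens camera ray: offset a pinhole ray's origin on the lens and re-aim it
// at the plane of focus, so geometry at 'focalDistance' stays sharp and
// everything closer or farther is defocused.
static Ray thinLensRay(const Ray& pinhole, std::mt19937& rng,
                       float apertureRadius, float focalDistance) {
    float u, v;
    sampleAperture(rng, apertureRadius, u, v);

    // Point on the plane of focus that the original pinhole ray would hit.
    float t = focalDistance / std::fabs(pinhole.dir.z);
    Vec3 focusPoint = pinhole.origin + pinhole.dir * t;

    // New origin on the lens; new direction towards the in-focus point.
    Vec3 origin = { pinhole.origin.x + u, pinhole.origin.y + v, pinhole.origin.z };
    Vec3 d = focusPoint - origin;
    float len = std::sqrt(d.x * d.x + d.y * d.y + d.z * d.z);
    return { origin, d * (1.0f / len) };
}

// Differential rendering composite for one pixel: the captured video colour is
// corrected by the difference between the scene rendered with and without the
// virtual objects, so virtual shadows and reflections carry over to the real
// image (clamping to a valid colour range is omitted here).
static Vec3 compositePixel(Vec3 video, Vec3 withVirtual, Vec3 withoutVirtual) {
    return video + (withVirtual - withoutVirtual);
}

Averaging many thinLensRay samples per pixel yields the depth of field effect; feeding the two rendered radiance estimates and the video frame into compositePixel gives the augmented output, which the paper performs in a single ray-tracing pass on the GPU.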
dc.publisher | The Eurographics Association | en_US |
dc.subject | Categories and Subject Descriptors (according to ACM CCS): I.3.7 [Computer Graphics]: Three-Dimensional Graphics and Realism - Raytracing; I.3.8 [Computer Graphics]: Applications; H.5.1 [Information Interfaces and Presentation]: Multimedia Information Systems - Artificial, augmented, and virtual realities | en_US |
dc.title | Physically-Based Depth of Field in Augmented Reality | en_US |
dc.description.seriesinformation | Eurographics 2012 - Short Papers | en_US |