dc.contributor.author | Schirmacher, Hartmut | en_US |
dc.contributor.author | Ming, Li | en_US |
dc.contributor.author | Seidel, Hans-Peter | en_US |
dc.date.accessioned | 2015-02-16T11:05:27Z | |
dc.date.available | 2015-02-16T11:05:27Z | |
dc.date.issued | 2001 | en_US |
dc.identifier.issn | 1467-8659 | en_US |
dc.identifier.uri | http://dx.doi.org/10.1111/1467-8659.00509 | en_US |
dc.description.abstract | We introduce a flexible and powerful concept for reconstructing arbitrary views from multiple source images on the fly. Our approach is based on a Lumigraph structure with per-pixel depth values and generalizes the classical two-plane-parameterized light fields and Lumigraphs. With our technique it is possible to render arbitrary views of time-varying, non-diffuse scenes at interactive frame rates, and it supports any kind of sensor that yields images with dense depth information. We demonstrate the flexibility and efficiency of our approach through various examples. | en_US |
dc.publisher | Blackwell Publishers Ltd and the Eurographics Association | en_US |
dc.title | On-the-Fly Processing of Generalized Lumigraphs | en_US |
dc.description.seriesinformation | Computer Graphics Forum | en_US |
dc.description.volume | 20 | en_US |
dc.description.number | 3 | en_US |
dc.identifier.doi | 10.1111/1467-8659.00509 | en_US |
dc.identifier.pages | 165-174 | en_US |
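The abstract above centers on rendering novel views from source images that carry dense per-pixel depth. As a rough, hedged illustration only (not the paper's actual algorithm or data structure), the Python sketch below shows the depth-based reprojection step that such per-pixel-depth rendering generally builds on: a source pixel with known depth is lifted to a 3D point and projected into a hypothetical target camera. The pinhole camera model, the function names, and all numeric values are assumptions introduced here for illustration.

```python
import numpy as np

def unproject(pixel, depth, K, R, t):
    """Lift a source-image pixel with known depth to a 3D world point.

    Assumed pinhole model: a world point X maps to the image via K @ (R @ X + t).
    `depth` is the point's z-coordinate in the source camera frame.
    """
    uv1 = np.array([pixel[0], pixel[1], 1.0])
    point_cam = depth * (np.linalg.inv(K) @ uv1)   # back-project along the viewing ray
    return R.T @ (point_cam - t)                   # camera frame -> world frame

def project(point_world, K, R, t):
    """Project a 3D world point into a target camera's image plane."""
    uvw = K @ (R @ point_world + t)
    return uvw[:2] / uvw[2]                        # perspective divide -> (u, v)

# Example: forward-warp one source pixel into a novel target view
# (illustrative cameras and values only).
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])
R_src, t_src = np.eye(3), np.zeros(3)                  # source camera at the origin
R_tgt, t_tgt = np.eye(3), np.array([-0.1, 0.0, 0.0])   # target camera shifted sideways
X = unproject((400.0, 250.0), 2.0, K, R_src, t_src)    # pixel + per-pixel depth -> 3D point
print(project(X, K, R_tgt, t_tgt))                     # its location in the new view
```

In a full renderer of this kind, a warp like this would be applied to many source pixels from several cameras and the contributions blended in the target view; the paper's own on-the-fly processing pipeline is described in the article itself (DOI 10.1111/1467-8659.00509).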