Show simple item record

dc.contributor.author: Yang, Ruigang
dc.contributor.author: Welch, Greg
dc.contributor.author: Bishop, Gary
dc.date.accessioned: 2015-02-16T07:31:52Z
dc.date.available: 2015-02-16T07:31:52Z
dc.date.issued: 2003
dc.identifier.issn: 1467-8659
dc.identifier.uri: http://dx.doi.org/10.1111/1467-8659.00661
dc.description.abstract: We present a novel use of commodity graphics hardware that effectively combines a plane-sweeping algorithm with view synthesis for real-time, online 3D scene acquisition and view synthesis. Using real-time imagery from a few calibrated cameras, our method can generate new images from nearby viewpoints, estimate a dense depth map from the current viewpoint, or create a textured triangular mesh. We can do each of these without any prior geometric information or requiring any user interaction, in real time and online. The heart of our method is to use programmable Pixel Shader technology to square intensity differences between reference image pixels, and then to choose final colors (or depths) that correspond to the minimum difference, i.e. the most consistent color. In this paper we describe the method, place it in the context of related work in computer graphics and computer vision, and present some results. ACM CSS: I.3.3 Computer Graphics-Bitmap and framebuffer operations, I.4.8 Image Processing and Computer Vision-Depth cues, Stereo
dc.publisher: Blackwell Science Ltd and the Eurographics Association
dc.title: Real-Time Consensus-Based Scene Reconstruction Using Commodity Graphics Hardware
dc.description.seriesinformation: Computer Graphics Forum
dc.description.volume: 22
dc.description.number: 2
dc.identifier.doi: 10.1111/1467-8659.00661
dc.identifier.pages: 207-216
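The abstract describes a plane-sweep consensus scheme: for each candidate depth plane, the reference images are warped into the target view, per-pixel squared intensity differences are accumulated, and the color (or depth) with the minimum difference wins. A minimal CPU-side sketch of that idea is below; it is not the paper's Pixel Shader implementation, and the `warps` callables (one per camera, standing in for the per-plane homography warps) are hypothetical placeholders supplied by the caller.

```python
import numpy as np

def plane_sweep_consensus(ref_images, warps, depths):
    """Sketch of the consensus test from the abstract (not the GPU version).

    ref_images: list of (h, w) grayscale reference images.
    warps: one callable per image; warps[i](img, d) warps image i into the
           target view assuming the scene lies on depth plane d (hypothetical
           stand-in for the per-plane homography warp).
    depths: iterable of candidate depth-plane values to sweep over.
    Returns the per-pixel winning color and depth.
    """
    h, w = ref_images[0].shape[:2]
    best_score = np.full((h, w), np.inf)   # lowest SSD seen so far
    best_color = np.zeros((h, w))          # color at that winning depth
    best_depth = np.zeros((h, w))          # the winning depth itself
    for d in depths:
        # Warp every reference image onto the candidate plane.
        warped = np.stack([wf(img, d) for wf, img in zip(warps, ref_images)])
        mean = warped.mean(axis=0)
        # Per-pixel sum of squared differences from the mean: small when
        # the views agree, i.e. the color is "consistent" at this depth.
        score = ((warped - mean) ** 2).sum(axis=0)
        better = score < best_score
        best_score = np.where(better, score, best_score)
        best_color = np.where(better, mean, best_color)
        best_depth = np.where(better, d, best_depth)
    return best_color, best_depth
```

For example, with two views related by a known horizontal shift, modeling the warp as `np.roll` recovers the shift as the winning "depth" wherever the views agree.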

