
dc.contributor.authorMartos, Antonioen_US
dc.contributor.authorRuiz, Bernardinoen_US
dc.contributor.editor-en_US
dc.date.accessioned2015-04-27T14:51:38Z
dc.date.available2015-04-27T14:51:38Z
dc.date.issued2013en_US
dc.identifier.urihttp://dx.doi.org/10.1109/DigitalHeritage.2013.6743722en_US
dc.identifier.urihttps://diglib.eg.org:443/handle/10.1109/DigitalHeritage
dc.description.abstractExisting technologies for contact-less 3D scanning and Image Based Modelling (IBM) methods are extensively used nowadays to digitize cultural heritage elements. With a convenient degree of automation, these methods can properly capture and reproduce shape and basic colour textures. However, there is usually a quite evident lack of fidelity in the resulting appearance of the virtual reproductions when compared with the original items. Even when properly photo-textured, the reproduced surfaces often resemble either plaster or plastic, regardless of the properties of the original materials. What is neither captured nor modelled is the natural dynamic response of the actual materials to changes in observation angle and/or the lighting arrangement. The methodology introduced in this paper aims to improve the three-dimensional digitization and visualization of cultural heritage elements by extending the present capabilities of IBM with additional capture and modelling of surface appearance. We show that it is possible to automatically reproduce realistic-looking virtual objects and scenes, even from photographs taken with a single uncalibrated moving camera under uncontrolled and intentionally variable lighting conditions. This is achieved not only by reconstructing the shape and projecting colour texture maps from photographs, but also by modelling and mapping the apparent optical response of the surfaces to light changes, while also determining the variable distribution of environmental illumination in the original scene. This novel approach integrates Physically Based Render (PBR) concepts in a processing loop that combines capture and visualization. Using the information contained in different photographs, where the appearance of the object surface changes with environmental light variations, we show that it is possible to enhance the information contained in the usual colour texture maps with additional layers. This enables the reproduction of finer details of surface normals and relief, as well as effective approximations of the Bi-directional Reflectance Distribution Function (BRDF). The appearance of the surfaces can then be reproduced with a dedicated render engine, providing unusual levels of detail and realism thanks to enriched multi-layer texture maps and custom shading functions. This methodology is introduced with a real case study to illustrate its practical applicability and flexibility: the virtual reproduction of the Lady of Elche was performed solely from archived photographs taken at the museum for different documentation purposes, using uncalibrated optics and an uncontrolled studio light arrangement. We also discuss capture of larger architectural elements, with uncontrolled (yet still variable) illumination in outdoor environments, and of challenging items with difficult-to-capture surfaces such as the brass sculpture of La Regenta, where proper reproduction of surface reflection and environmental lighting is a fundamental step towards a good visualization experience. These cases show the feasibility of working with field calibration and initial approximations for the camera model and light-maps, thus addressing the flexibility required for practical field documentation in museum environments or outdoors. The potential for dissemination is shown with the use of open-source software tools for enhanced visualization.
The presented capture methods are integrated with a specific adaptation of open-source GPU-based (Graphics Processing Unit) render engines to produce two flavours of 3D inspection/visualization tools with proper relighting capabilities, able to reveal very subtle details: a quasi-real-time realistic engine (Blender Cycles), which is also the basis for the capture process and is focused on realistic reproduction, and a real-time version based on customized pixel shaders for the visualization of lightweight models in web browsers and other interacen_US
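As a rough illustration of the capture idea summarized in the abstract (recovering an approximate per-texel reflectance response from photographs of the same surface point under varying light), the following Python sketch fits a simple Blinn-Phong-style diffuse/specular approximation by least squares. It is a minimal sketch under assumed inputs (known per-photo light and view directions, an estimated normal, a fixed shininess exponent); the function name fit_texel_brdf and the synthetic data are illustrative and not part of the authors' pipeline.

import numpy as np

def fit_texel_brdf(intensities, light_dirs, view_dirs, normal, shininess=32.0):
    # Least-squares fit of diffuse (kd) and specular (ks) weights for one texel,
    # assuming I ~= kd * max(n.l, 0) + ks * max(n.h, 0)**shininess (Blinn-Phong style,
    # an illustrative stand-in for the BRDF approximation described in the abstract).
    # intensities: (N,) observed values of the same surface point in N photographs
    # light_dirs, view_dirs: (N, 3) unit vectors towards the light / camera per photo
    # normal: (3,) estimated surface normal for this texel
    n = np.asarray(normal, dtype=float) / np.linalg.norm(normal)
    diffuse = np.clip(light_dirs @ n, 0.0, None)               # n.l per photograph
    half = light_dirs + view_dirs
    half /= np.linalg.norm(half, axis=1, keepdims=True)
    specular = np.clip(half @ n, 0.0, None) ** shininess       # (n.h)^s per photograph
    A = np.stack([diffuse, specular], axis=1)                  # (N, 2) design matrix
    coeffs, *_ = np.linalg.lstsq(A, intensities, rcond=None)   # solve for kd, ks
    return coeffs

# Tiny synthetic check: six photographs of one texel under different light directions.
rng = np.random.default_rng(0)
normal = np.array([0.0, 0.0, 1.0])
view = np.tile([0.0, 0.0, 1.0], (6, 1))
lights = rng.normal(size=(6, 3))
lights[:, 2] = np.abs(lights[:, 2]) + 0.1                      # keep lights above the surface
lights /= np.linalg.norm(lights, axis=1, keepdims=True)
half = lights + view
half /= np.linalg.norm(half, axis=1, keepdims=True)
obs = 0.7 * np.clip(lights @ normal, 0, None) + 0.3 * np.clip(half @ normal, 0, None) ** 32
print(fit_texel_brdf(obs, lights, view, normal))               # approximately [0.7, 0.3]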
dc.publisherThe Eurographics Associationen_US
dc.subjectCamerasen_US
dc.subjectImage color analysisen_US
dc.subjectLightingen_US
dc.subjectMaterialsen_US
dc.subjectSolid modelingen_US
dc.subjectSurface textureen_US
dc.subjectThree-dimensional displaysen_US
dc.subjectBRDF Captureen_US
dc.subjectImage Based Modellingen_US
dc.subjectLady of Elcheen_US
dc.subjectPhysically Based Renderen_US
dc.subjectReal-time Visualizationen_US
dc.titleRealistic Virtual Reproductions. Image-based modelling of geometry and appearanceen_US
dc.description.seriesinformationDigital Heritage International Congressen_US
dc.description.sectionheadersTrack 1, Full Papersen_US
dc.identifier.doi10.1109/DigitalHeritage.2013.6743722en_US
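In the same spirit, the sketch below shows the kind of per-pixel relighting that enriched multi-layer texture maps enable: diffuse, specular, and normal layers recovered during capture are evaluated under a new light and view direction. It mirrors the Blinn-Phong-style approximation used in the fitting sketch above and is not the project's actual Cycles material or pixel-shader code; all names are illustrative.

import numpy as np

def relight(diffuse_map, specular_map, normal_map, light_dir, view_dir, shininess=32.0):
    # Evaluate a simple relighting model over enriched texture maps.
    # diffuse_map, specular_map: (H, W) per-texel weights recovered during capture
    # normal_map: (H, W, 3) per-texel unit normals
    # light_dir, view_dir: (3,) vectors towards the new light / camera
    # Returns an (H, W) relit intensity image (Blinn-Phong style, illustrative only).
    l = np.asarray(light_dir, dtype=float)
    l = l / np.linalg.norm(l)
    v = np.asarray(view_dir, dtype=float)
    v = v / np.linalg.norm(v)
    h = l + v
    h = h / np.linalg.norm(h)
    n_dot_l = np.clip(normal_map @ l, 0.0, None)
    n_dot_h = np.clip(normal_map @ h, 0.0, None)
    return diffuse_map * n_dot_l + specular_map * n_dot_h ** shininess

# Example: relight a 2x2 texel patch under a new light direction.
normals = np.zeros((2, 2, 3))
normals[..., 2] = 1.0                                          # flat patch facing +z
kd = np.full((2, 2), 0.7)
ks = np.full((2, 2), 0.3)
print(relight(kd, ks, normals, light_dir=[0.3, 0.2, 0.9], view_dir=[0.0, 0.0, 1.0]))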

