Acquisition, Encoding and Rendering of Material Appearance Using Compact Neural Bidirectional Texture Functions
Date: 2021-11-23
Author: Rainer, Gilles
Abstract
This thesis addresses the problem of photo-realistic rendering of real-world materials. Currently, the most faithful approach to rendering an existing material is to scan its Bidirectional Texture Function (BTF), which relies on exhaustive acquisition of reflectance data from the material sample. This incurs heavy costs in terms of both capture time and memory requirements, so the main drawback of the approach is its lack of practicality.
The scope of this thesis is two-fold: implementation of a full BTF pipeline (data acquisition, processing and rendering) and design of a compact neural material representation.
We first present our custom BTF scanner, which uses a freely positionable camera and light source to acquire light- and view-dependent textures. During the processing phase, the textures are extracted from the captured images and rectified onto a common grid using an estimated proxy surface. At rendering time, the rectification is inverted, and the estimated height field additionally allows material silhouettes to be preserved.
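To illustrate the rectification step, the following is a minimal sketch that resamples one captured photograph onto a shared texel grid. It assumes a planar proxy surface and a known 3x4 camera projection matrix P, whereas the thesis estimates a height-field proxy; the grid resolution and physical extent are likewise illustrative values.

```python
# Minimal rectification sketch: project each texel centre of a planar proxy
# surface into the photograph and resample. A height-field proxy would replace
# the z = 0 plane below.
import numpy as np
import cv2

def rectify(image, P, grid_res=512, extent=0.1):
    # World-space texel centres on the proxy plane (z = 0), extent in metres.
    u = np.linspace(-extent, extent, grid_res)
    xs, ys = np.meshgrid(u, u)
    pts = np.stack([xs, ys, np.zeros_like(xs), np.ones_like(xs)], axis=-1)  # (H, W, 4)

    # Project each texel centre into the photograph (homogeneous pixel coords).
    proj = pts @ P.T
    px = (proj[..., 0] / proj[..., 2]).astype(np.float32)
    py = (proj[..., 1] / proj[..., 2]).astype(np.float32)

    # Resample the photograph at the projected positions -> rectified texture.
    return cv2.remap(image, px, py, interpolation=cv2.INTER_LINEAR)
```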
The main part of the thesis is the development of a neural BTF model that is both compact in memory and practical for rendering. Concretely, the material is modeled by a small fully-connected neural network, parametrized by the light and view directions as well as a vector of latent parameters that describes the appearance of each surface point. We first show that one such network can efficiently learn to reproduce the appearance of a given material.
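As an illustration of this design, the following is a minimal PyTorch sketch of such a per-texel decoder. The latent dimension, layer widths and direction encoding are assumptions for the example and do not reproduce the exact architecture used in the thesis.

```python
# Minimal sketch of a per-texel neural BTF decoder: latent code + light and
# view directions in, RGB reflectance out.
import torch
import torch.nn as nn

class NeuralBTFDecoder(nn.Module):
    def __init__(self, latent_dim=8, hidden=32):
        super().__init__()
        # Input: latent code + 3D light direction + 3D view direction.
        self.net = nn.Sequential(
            nn.Linear(latent_dim + 6, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 3),  # RGB reflectance for this texel and direction pair
        )

    def forward(self, latent, wi, wo):
        # latent: (N, latent_dim) per-texel codes; wi, wo: (N, 3) unit directions.
        return self.net(torch.cat([latent, wi, wo], dim=-1))

# Usage: evaluate one texel under a given light/view configuration.
decoder = NeuralBTFDecoder()
latent = torch.randn(1, 8)            # latent code fetched from the latent texture
wi = torch.tensor([[0.0, 0.0, 1.0]])  # light direction (local shading frame)
wo = torch.tensor([[0.0, 0.0, 1.0]])  # view direction
rgb = decoder(latent, wi, wo)
```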
The second focus of our work is an efficient method to translate BTFs into our representation. Rather than training a new network instance for each new material, the latent space and network are shared, and we use an encoder network to quickly predict latent parameters for new, unseen materials.
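The following is a minimal sketch of such an encoder, compressing a set of per-texel reflectance observations into a latent code for the shared decoder above; the pooling scheme, observation format and layer sizes are illustrative assumptions rather than the architecture of the thesis.

```python
# Minimal sketch of an encoder that maps M (light, view, RGB) observations of a
# texel to one latent code, using order-invariant mean pooling.
import torch
import torch.nn as nn

class BTFEncoder(nn.Module):
    def __init__(self, latent_dim=8, hidden=64):
        super().__init__()
        # Each observation is (light dir, view dir, RGB) -> 9 values.
        self.per_obs = nn.Sequential(
            nn.Linear(9, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        self.to_latent = nn.Linear(hidden, latent_dim)

    def forward(self, observations):
        # observations: (N, M, 9) = N texels, M samples each.
        features = self.per_obs(observations)  # (N, M, hidden)
        pooled = features.mean(dim=1)          # pool over the M observations
        return self.to_latent(pooled)          # (N, latent_dim) latent codes

# Usage: encode 4096 texels, each observed under 100 light/view pairs.
encoder = BTFEncoder()
codes = encoder(torch.randn(4096, 100, 9))     # (4096, 8) latent texture entries
```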
All contributions are geared towards making photo-realistic rendering with BTFs more common and practical in computer graphics applications such as games and virtual environments.