Photo-Guided Exploration of Volume Data Features
Abstract
In this work, we pose the question of whether, given qualitative information such as a sample target image as input, one can produce a rendered image of scientific data that is similar to that target. The algorithm resulting from our research allows one to ask whether features like those in the target image exist in a given dataset. In that way, our method is a form of image-based query or reverse engineering, as opposed to manual parameter tweaking of the full visualization pipeline. For target images, we can use real-world photographs of physical phenomena. Our method leverages deep neural networks and evolutionary optimization. Using a trained similarity function that measures the difference between renderings of a phenomenon and real-world photographs, our method optimizes rendering parameters. We demonstrate the efficacy of our method using a superstorm simulation dataset and images found online. We also discuss a parallel implementation of our method, which was run on NCSA's Blue Waters.
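To make the idea of evolutionary optimization against a learned similarity function concrete, below is a minimal Python sketch. The render() and similarity() functions are hypothetical placeholders standing in for a volume renderer and a trained similarity network; the population size, mutation scheme, and parameter encoding are assumptions for illustration and are not the paper's actual implementation.

```python
# Hypothetical sketch: evolutionary search over rendering parameters,
# scored by a learned similarity to a target photograph.
import numpy as np

def render(params):
    """Render the volume with the given rendering parameters and return an
    image array (placeholder for a real volume renderer)."""
    raise NotImplementedError

def similarity(photo, image):
    """Return a learned similarity score between the target photograph and a
    rendered image (placeholder for a trained neural network)."""
    raise NotImplementedError

def evolve(photo, dim, pop_size=32, generations=100, sigma=0.1, seed=0):
    """Simple (mu + lambda)-style evolutionary search over a parameter vector."""
    rng = np.random.default_rng(seed)
    pop = rng.uniform(0.0, 1.0, size=(pop_size, dim))  # initial population in [0, 1]
    for _ in range(generations):
        scores = np.array([similarity(photo, render(p)) for p in pop])
        parents = pop[np.argsort(scores)[-pop_size // 2:]]          # keep best half
        children = parents + rng.normal(0.0, sigma, parents.shape)  # Gaussian mutation
        pop = np.clip(np.vstack([parents, children]), 0.0, 1.0)
    scores = np.array([similarity(photo, render(p)) for p in pop])
    return pop[np.argmax(scores)]  # best-matching parameter vector found
```

In this sketch, the parameter vector could encode, for example, transfer-function control points or camera settings; the search simply keeps the half of the population whose renderings score highest against the target photo and perturbs them each generation.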
BibTeX
@inproceedings {10.2312:pgv.20171091,
booktitle = {Eurographics Symposium on Parallel Graphics and Visualization},
editor = {Alexandru Telea and Janine Bennett},
title = {{Photo-Guided Exploration of Volume Data Features}},
author = {Raji, Mohammad and Hota, Alok and Sisneros, Robert and Messmer, Peter and Huang, Jian},
year = {2017},
publisher = {The Eurographics Association},
ISSN = {1727-348X},
ISBN = {978-3-03868-034-5},
DOI = {10.2312/pgv.20171091}
}