dc.contributor.author | Elshehaly, Mai | en_US |
dc.contributor.author | Gracanin, Denis | en_US |
dc.contributor.author | Gad, Mohamed | en_US |
dc.contributor.author | Elmongui, Hicham G. | en_US |
dc.contributor.author | Matkovic, Kresimir | en_US |
dc.contributor.editor | H. Carr, K.-L. Ma, and G. Santucci | en_US |
dc.date.accessioned | 2015-05-22T12:51:30Z | |
dc.date.available | 2015-05-22T12:51:30Z | |
dc.date.issued | 2015 | en_US |
dc.identifier.uri | http://dx.doi.org/10.1111/cgf.12637 | en_US |
dc.description.abstract | Scientific data acquired through sensors that monitor natural phenomena, as well as simulation data that imitate time-identified events, have fueled the need for interactive techniques to successfully analyze and understand trends and patterns across space and time. We present a novel interactive visualization technique that fuses ground truth measurements with simulation results in real time to support the continuous tracking and analysis of spatiotemporal patterns. We start by constructing a reference model that densely represents the expected temporal behavior, and then use GPU parallelism to advect measurements on the model and track their location at any given point in time. Our results show that users can interactively fill the spatiotemporal gaps in real-world observations and generate animations that accurately describe physical phenomena. | en_US |
dc.publisher | The Eurographics Association and John Wiley & Sons Ltd. | en_US |
dc.subject | I.3.6 [Computer Graphics] | en_US |
dc.subject | Methodology and Techniques | en_US |
dc.subject | Interaction techniques | en_US |
dc.title | Interactive Fusion and Tracking For Multi-Modal Spatial Data Visualization | en_US |
dc.description.seriesinformation | Computer Graphics Forum | en_US |
dc.description.sectionheaders | Multi-modal and Multi-field | en_US |
dc.description.volume | 34 | en_US |
dc.description.number | 3 | en_US |
dc.identifier.doi | 10.1111/cgf.12637 | en_US |
dc.identifier.pages | 251-260 | en_US |
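
The abstract above describes advecting sparse measurements through a dense reference model of expected temporal behavior, using GPU parallelism to track each measurement's position over time. The CUDA sketch below is only an illustration of that general idea under stated assumptions, not the authors' implementation: the regular velocity grid, nearest-neighbour sampling, forward-Euler integration, and all names (sampleField, advectMeasurements) are hypothetical and introduced here for clarity.

```cuda
// Hypothetical sketch (not the paper's code): one GPU thread per measurement
// point, each advected through a densely sampled reference velocity field and
// tracked over time with simple forward-Euler steps.
#include <cstdio>
#include <cuda_runtime.h>

struct Vec2 { float x, y; };

// Assumed reference model: an nx-by-ny velocity grid sampled with
// nearest-neighbour lookup (the real model and sampling may differ).
__device__ Vec2 sampleField(const Vec2* field, int nx, int ny, Vec2 p) {
    int i = min(max(int(p.x), 0), nx - 1);
    int j = min(max(int(p.y), 0), ny - 1);
    return field[j * nx + i];
}

__global__ void advectMeasurements(Vec2* points, int numPoints,
                                   const Vec2* field, int nx, int ny,
                                   float dt, int steps) {
    int idx = blockIdx.x * blockDim.x + threadIdx.x;
    if (idx >= numPoints) return;
    Vec2 p = points[idx];
    for (int s = 0; s < steps; ++s) {          // integrate forward in time
        Vec2 v = sampleField(field, nx, ny, p);
        p.x += dt * v.x;
        p.y += dt * v.y;
    }
    points[idx] = p;                           // tracked location at t = steps * dt
}

int main() {
    const int nx = 4, ny = 4, numPoints = 2, steps = 10;
    const float dt = 0.1f;
    Vec2 hField[nx * ny];
    for (int k = 0; k < nx * ny; ++k) hField[k] = {1.0f, 0.5f};  // toy uniform flow
    Vec2 hPoints[numPoints] = {{0.0f, 0.0f}, {1.0f, 2.0f}};      // toy measurements

    Vec2 *dField, *dPoints;
    cudaMalloc((void**)&dField, sizeof(hField));
    cudaMalloc((void**)&dPoints, sizeof(hPoints));
    cudaMemcpy(dField, hField, sizeof(hField), cudaMemcpyHostToDevice);
    cudaMemcpy(dPoints, hPoints, sizeof(hPoints), cudaMemcpyHostToDevice);

    advectMeasurements<<<1, 64>>>(dPoints, numPoints, dField, nx, ny, dt, steps);
    cudaMemcpy(hPoints, dPoints, sizeof(hPoints), cudaMemcpyDeviceToHost);

    for (int k = 0; k < numPoints; ++k)
        printf("point %d -> (%.2f, %.2f)\n", k, hPoints[k].x, hPoints[k].y);

    cudaFree(dField);
    cudaFree(dPoints);
    return 0;
}
```

Per the abstract, the actual technique builds the reference model from simulation results and fuses the tracked positions with ground truth measurements for interactive display; the sketch only captures the per-point, data-parallel advection step.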