dc.contributor.author | Smuc, Michael | en_US |
dc.contributor.author | Schreder, Günther | en_US |
dc.contributor.author | Mayr, Eva | en_US |
dc.contributor.author | Windhager, Florian | en_US |
dc.contributor.editor | W. Aigner and P. Rosenthal and C. Scheidegger | en_US |
dc.date.accessioned | 2015-05-24T19:39:52Z | |
dc.date.available | 2015-05-24T19:39:52Z | |
dc.date.issued | 2015 | en_US |
dc.identifier.uri | http://dx.doi.org/10.2312/eurorv3.20151148 | en_US |
dc.description.abstract | Especially in the field of Visual Analytics, where many design decisions must be made, researchers strive for reproducible results. We present two different evaluation approaches aiming at more general design knowledge: the isolation of features and the abstraction of results. Both approaches have potential, but also pose problems with respect to generating reproducible results. We discuss whether reproducibility is possible, or even the right aim, in the evaluation of Visual Analytics methods. | en_US |
dc.publisher | The Eurographics Association | en_US |
dc.subject | I.3.6 [Computer Graphics] | en_US |
dc.subject | Methodology and Techniques | en_US |
dc.subject | Standards | en_US |
dc.title | Should we Dream the Impossible Dream of Reproducibility in Visual Analytics Evaluation? | en_US |
dc.description.seriesinformation | EuroVis Workshop on Reproducibility, Verification, and Validation in Visualization (EuroRV3) | en_US |
dc.description.sectionheaders | Reproducibility in Visual Analytics | en_US |
dc.identifier.doi | 10.2312/eurorv3.20151148 | en_US |
dc.identifier.pages | 31-33 | en_US |