Show simple item record

dc.contributor.author: Smuc, Michael (en_US)
dc.contributor.author: Schreder, Günther (en_US)
dc.contributor.author: Mayr, Eva (en_US)
dc.contributor.author: Windhager, Florian (en_US)
dc.contributor.editor: W. Aigner and P. Rosenthal and C. Scheidegger (en_US)
dc.date.accessioned: 2015-05-24T19:39:52Z
dc.date.available: 2015-05-24T19:39:52Z
dc.date.issued: 2015 (en_US)
dc.identifier.uri: http://dx.doi.org/10.2312/eurorv3.20151148 (en_US)
dc.description.abstract: Especially in the field of Visual Analytics, where many design decisions have to be made, researchers strive for reproducible results. We present two different evaluation approaches aiming for more general design knowledge: the isolation of features and the abstraction of results. Both approaches have potential, but also problems, with respect to generating reproducible results. We discuss whether reproducibility is possible, or even the right aim, in the evaluation of Visual Analytics methods. (en_US)
dc.publisher: The Eurographics Association (en_US)
dc.subject: I.3.6 [Computer Graphics] (en_US)
dc.subject: Methodology and Techniques (en_US)
dc.subject: Standards (en_US)
dc.title: Should we Dream the Impossible Dream of Reproducibility in Visual Analytics Evaluation? (en_US)
dc.description.seriesinformation: EuroVis Workshop on Reproducibility, Verification, and Validation in Visualization (EuroRV3) (en_US)
dc.description.sectionheaders: Reproducibility in Visual Analytics (en_US)
dc.identifier.doi: 10.2312/eurorv3.20151148 (en_US)
dc.identifier.pages: 31-33 (en_US)

