dc.contributor.author | Turton, Terece L. | en_US |
dc.contributor.author | Ware, Colin | en_US |
dc.contributor.author | Samsel, Francesca | en_US |
dc.contributor.author | Rogers, David H. | en_US |
dc.contributor.editor | Kai Lawonn and Noeska Smit and Douglas Cunningham | en_US |
dc.date.accessioned | 2017-06-12T05:15:27Z | |
dc.date.available | 2017-06-12T05:15:27Z | |
dc.date.issued | 2017 | |
dc.identifier.isbn | 978-3-03868-041-3 | |
dc.identifier.uri | http://dx.doi.org/10.2312/eurorv3.20171106 | |
dc.identifier.uri | https://diglib.eg.org:443/handle/10.2312/eurorv320171106 | |
dc.description.abstract | Despite continual research and discussion on the perceptual effects of color in scientific visualization, psychophysical testing is often limited. In-person lab studies can be expensive and time-consuming, and results can be difficult to extrapolate from meticulously controlled laboratory conditions to the real world of the visualization user. We draw on lessons learned from the use of crowdsourced participant pools in the behavioral sciences and information visualization to apply a crowdsourced approach to a classic psychophysical experiment assessing the ability of a colormap to impart metric information. We use an online presentation analogous to the color key task from Ware's 1988 paper, Color Sequences for Univariate Maps, testing colormaps similar to those in the original paper along with contemporary colormap standards and new alternatives in the scientific visualization domain. We explore the issue of potential contamination from color vision deficient (CVD) participants and establish that perceptual color research can appropriately leverage a crowdsourced participant pool without significant CVD concerns. The updated version of the Ware color key task also provides a method to assess and compare colormaps. | en_US |
dc.publisher | The Eurographics Association | en_US |
dc.subject | H.1.2 [Models and Principles]: User/Machine Systems, Human Factors | |
dc.subject | H.5.2 [Information Systems]: User Interfaces, Evaluation/methodology | |
dc.subject | H.m [User/Machine Systems]: Miscellaneous | |
dc.subject | Colormapping | |
dc.title | A Crowdsourced Approach to Colormap Assessment | en_US |
dc.description.seriesinformation | EuroVis Workshop on Reproducibility, Verification, and Validation in Visualization (EuroRV3) | |
dc.description.sectionheaders | Perceptual Experiments and Insights | |
dc.identifier.doi | 10.2312/eurorv3.20171106 | |
dc.identifier.pages | 1-5 | |