
dc.contributor.author  Zinat, Kazi Tasnim  en_US
dc.contributor.author  Yang, Jinhua  en_US
dc.contributor.author  Gandhi, Arjun  en_US
dc.contributor.author  Mitra, Nistha  en_US
dc.contributor.author  Liu, Zhicheng  en_US
dc.contributor.editor  Bujack, Roxana  en_US
dc.contributor.editor  Archambault, Daniel  en_US
dc.contributor.editor  Schreck, Tobias  en_US
dc.date.accessioned  2023-06-10T06:16:41Z
dc.date.available  2023-06-10T06:16:41Z
dc.date.issued  2023
dc.identifier.issn  1467-8659
dc.identifier.uri  https://doi.org/10.1111/cgf.14821
dc.identifier.uri  https://diglib.eg.org:443/handle/10.1111/cgf14821
dc.description.abstract  Real-world event sequences are often complex and heterogeneous, making it difficult to create meaningful visualizations using simple data aggregation and visual encoding techniques. Consequently, visualization researchers have developed numerous visual summarization techniques to generate concise overviews of sequential data. These techniques vary widely in terms of summary structures and contents, and currently there is a knowledge gap in understanding the effectiveness of these techniques. In this work, we present the design and results of an insight-based crowdsourcing experiment evaluating three existing visual summarization techniques: CoreFlow, SentenTree, and Sequence Synopsis. We compare the visual summaries generated by these techniques across three tasks, on six datasets, at six levels of granularity. We analyze the effects of these variables on summary quality as rated by participants and completion time of the experiment tasks. Our analysis shows that Sequence Synopsis produces the highest-quality visual summaries for all three tasks, but understanding Sequence Synopsis results also takes the longest time. We also find that the participants evaluate visual summary quality based on two aspects: content and interpretability. We discuss the implications of our findings on developing and evaluating new visual summarization techniques.  en_US
dc.publisher  The Eurographics Association and John Wiley & Sons Ltd.  en_US
dc.subject  CCS Concepts: Human-centered computing -> Visualization design and evaluation methods; Empirical studies in visualization
dc.subject  Human-centered computing
dc.subject  Visualization design and evaluation methods
dc.subject  Empirical studies in visualization
dc.title  A Comparative Evaluation of Visual Summarization Techniques for Event Sequences  en_US
dc.description.seriesinformation  Computer Graphics Forum
dc.description.sectionheaders  Visualization Techniques I: Sequences and High-dimensional Data
dc.description.volume  42
dc.description.number  3
dc.identifier.doi  10.1111/cgf.14821
dc.identifier.pages  173-185
dc.identifier.pages  13 pages


This item appears in the following Collection(s)

  • 42-Issue 3
    EuroVis 2023 - Conference Proceedings
