dc.contributor.author: Li, Ziwei (en_US)
dc.contributor.author: Xu, Jiayi (en_US)
dc.contributor.author: Chao, Wei-Lun (en_US)
dc.contributor.author: Shen, Han-Wei (en_US)
dc.contributor.editor: Bujack, Roxana (en_US)
dc.contributor.editor: Archambault, Daniel (en_US)
dc.contributor.editor: Schreck, Tobias (en_US)
dc.date.accessioned: 2023-06-10T06:17:35Z
dc.date.available: 2023-06-10T06:17:35Z
dc.date.issued: 2023
dc.identifier.issn: 1467-8659
dc.identifier.uri: https://doi.org/10.1111/cgf.14842
dc.identifier.uri: https://diglib.eg.org:443/handle/10.1111/cgf14842
dc.description.abstract: Task-incremental learning (Task-IL) aims to enable an intelligent agent to continuously accumulate knowledge from new learning tasks without catastrophically forgetting what it has learned in the past. It has drawn increasing attention in recent years, and many algorithms have been proposed to mitigate forgetting in neural networks. However, none of the existing strategies completely eliminates the issue. Moreover, explaining and fully understanding what knowledge is forgotten, and how, during the incremental learning process remains under-explored. In this paper, we propose KnowledgeDrift, a visual analytics framework for interpreting network forgetting with three objectives: (1) to identify when the network fails to memorize past knowledge, (2) to visualize what information has been forgotten, and (3) to diagnose how knowledge attained in the new model interferes with what was learned in the past. Our analytical framework first identifies the occurrence of forgetting by tracking task performance over the incremental learning process and then provides in-depth inspection of drifted information at multiple levels of data granularity. KnowledgeDrift allows analysts and model developers to deepen their understanding of network forgetting and to compare the performance of different incremental learning algorithms. Three case studies are conducted in the paper to provide further insights and guidance for users to effectively diagnose catastrophic forgetting over time. (en_US)
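
The abstract's first step, "tracking task performance over the incremental learning process", is commonly realized by recording an accuracy matrix and a per-task forgetting score. The sketch below is a minimal, hypothetical illustration of that idea, not code from the paper: train_on_task, evaluate, model, and tasks are all assumed, user-supplied names.

# Minimal sketch (assumption: train_on_task and evaluate are user-supplied callables;
# this is NOT the KnowledgeDrift implementation, only the standard way to track
# per-task accuracy and quantify forgetting in Task-IL).
def track_forgetting(model, tasks, train_on_task, evaluate):
    """Train sequentially on `tasks` and fill acc[t][i] = accuracy on task i
    measured right after training on task t."""
    num_tasks = len(tasks)
    acc = [[0.0] * num_tasks for _ in range(num_tasks)]

    for t, task in enumerate(tasks):
        train_on_task(model, task)                 # one continual-learning step
        for i in range(t + 1):                     # re-evaluate every task seen so far
            acc[t][i] = evaluate(model, tasks[i])

    # Forgetting of task i = best accuracy it reached before the final task
    # minus its accuracy after the final task; large values flag forgetting.
    forgetting = [
        max(acc[t][i] for t in range(i, num_tasks - 1)) - acc[num_tasks - 1][i]
        for i in range(num_tasks - 1)
    ]
    return acc, forgetting

A drop in a column of acc over time marks when a task's knowledge drifts, which is the signal a visual analytics tool can then inspect at finer levels of data granularity.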
dc.publisher: The Eurographics Association and John Wiley & Sons Ltd. (en_US)
dc.rights: Attribution 4.0 International License
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/
dc.subject: CCS Concepts: Computing methodologies -> Visual analytics; Theory of computation -> Continual learning
dc.subject: Computing methodologies
dc.subject: Visual analytics
dc.subject: Theory of computation
dc.subject: Continual learning
dc.title: Visual Analytics on Network Forgetting for Task-Incremental Learning (en_US)
dc.description.seriesinformation: Computer Graphics Forum
dc.description.sectionheaders: Visualization and Machine Learning
dc.description.volume: 42
dc.description.number: 3
dc.identifier.doi: 10.1111/cgf.14842
dc.identifier.pages: 437-448
dc.identifier.pages: 12 pages


This item appears in the following Collection(s)

  • 42-Issue 3
    EuroVis 2023 - Conference Proceedings
