
dc.contributor.author: Cantareira, Gabriel Dias
dc.contributor.author: Paulovich, Fernando V.
dc.contributor.editor: Turkay, Cagatay and Vrotsou, Katerina
dc.date.accessioned: 2020-05-24T13:31:31Z
dc.date.available: 2020-05-24T13:31:31Z
dc.date.issued: 2020
dc.identifier.isbn: 978-3-03868-116-8
dc.identifier.issn: 2664-4487
dc.identifier.uri: https://doi.org/10.2312/eurova.20201089
dc.identifier.uri: https://diglib.eg.org:443/handle/10.2312/eurova20201089
dc.description.abstract: Dimensionality reduction techniques are popular tools for visualizing neural network models because they can display hidden-layer activations and aid the understanding of how abstract representations are formed. However, many techniques yield poor results when used to compare multiple projections resulting from different feature sets, such as the outputs of different hidden layers or the outputs of different models processing the same data. This problem occurs because there is no alignment factor ensuring that visual differences reflect actual differences between the feature sets rather than artifacts generated by the technique. In this paper, we propose a generic model for aligning multiple projections when visualizing different feature sets; it can be applied to any gradient descent-based dimensionality reduction technique. We employ this model to generate a variant of the UMAP method and show the results of its application.
dc.publisher: The Eurographics Association
dc.rights: Attribution 4.0 International License
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/
dc.title: A Generic Model for Projection Alignment Applied to Neural Network Visualization
dc.description.seriesinformation: EuroVis Workshop on Visual Analytics (EuroVA)
dc.description.sectionheaders: Intersecting Humans and AI
dc.identifier.doi: 10.2312/eurova.20201089
dc.identifier.pages: 67-71
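The abstract's core idea, coupling the loss functions of several gradient descent-based projections so that their layouts stay comparable, can be illustrated with a minimal sketch. This is not the authors' method (which builds on UMAP); it is a toy metric-MDS optimized by gradient descent over two feature sets of the same samples, with a hypothetical alignment penalty `lam * ||Y1 - Y2||^2` added to each embedding's gradient:

```python
import numpy as np

def pairwise_dists(X):
    # Full Euclidean distance matrix between rows of X.
    diff = X[:, None, :] - X[None, :, :]
    return np.sqrt((diff ** 2).sum(-1))

def aligned_projections(X1, X2, lam=1.0, lr=0.005, iters=400, seed=0):
    """Toy sketch: two metric-MDS embeddings of the same samples
    (one per feature set), jointly optimized by gradient descent
    with an alignment term lam * ||Y1 - Y2||^2 coupling them."""
    rng = np.random.default_rng(seed)
    n = X1.shape[0]
    D1, D2 = pairwise_dists(X1), pairwise_dists(X2)
    Y1 = rng.normal(size=(n, 2))
    Y2 = rng.normal(size=(n, 2))
    for _ in range(iters):
        for Y, D, Yother in ((Y1, D1, Y2), (Y2, D2, Y1)):
            d = pairwise_dists(Y) + 1e-9
            # Gradient of the stress  sum_ij (d_ij - D_ij)^2  w.r.t. Y,
            # written in matrix form with weights W_ij = (d_ij - D_ij)/d_ij.
            W = (d - D) / d
            g = 4.0 * (W.sum(axis=1)[:, None] * Y - W @ Y)
            # Alignment term: pulls this embedding toward the other one,
            # so visual differences reflect the feature sets, not the init.
            g += 2.0 * lam * (Y - Yother)
            Y -= lr * g  # in-place update keeps Y1/Y2 aliased correctly
    return Y1, Y2
```

With `lam=0` the two embeddings drift apart (each settles into its own arbitrary rotation); increasing `lam` trades a little stress for layouts that can be compared side by side, which is the role the alignment factor plays in the paper.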

