dc.contributor.author    Mishra, Shailesh    en_US
dc.contributor.author    Granskog, Jonathan    en_US
dc.contributor.editor    Babaei, Vahid    en_US
dc.contributor.editor    Skouras, Melina    en_US
dc.date.accessioned    2023-05-03T06:02:53Z
dc.date.available    2023-05-03T06:02:53Z
dc.date.issued    2023
dc.identifier.isbn    978-3-03868-209-7
dc.identifier.issn    1017-4656
dc.identifier.uri    https://doi.org/10.2312/egs.20231006
dc.identifier.uri    https://diglib.eg.org:443/handle/10.2312/egs20231006
dc.description.abstract    We present a method for transferring the style from a set of images to the texture of a 3D object. The texture of an asset is optimized with a differentiable renderer and losses computed with pretrained deep neural networks. More specifically, we use a nearest-neighbor feature matching (NNFM) loss with CLIP-ResNet50, which we extend to support multiple style images. We improve color accuracy and artistic control with an extra loss on user-provided or automatically extracted color palettes. Finally, we show that a CLIP-based NNFM loss produces a different appearance from a VGG-based one by focusing more on textural details than on geometric shapes. However, we note that user preference remains subjective.    en_US
dc.publisher    The Eurographics Association    en_US
dc.rights    Attribution 4.0 International License
dc.rights.uri    https://creativecommons.org/licenses/by/4.0/
dc.subject    CCS Concepts: Computing methodologies → Appearance and texture representations; Rasterization; Supervised learning by regression
dc.subject    Computing methodologies → Appearance and texture representations
dc.subject    Rasterization
dc.subject    Supervised learning by regression
dc.title    CLIP-based Neural Neighbor Style Transfer for 3D Assets    en_US
dc.description.seriesinformation    Eurographics 2023 - Short Papers
dc.description.sectionheaders    Stylization and Point Clouds
dc.identifier.doi    10.2312/egs.20231006
dc.identifier.pages    25-28
dc.identifier.pages    4 pages
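
The abstract describes two optimization losses: a nearest-neighbor feature matching (NNFM) loss on CLIP-ResNet50 features, extended to multiple style images, and an additional loss on color palettes. The sketch below is an illustrative reconstruction only, not the authors' implementation; the feature extraction, the differentiable renderer, and the exact loss formulations are assumptions made for the example.

# Illustrative sketch (PyTorch), not the paper's released code. It shows how a
# nearest-neighbor feature matching (NNFM) loss pooled over several style images
# and a simple color-palette loss could be written; all names and formulations
# here are assumptions based only on the abstract above.
import torch
import torch.nn.functional as F

def nnfm_loss(render_feats, style_feats_list):
    # render_feats:     (N, C) feature vectors extracted from the rendered view
    # style_feats_list: list of (M_i, C) feature tensors, one per style image
    style_feats = torch.cat(style_feats_list, dim=0)     # pool all style images: (M, C)
    r = F.normalize(render_feats, dim=-1)                # unit-length rows for cosine similarity
    s = F.normalize(style_feats, dim=-1)
    sim = r @ s.t()                                      # (N, M) cosine similarities
    nn_sim = sim.max(dim=1).values                       # nearest style feature per rendered feature
    return (1.0 - nn_sim).mean()                         # mean cosine distance to nearest neighbors

def palette_loss(rendered_rgb, palette):
    # rendered_rgb: (P, 3) pixel colors of the rendered view, in [0, 1]
    # palette:      (K, 3) palette colors, user-provided or automatically extracted
    d2 = torch.cdist(rendered_rgb, palette) ** 2         # (P, K) squared distances to palette entries
    return d2.min(dim=1).values.mean()                   # pull each pixel toward its nearest palette color

In a pipeline of this kind, the texture would typically be updated by rendering the asset with a differentiable renderer, extracting features from the renders, and backpropagating a weighted combination of losses like these into the texture; the weights and feature layers are not specified in this record.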

