Show simple item record

dc.contributor.author: Texler, Ondřej (en_US)
dc.contributor.author: Fišer, Jakub (en_US)
dc.contributor.author: Lukáč, Mike (en_US)
dc.contributor.author: Lu, Jingwan (en_US)
dc.contributor.author: Shechtman, Eli (en_US)
dc.contributor.author: Sýkora, Daniel (en_US)
dc.contributor.editor: Kaplan, Craig S. and Forbes, Angus and DiVerdi, Stephen (en_US)
dc.date.accessioned: 2019-05-20T09:49:53Z
dc.date.available: 2019-05-20T09:49:53Z
dc.date.issued: 2019
dc.identifier.isbn: 978-3-03868-078-9
dc.identifier.uri: https://doi.org/10.2312/exp.20191075
dc.identifier.uri: https://diglib.eg.org:443/handle/10.2312/exp20191075
dc.description.abstract: We present a new approach to example-based style transfer that combines neural methods with patch-based synthesis to achieve compelling stylization quality even for high-resolution imagery. We take advantage of neural techniques to provide adequate stylization at the global level and use their output as a prior for subsequent patch-based synthesis at the detail level. Thanks to this combination, our method better preserves the high frequencies of the original artistic media and thereby dramatically increases the fidelity of the resulting stylized imagery. We also show how to stylize extremely large images (e.g., 340 Mpix) without running the synthesis at the pixel level, while still retaining the original high-frequency details. (en_US)
dc.publisher: The Eurographics Association (en_US)
dc.subject: Computing methodologies
dc.subject: Non-photorealistic rendering
dc.subject: Image processing
dc.title: Enhancing Neural Style Transfer using Patch-Based Synthesis (en_US)
dc.description.seriesinformation: ACM/EG Expressive Symposium
dc.description.sectionheaders: Learned Styles
dc.identifier.doi: 10.2312/exp.20191075
dc.identifier.pages: 43-50
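
The abstract above describes a two-level pipeline: a neural pass supplies the global stylization, and its result is then used as a prior that guides patch-based synthesis to restore the high-frequency detail of the style exemplar. The sketch below is a minimal, illustrative rendering of that idea, not the paper's actual algorithm; the input names (neural_lo, style_lo, style_hi), the fixed patch size, and the brute-force nearest-neighbour matching are simplifying assumptions made here.

# Minimal illustrative sketch (not the authors' implementation): use a
# low-resolution neural stylization as a guide and copy full-resolution
# style patches into the output to keep the exemplar's high frequencies.
# Assumed inputs (all float arrays in [0, 1]):
#   style_hi  -- high-resolution style exemplar, shape (H*scale, W*scale, 3)
#   style_lo  -- the same exemplar downscaled to the neural resolution (H, W, 3)
#   neural_lo -- neural style-transfer result for the content image,
#                same scale as style_lo (the "prior" from the global pass)
import numpy as np

def extract_patches(img, size, stride):
    """Return (patches, top-left coordinates) for a dense patch grid."""
    h, w, _ = img.shape
    coords = [(y, x)
              for y in range(0, h - size + 1, stride)
              for x in range(0, w - size + 1, stride)]
    patches = np.stack([img[y:y + size, x:x + size] for y, x in coords])
    return patches, coords

def guided_patch_synthesis(neural_lo, style_lo, style_hi, scale, size=8):
    """For each guide patch, find the closest low-res style patch, then
    paste the corresponding high-resolution style patch into the output."""
    src_patches, src_coords = extract_patches(style_lo, size, size // 2)
    src_flat = src_patches.reshape(len(src_patches), -1)

    out_h, out_w = neural_lo.shape[0] * scale, neural_lo.shape[1] * scale
    out = np.zeros((out_h, out_w, 3))
    weight = np.zeros((out_h, out_w, 1))

    tgt_patches, tgt_coords = extract_patches(neural_lo, size, size // 2)
    for patch, (y, x) in zip(tgt_patches, tgt_coords):
        # Brute-force nearest neighbour in the low-res (guide) domain.
        d = ((src_flat - patch.reshape(-1)) ** 2).sum(axis=1)
        sy, sx = src_coords[int(np.argmin(d))]
        # Copy the matching patch at full resolution to keep high frequencies.
        hi = style_hi[sy * scale:(sy + size) * scale,
                      sx * scale:(sx + size) * scale]
        out[y * scale:(y + size) * scale, x * scale:(x + size) * scale] += hi
        weight[y * scale:(y + size) * scale, x * scale:(x + size) * scale] += 1.0
    # Average overlapping contributions; uncovered border pixels stay zero.
    return out / np.maximum(weight, 1.0)

A practical implementation would replace the brute-force search with an approximate nearest-neighbour scheme such as PatchMatch and blend overlapping patches more carefully, but the structure is the same: match in the low-resolution guide domain, then copy pixels from the high-resolution exemplar.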

