dc.contributor.author | Zhu, Haichao | en_US |
dc.contributor.editor | Lee, Sung-Hee and Zollmann, Stefanie and Okabe, Makoto and Wünsche, Burkhard | en_US |
dc.date.accessioned | 2021-10-14T10:05:49Z | |
dc.date.available | 2021-10-14T10:05:49Z | |
dc.date.issued | 2021 | |
dc.identifier.isbn | 978-3-03868-162-5 | |
dc.identifier.uri | https://doi.org/10.2312/pg.20211399 | |
dc.identifier.uri | https://diglib.eg.org:443/handle/10.2312/pg20211399 | |
dc.description.abstract | Artistic style transfer synthesizes a stylized image with content from a target image and style from an art image. The latest neural style transfer methods leverage texture distributions as style information and then apply the style to content images. These methods are promising; however, they inevitably introduce semantic content loss into the synthesized results because they disregard the gradient information of the input images. To tackle this problem, we propose a novel gradient-aware technique, called GANST. First, GANST decomposes input images into intermediate steerable representations that capture gradient information at multiple scales, based on a Steerable Pyramid Neural Network (SPNN). With the extracted information, GANST preserves semantic content by integrating a novel loss representation of local gradients into the AdaIN architecture, which we call the Steerable Style Transfer Network (SSTN). Experimental results on various images demonstrate that our proposed GANST outperforms state-of-the-art methods, producing results that reflect concrete style while preserving detailed content. | en_US |
dc.publisher | The Eurographics Association | en_US |
dc.subject | Computing methodologies | |
dc.subject | Neural networks | |
dc.title | GANST: Gradient-aware Arbitrary Neural Style Transfer | en_US |
dc.description.seriesinformation | Pacific Graphics Short Papers, Posters, and Work-in-Progress Papers | |
dc.description.sectionheaders | Image Processing and Synthesis | |
dc.identifier.doi | 10.2312/pg.20211399 | |
dc.identifier.pages | 93-98 | |