Show simple item record

dc.contributor.author	Shekhar, Sumit	en_US
dc.contributor.author	Reimann, Max	en_US
dc.contributor.author	Hilscher, Moritz	en_US
dc.contributor.author	Semmo, Amir	en_US
dc.contributor.author	Döllner, Jürgen	en_US
dc.contributor.author	Trapp, Matthias	en_US
dc.contributor.editor	Ritschel, Tobias	en_US
dc.contributor.editor	Weidlich, Andrea	en_US
dc.date.accessioned	2023-06-27T07:04:00Z
dc.date.available	2023-06-27T07:04:00Z
dc.date.issued	2023
dc.identifier.issn	1467-8659
dc.identifier.uri	https://doi.org/10.1111/cgf.14891
dc.identifier.uri	https://diglib.eg.org:443/handle/10.1111/cgf14891
dc.description.abstract	Image stylization has seen significant advancement and widespread interest over the years, leading to the development of a multitude of techniques. Extending these stylization techniques, such as Neural Style Transfer (NST), to videos is often achieved by applying them on a per-frame basis. However, per-frame stylization usually lacks temporal consistency, expressed by undesirable flickering artifacts. Most of the existing approaches for enforcing temporal consistency suffer from one or more of the following drawbacks: they (1) are only suitable for a limited range of techniques, (2) do not support online processing, as they require the complete video as input, (3) cannot provide consistency for the task of stylization, or (4) do not provide interactive consistency control. Domain-agnostic techniques for temporal consistency aim to eradicate flickering completely but typically disregard aesthetic aspects. For stylization tasks, however, consistency control is an essential requirement, as a certain amount of flickering adds to the artistic look and feel. Moreover, making this control interactive is paramount from a usability perspective. To achieve the above requirements, we propose an approach that stylizes video streams in real-time at full HD resolutions while providing interactive consistency control. We develop a lite optical-flow network that operates at 80 frames per second (FPS) on desktop systems with sufficient accuracy. Further, we employ an adaptive combination of local and global consistency features and enable interactive selection between them. Objective and subjective evaluations demonstrate that our method is superior to state-of-the-art video consistency approaches. maxreimann.github.io/stream-consistency	en_US
dc.publisher	The Eurographics Association and John Wiley & Sons Ltd.	en_US
dc.subject	CCS Concepts: Computing methodologies -> Image-based rendering; Non-photorealistic rendering; Image processing
dc.subject	Computing methodologies
dc.subject	Image-based rendering
dc.subject	Non-photorealistic rendering
dc.subject	Image processing
dc.title	Interactive Control over Temporal Consistency while Stylizing Video Streams	en_US
dc.description.seriesinformation	Computer Graphics Forum
dc.description.sectionheaders	Video and Editing
dc.description.volume	42
dc.description.number	4
dc.identifier.doi	10.1111/cgf.14891
dc.identifier.pages	14 pages



This item appears in the following Collection(s)

  • 42-Issue 4
    Rendering 2023 - Symposium Proceedings
