Show simple item record

dc.contributor.author    Tojo, Kenji    en_US
dc.contributor.author    Chen, Yifei    en_US
dc.contributor.author    Umetani, Nobuyuki    en_US
dc.contributor.editor    Pelechano, Nuria    en_US
dc.contributor.editor    Vanderhaeghe, David    en_US
dc.date.accessioned    2022-04-22T08:16:15Z
dc.date.available    2022-04-22T08:16:15Z
dc.date.issued    2022
dc.identifier.isbn    978-3-03868-169-4
dc.identifier.issn    1017-4656
dc.identifier.uri    https://doi.org/10.2312/egs.20221033
dc.identifier.uri    https://diglib.eg.org:443/handle/10.2312/egs20221033
dc.description.abstract    We present a neural-network-based compression method to alleviate the storage cost of motion capture data. Human motions, such as locomotion, often consist of periodic movements. We leverage this periodicity by applying Fourier features to a multilayer perceptron network. Our novel algorithm finds a set of Fourier feature frequencies based on the discrete cosine transform (DCT) of the motion. During training, we incrementally add the dominant frequency of the DCT to the current set of Fourier feature frequencies until a given quality threshold is satisfied. We conducted an experiment on the CMU motion capture dataset, and the results suggest that our method achieves a high overall compression ratio while maintaining quality.    en_US
dc.publisher    The Eurographics Association    en_US
dc.rights    Attribution 4.0 International License
dc.rights.uri    https://creativecommons.org/licenses/by/4.0/
dc.subject    CCS Concepts: Computing methodologies --> Animation; Neural networks
dc.subject    Computing methodologies
dc.subject    Animation
dc.subject    Neural networks
dc.title    Neural Motion Compression with Frequency-adaptive Fourier Feature Network    en_US
dc.description.seriesinformation    Eurographics 2022 - Short Papers
dc.description.sectionheaders    Learning
dc.identifier.doi    10.2312/egs.20221033
dc.identifier.pages    61-64
dc.identifier.pages    4 pages
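
The abstract describes a greedy, frequency-adaptive procedure: take the DCT of a motion signal, add its most dominant frequency to the Fourier feature set, refit the network, and repeat until a quality threshold is met. The Python sketch below illustrates that loop under stated assumptions; it is not the authors' code. A linear least-squares fit over the Fourier features stands in for the paper's multilayer perceptron, and the threshold, bin-to-frequency mapping, and helper names are hypothetical.

# Hypothetical sketch (not the authors' implementation): greedy DCT-guided
# selection of Fourier feature frequencies, with linear least squares
# standing in for the paper's MLP.
import numpy as np
from scipy.fft import dct

def fourier_features(t, freqs):
    """Map time samples t in [0, 1] to [sin, cos] features at the given frequencies."""
    phases = 2.0 * np.pi * np.outer(t, freqs)              # shape (N, K)
    return np.concatenate([np.sin(phases), np.cos(phases)], axis=1)

def fit_and_error(t, signal, freqs):
    """Least-squares reconstruction error over the current feature set."""
    phi = np.column_stack([fourier_features(t, freqs), np.ones_like(t)])
    coeffs, *_ = np.linalg.lstsq(phi, signal, rcond=None)
    recon = phi @ coeffs
    return np.sqrt(np.mean((signal - recon) ** 2))

def adaptive_frequencies(signal, tol=1e-3, max_freqs=32):
    """Add DCT-dominant frequencies one by one until the error drops below tol."""
    n = len(signal)
    t = np.linspace(0.0, 1.0, n)
    spectrum = np.abs(dct(signal, norm="ortho"))           # DCT-II magnitudes
    order = np.argsort(spectrum)[::-1]                     # most dominant bins first
    freqs = []
    for k in order[:max_freqs]:
        freqs.append(k / 2.0)   # DCT-II bin k oscillates k/2 cycles over the window
        if fit_and_error(t, signal, np.array(freqs)) < tol:
            break
    return np.array(freqs)

# Usage on a synthetic periodic "joint angle" channel:
t = np.linspace(0.0, 1.0, 240)
signal = np.sin(2 * np.pi * 3 * t) + 0.3 * np.cos(2 * np.pi * 7 * t)
print("selected frequencies:", adaptive_frequencies(signal))

In this toy setting, the two underlying frequencies (3 and 7 cycles over the window) dominate the DCT spectrum, so the loop typically stops after a handful of additions; the per-frame motion is then represented by a small frequency set plus fitted weights, which is the source of the compression.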

