Show simple item record

dc.contributor.author: Kyriakou, Theodoros
dc.contributor.author: Alvarez de la Campa Crespo, Merce
dc.contributor.author: Panayiotou, Andreas
dc.contributor.author: Chrysanthou, Yiorgos
dc.contributor.author: Charalambous, Panayiotis
dc.contributor.author: Aristidou, Andreas
dc.contributor.editor: Aristidou, Andreas
dc.contributor.editor: Macdonnell, Rachel
dc.date.accessioned: 2024-04-16T15:45:22Z
dc.date.available: 2024-04-16T15:45:22Z
dc.date.issued: 2024
dc.identifier.issn: 1467-8659
dc.identifier.uri: https://doi.org/10.1111/cgf.15065
dc.identifier.uri: https://diglib.eg.org:443/handle/10.1111/cgf15065
dc.description.abstract: Driven by recent advancements in Extended Reality (XR), the hype around the Metaverse, and real-time computer graphics, the transformation of the performing arts, particularly in digitizing and visualizing musical experiences, is an ever-evolving landscape. This transformation offers significant potential in promoting inclusivity, fostering creativity, and enabling live performances in diverse settings. However, despite its immense potential, the field of Virtual Instrument Performances (VIP) has remained relatively unexplored due to numerous challenges. These challenges arise from the complex and multi-modal nature of musical instrument performances; the need for high-precision motion capture under occlusions, including the intricate interactions between a musician's body and fingers and the instrument; the precise synchronization and seamless integration of various sensory modalities; accommodating variations in musicians' playing styles and facial expressions; and addressing instrument-specific nuances. This comprehensive survey delves into the intersection of technology, innovation, and artistic expression in the domain of virtual instrument performances. It explores musical performance multi-modal databases and investigates a wide range of data acquisition methods, encompassing diverse motion capture techniques, facial expression recording, and various approaches for capturing audio and MIDI (Musical Instrument Digital Interface) data. The survey also explores Music Information Retrieval (MIR) tasks, with a particular emphasis on the field of Musical Performance Analysis (MPA), and offers an overview of works in the realm of Musical Instrument Performance Synthesis (MIPS), encompassing recent advancements in generative models. The ultimate aim of this survey is to unveil the technological limitations, initiate a dialogue about the current challenges, and propose promising avenues for future research at the intersection of technology and the arts.
dc.publisher: The Eurographics Association and John Wiley & Sons Ltd.
dc.rights: Attribution 4.0 International License
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/
dc.subject: CCS Concepts: Computing methodologies → Animation; Motion capture; Motion processing; Machine learning
dc.subject: Computing methodologies → Animation
dc.subject: Motion capture
dc.subject: Motion processing
dc.subject: Machine learning
dc.title: Virtual Instrument Performances (VIP): A Comprehensive Review
dc.description.seriesinformation: Computer Graphics Forum
dc.description.sectionheaders: State of the Art Reports
dc.description.volume: 43
dc.description.number: 2
dc.identifier.doi: 10.1111/cgf.15065
dc.identifier.pages: 29 pages
dc.description.documenttype: star

