dc.contributor.author: Ko, Beom-Seok [en_US]
dc.contributor.author: Kang, Ho-San [en_US]
dc.contributor.author: Lee, Kyuhong [en_US]
dc.contributor.author: Braunschweiler, Manuel [en_US]
dc.contributor.author: Zünd, Fabio [en_US]
dc.contributor.author: Sumner, Robert W. [en_US]
dc.contributor.author: Choi, Soo-Mi [en_US]
dc.contributor.editor: Chaine, Raphaëlle [en_US]
dc.contributor.editor: Deng, Zhigang [en_US]
dc.contributor.editor: Kim, Min H. [en_US]
dc.date.accessioned: 2023-10-09T07:42:56Z
dc.date.available: 2023-10-09T07:42:56Z
dc.date.issued: 2023
dc.identifier.isbn: 978-3-03868-234-9
dc.identifier.uri: https://doi.org/10.2312/pg.20231286
dc.identifier.uri: https://diglib.eg.org:443/handle/10.2312/pg20231286
dc.description.abstract: This paper presents a novel interaction approach based on a user's emotions within augmented reality (AR) and virtual reality (VR) environments to achieve immersive interaction with virtual intelligent characters. To identify the user's emotions through voice, the Google Speech-to-Text API is used to transcribe speech, and the RoBERTa language model is then used to classify emotions. In the AR environment, the intelligent character can change the styles and properties of objects based on the user's recognized emotions during a dialog. In the VR environment, on the other hand, the movement of the user's eyes and lower face is tracked by the VIVE Pro Eye and Facial Tracker, and EmotionNet is used for emotion recognition; the virtual environment then changes based on the user's recognized emotions. Our findings offer an interesting approach to integrating emotionally intelligent characters in AR/VR using generative AI and facial expression recognition. [en_US]
dc.publisher: The Eurographics Association [en_US]
dc.rights: Attribution 4.0 International License
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/
dc.subject: CCS Concepts: Human-centered computing -> Human computer interaction (HCI); Hardware -> VIVE Pro Eye; Facial Tracker
dc.subject: Human centered computing
dc.subject: Human computer interaction (HCI)
dc.subject: Hardware
dc.subject: VIVE Pro Eye
dc.subject: Facial Tracker
dc.title: Emotion-based Interaction Technique Using User's Voice and Facial Expressions in Virtual and Augmented Reality [en_US]
dc.description.seriesinformation: Pacific Graphics Short Papers and Posters
dc.description.sectionheaders: Posters
dc.identifier.doi: 10.2312/pg.20231286
dc.identifier.pages: 121-122
dc.identifier.pages: 2 pages
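The interaction loop summarized in the abstract (transcribe speech, classify the emotion of the transcript, react in the AR/VR scene) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the emotion label set, the `SceneChange` mapping, and the keyword-based `classify_emotion` stub (standing in for the RoBERTa classifier and the Google Speech-to-Text step) are all assumptions.

```python
# Hypothetical sketch of the emotion-driven interaction loop described in the
# abstract. classify_emotion() is a keyword-heuristic stub standing in for a
# fine-tuned RoBERTa text classifier applied to a speech transcript; the label
# set and scene reactions are illustrative only.

from dataclasses import dataclass


@dataclass
class SceneChange:
    lighting: str
    object_style: str


# Illustrative mapping from a recognized emotion to a scene update.
EMOTION_TO_SCENE = {
    "joy": SceneChange("bright", "vivid"),
    "sadness": SceneChange("dim", "muted"),
    "anger": SceneChange("red-tinted", "sharp"),
    "neutral": SceneChange("default", "default"),
}


def classify_emotion(transcript: str) -> str:
    """Stub for the text-emotion classifier (a RoBERTa model in the paper)."""
    text = transcript.lower()
    if any(word in text for word in ("happy", "great", "love")):
        return "joy"
    if any(word in text for word in ("sad", "miss", "lonely")):
        return "sadness"
    if any(word in text for word in ("angry", "hate", "furious")):
        return "anger"
    return "neutral"


def react_to_utterance(transcript: str) -> SceneChange:
    """Map a transcribed utterance to a scene change via its emotion label."""
    return EMOTION_TO_SCENE[classify_emotion(transcript)]


print(react_to_utterance("I feel so happy today!"))
```

In a real system, `classify_emotion` would call the trained model and `SceneChange` would drive the rendering engine; the point of the sketch is only the pipeline shape: transcript in, emotion label out, scene update applied.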

