dc.contributor.author | Kim, Jaedong | en_US |
dc.contributor.author | Seo, Hyunggoog | en_US |
dc.contributor.author | Cha, Seunghoon | en_US |
dc.contributor.author | Noh, Junyong | en_US |
dc.contributor.editor | Chen, Min and Benes, Bedrich | en_US |
dc.date.accessioned | 2019-03-17T09:57:00Z | |
dc.date.available | 2019-03-17T09:57:00Z | |
dc.date.issued | 2019 | |
dc.identifier.issn | 1467-8659 | |
dc.identifier.uri | https://doi.org/10.1111/cgf.13541 | |
dc.identifier.uri | https://diglib.eg.org:443/handle/10.1111/cgf13541 | |
dc.description.abstract | When a person stands between a display and an operating projector, a shadow is cast on the display. This shadow can hide important visual information and therefore adversely affect the viewing experience. There have been various attempts to remove the human shadow cast on a projection display by using multiple projectors. While previous approaches successfully remove the shadow region when the person moves moderately or stands still in front of the display, an afterimage effect remains because the limb motion of the person is not taken into account. We propose a new real-time approach to removing the shadow cast by a person who dynamically interacts with the display, making limb motions, in a front projection system. The proposed method uses a human skeleton obtained from a depth camera to track the posture of the person as it changes over time. A model consisting of spheres and conical frustums is constructed from the skeleton information to represent the volume of the tracked person. Our method precisely estimates the shadow region by projecting this volumetric model onto the display. In addition, intensity masks built from a distance field help suppress the afterimage of the shadow that appears when the person moves abruptly, and they blend the overlapping images projected by the different projectors into one smoothly combined display. The experimental results verify that our approach effectively removes the shadow of a person in a front projection environment and is fast enough to achieve real-time performance. | en_US |
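As a rough illustration of the distance-field-based intensity mask the abstract mentions, the sketch below shows how one might fade a projector's contribution to zero inside an estimated shadow region and ramp it back up over a margin around it. It is not the authors' implementation; the `intensity_mask` helper, the binary shadow estimate, and the linear falloff margin are assumptions made for this example.

```python
# A minimal sketch (not the paper's implementation) of a per-projector
# intensity mask derived from a distance field around an estimated shadow region.
import numpy as np
from scipy.ndimage import distance_transform_edt

def intensity_mask(shadow: np.ndarray, margin_px: float = 30.0) -> np.ndarray:
    """shadow: boolean H x W array, True where this projector's light is occluded.
    Returns a float mask in [0, 1]: 0 inside the shadow, rising linearly to 1
    at `margin_px` pixels away from its boundary."""
    # Distance (in pixels) from each unshadowed pixel to the nearest shadow pixel;
    # pixels inside the shadow get distance 0.
    dist = distance_transform_edt(~shadow)
    # Linear ramp over the margin, clamped to [0, 1].
    return np.clip(dist / margin_px, 0.0, 1.0)

# A second, unoccluded projector would use the complementary weight (1 - mask)
# in the overlap region so that the combined image stays at the target brightness
# and the transition between projectors remains seam-free.
```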
dc.publisher | © 2019 The Eurographics Association and John Wiley & Sons Ltd. | en_US |
dc.subject | image processing | |
dc.subject | image and video processing | |
dc.subject | projected displays | |
dc.subject | hardware | |
dc.subject | immersive VR | |
dc.subject | virtual environments | |
dc.title | Real‐Time Human Shadow Removal in a Front Projection System | en_US |
dc.description.seriesinformation | Computer Graphics Forum | |
dc.description.sectionheaders | Articles | |
dc.description.volume | 38 | |
dc.description.number | 1 | |
dc.identifier.doi | 10.1111/cgf.13541 | |
dc.identifier.pages | 443-454 | |