dc.contributor.author: Lau, Manfred (en_US)
dc.contributor.author: Dev, Kapil (en_US)
dc.contributor.editor: Cagatay Turkay and Tao Ruan Wan (en_US)
dc.date.accessioned: 2016-09-15T09:05:57Z
dc.date.available: 2016-09-15T09:05:57Z
dc.date.issued: 2016
dc.identifier.isbn: 978-3-03868-022-2
dc.identifier.issn: -
dc.identifier.uri: http://dx.doi.org/10.2312/cgvc.20161302
dc.identifier.uri: https://diglib.eg.org:443/handle/10.2312/cgvc20161302
dc.description.abstract: This work has previously been published [LDS 16] and this extended abstract provides a synopsis for further discussion at the UK CGVC 2016 conference. We introduce the concept of tactile mesh saliency, where tactile salient points on a virtual mesh are those that a human is more likely to grasp, press, or touch if the mesh were a real-world object. We solve the problem of taking as input a 3D mesh and computing the tactile saliency of every mesh vertex. The key to solving this problem is in a new formulation that combines deep learning and learning-to-rank methods to compute a tactile saliency measure. Finally, we discuss possibilities for future work. (en_US)
dc.publisher: The Eurographics Association (en_US)
dc.subject: I.3.5 [Computer Graphics]
dc.subject: Computational Geometry and Object Modeling
dc.subject: Modeling Packages
dc.title: Tactile Mesh Saliency: A Brief Synopsis (en_US)
dc.description.seriesinformation: Computer Graphics and Visual Computing (CGVC)
dc.description.sectionheaders: Geometry and Surfaces
dc.identifier.doi: 10.2312/cgvc.20161302
dc.identifier.pages: 95-96
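
As a rough illustration of the "deep learning + learning-to-rank" formulation mentioned in the abstract, the sketch below trains a small per-vertex scoring network with a pairwise ranking loss. This is a minimal assumption-laden sketch, not the authors' method: the network architecture, the per-vertex feature vectors, and the ranked vertex pairs are all hypothetical placeholders.

```python
# Minimal sketch (assumptions throughout): score each mesh vertex with a small
# MLP and train it so that vertices ranked "more touchable" score higher than
# vertices ranked "less touchable" (RankNet-style pairwise loss).
import torch
import torch.nn as nn

class VertexSaliencyNet(nn.Module):
    """Maps a per-vertex feature vector to a scalar tactile-saliency score."""
    def __init__(self, feat_dim: int = 64):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(feat_dim, 128), nn.ReLU(),
            nn.Linear(128, 64), nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.mlp(x).squeeze(-1)  # one score per vertex

def pairwise_rank_loss(score_hi: torch.Tensor, score_lo: torch.Tensor) -> torch.Tensor:
    # Penalize pairs where the "more salient" vertex does not out-score the other.
    return nn.functional.softplus(score_lo - score_hi).mean()

# Toy usage with random per-vertex features and hypothetical ranked pairs.
net = VertexSaliencyNet()
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
feats = torch.randn(1000, 64)             # placeholder per-vertex descriptors
hi_idx = torch.randint(0, 1000, (256,))   # vertices labelled "more touchable"
lo_idx = torch.randint(0, 1000, (256,))   # vertices labelled "less touchable"
scores = net(feats)
loss = pairwise_rank_loss(scores[hi_idx], scores[lo_idx])
loss.backward()
opt.step()
```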

