Article: ManipNet: neural manipulation synthesis with a hand-object spatial representation

TitleManipNet: neural manipulation synthesis with a hand-object spatial representation
AuthorsZhang, H; Ye, Y; Shiratori, T; Komura, T
Issue Date2021
PublisherAssociation for Computing Machinery, Inc. The Journal's web site is located at http://tog.acm.org
Citation
ACM Transactions on Graphics, 2021, v. 40 n. 4, p. article no. 121
AbstractNatural hand manipulations exhibit complex finger maneuvers adaptive to object shapes and the tasks at hand. Learning dexterous manipulation from data in a brute force way would require a prohibitive amount of examples to effectively cover the combinatorial space of 3D shapes and activities. In this paper, we propose a hand-object spatial representation that can achieve generalization from limited data. Our representation combines the global object shape as voxel occupancies with local geometric details as samples of closest distances. This representation is used by a neural network to regress finger motions from input trajectories of wrists and objects. Specifically, we provide the network with the current finger pose, past and future trajectories, and the spatial representations extracted from these trajectories. The network then predicts a new finger pose for the next frame as an autoregressive model. With a carefully chosen hand-centric coordinate system, we can handle single-handed and two-handed motions in a unified framework. Learning from a small number of primitive shapes and kitchenware objects, the network is able to synthesize a variety of finger gaits for grasping, in-hand manipulation, and bimanual object handling on a rich set of novel shapes and functional tasks. We also demonstrate a live demo of manipulating virtual objects in real-time using a simple physical prop. Our system is useful for offline animation or real-time applications forgiving to a small delay.
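The abstract describes a two-part spatial representation: the global object shape encoded as voxel occupancies in a hand-centric frame, plus local geometric detail encoded as closest-distance samples from the hand to the object surface. The sketch below illustrates that idea in minimal numpy; the grid resolution, cube extent, and sample points are illustrative assumptions, not the values or the exact feature layout used in the paper.

```python
# Hypothetical sketch of a hand-object spatial representation:
# (1) a coarse voxel-occupancy grid of the object around the hand, and
# (2) closest-distance samples from hand points to the object surface.
import numpy as np

def voxel_occupancy(obj_points, grid_res=8, half_extent=0.15):
    """Binary occupancy of object surface points in a hand-centric cube.

    obj_points: (N, 3) object surface points, already expressed in the
    hand-centric coordinate frame (meters). Points outside the cube
    [-half_extent, half_extent]^3 are ignored."""
    grid = np.zeros((grid_res,) * 3, dtype=np.float32)
    # Map each point from [-half_extent, half_extent] to a voxel index.
    idx = np.floor((obj_points + half_extent) / (2 * half_extent) * grid_res)
    idx = idx.astype(int)
    inside = np.all((idx >= 0) & (idx < grid_res), axis=1)
    idx = idx[inside]
    grid[idx[:, 0], idx[:, 1], idx[:, 2]] = 1.0
    return grid

def closest_distances(hand_points, obj_points):
    """For each hand sample point, distance to the nearest object point."""
    # (H, 1, 3) - (1, N, 3) -> (H, N) pairwise distances; brute force is
    # fine for small point sets, a KD-tree would scale better.
    d = np.linalg.norm(hand_points[:, None, :] - obj_points[None, :, :], axis=-1)
    return d.min(axis=1)

# Toy example: object surface = a small sphere near the hand origin.
rng = np.random.default_rng(0)
sphere = rng.normal(size=(2000, 3))
sphere = 0.05 * sphere / np.linalg.norm(sphere, axis=1, keepdims=True)

occ = voxel_occupancy(sphere)                   # global shape as occupancies
fingertips = np.array([[0.05, 0.0, 0.0],        # two illustrative hand samples
                       [0.0, 0.06, 0.0]])
dists = closest_distances(fingertips, sphere)   # local geometric detail
feature = np.concatenate([occ.ravel(), dists])  # one slice of network input
```

In the paper's pipeline, features of this kind (extracted along past and future wrist/object trajectories) are concatenated with the current finger pose and fed to an autoregressive network that predicts the next frame's finger pose.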
Persistent Identifierhttp://hdl.handle.net/10722/304080
ISSN0730-0301
2021 Impact Factor: 7.403
2020 SCImago Journal Rankings: 2.153
ISI Accession Number IDWOS:000674930900086


DC FieldValueLanguage
dc.contributor.authorZhang, H-
dc.contributor.authorYe, Y-
dc.contributor.authorShiratori, T-
dc.contributor.authorKomura, T-
dc.date.accessioned2021-09-23T08:54:57Z-
dc.date.available2021-09-23T08:54:57Z-
dc.date.issued2021-
dc.identifier.citationACM Transactions on Graphics, 2021, v. 40 n. 4, p. article no. 121-
dc.identifier.issn0730-0301-
dc.identifier.urihttp://hdl.handle.net/10722/304080-
dc.description.abstractNatural hand manipulations exhibit complex finger maneuvers adaptive to object shapes and the tasks at hand. Learning dexterous manipulation from data in a brute force way would require a prohibitive amount of examples to effectively cover the combinatorial space of 3D shapes and activities. In this paper, we propose a hand-object spatial representation that can achieve generalization from limited data. Our representation combines the global object shape as voxel occupancies with local geometric details as samples of closest distances. This representation is used by a neural network to regress finger motions from input trajectories of wrists and objects. Specifically, we provide the network with the current finger pose, past and future trajectories, and the spatial representations extracted from these trajectories. The network then predicts a new finger pose for the next frame as an autoregressive model. With a carefully chosen hand-centric coordinate system, we can handle single-handed and two-handed motions in a unified framework. Learning from a small number of primitive shapes and kitchenware objects, the network is able to synthesize a variety of finger gaits for grasping, in-hand manipulation, and bimanual object handling on a rich set of novel shapes and functional tasks. We also demonstrate a live demo of manipulating virtual objects in real-time using a simple physical prop. Our system is useful for offline animation or real-time applications forgiving to a small delay.-
dc.languageeng-
dc.publisherAssociation for Computing Machinery, Inc. The Journal's web site is located at http://tog.acm.org-
dc.relation.ispartofACM Transactions on Graphics-
dc.rightsACM Transactions on Graphics. Copyright © Association for Computing Machinery, Inc.-
dc.rights©ACM, YYYY. This is the author's version of the work. It is posted here by permission of ACM for your personal use. Not for redistribution. The definitive version was published in PUBLICATION, {VOL#, ISS#, (DATE)} http://doi.acm.org/10.1145/nnnnnn.nnnnnn-
dc.titleManipNet: neural manipulation synthesis with a hand-object spatial representation-
dc.typeArticle-
dc.identifier.emailKomura, T: taku@cs.hku.hk-
dc.identifier.authorityKomura, T=rp02741-
dc.description.naturelink_to_subscribed_fulltext-
dc.identifier.doi10.1145/3450626.3459830-
dc.identifier.scopuseid_2-s2.0-85111272754-
dc.identifier.hkuros325507-
dc.identifier.volume40-
dc.identifier.issue4-
dc.identifier.spagearticle no. 121-
dc.identifier.epagearticle no. 121-
dc.identifier.isiWOS:000674930900086-
dc.publisher.placeUnited States-
