Conference Paper: Towards natural and accurate future motion prediction of humans and animals

Title: Towards natural and accurate future motion prediction of humans and animals
Authors: Liu, Zhenguang; Wu, Shuang; Jin, Shuyuan; Liu, Qi; Lu, Shijian; Zimmermann, Roger; Cheng, Li
Keywords: Deep Learning; Motion and Tracking
Issue Date: 2019
Citation: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 2019, v. 2019-June, p. 9996-10004
Abstract: Anticipating the future motions of 3D articulated objects is challenging due to their non-linear and highly stochastic nature. Current approaches typically represent the skeleton of an articulated object as a set of 3D joints, which unfortunately ignores the relationships between joints and fails to encode fine-grained anatomical constraints. Moreover, conventional recurrent neural networks such as LSTM and GRU, which are commonly employed to model motion contexts, inherently have difficulty capturing long-term dependencies. To address these problems, we propose to explicitly encode anatomical constraints by modeling skeletons with a Lie algebra representation. Importantly, a hierarchical recurrent network structure is developed to simultaneously encode local contexts of individual frames and global contexts of the sequence. We further explore applications of our approach to several distinct subjects, including humans, fish, and mice. Extensive experiments show that our approach achieves more natural and accurate predictions than state-of-the-art methods.
Persistent Identifier: http://hdl.handle.net/10722/321875
ISSN: 1063-6919
2023 SCImago Journal Rankings: 10.331
ISI Accession Number ID: WOS:000542649303062

 
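The Lie algebra representation mentioned in the abstract models each bone as a rotation rather than raw 3D joint coordinates, so bone lengths stay fixed by construction. A minimal sketch of this idea (not the authors' implementation; the rest-pose bone and rotation values below are illustrative assumptions) uses Rodrigues' formula to map an axis-angle vector in so(3) to a rotation matrix in SO(3):

```python
import numpy as np

def exp_map(omega):
    """Rodrigues' formula: map an axis-angle vector omega in so(3)
    to a 3x3 rotation matrix in SO(3)."""
    theta = np.linalg.norm(omega)
    if theta < 1e-12:
        return np.eye(3)  # near-zero rotation: identity
    k = omega / theta  # unit rotation axis
    # Skew-symmetric cross-product matrix of k
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)

# A bone of fixed length rotated by omega: the anatomical constraint
# (constant bone length) is preserved automatically, since rotations
# are isometries. Values here are hypothetical examples.
bone_rest = np.array([0.0, 0.0, 1.0])    # rest-pose bone direction, unit length
omega = np.array([np.pi / 2, 0.0, 0.0])  # 90-degree rotation about the x-axis
bone_posed = exp_map(omega) @ bone_rest
```

Predicting rotations in this parameterization, instead of joint positions directly, is what rules out implausible poses such as stretched or shrunken limbs.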

DC Field: Value
dc.contributor.author: Liu, Zhenguang
dc.contributor.author: Wu, Shuang
dc.contributor.author: Jin, Shuyuan
dc.contributor.author: Liu, Qi
dc.contributor.author: Lu, Shijian
dc.contributor.author: Zimmermann, Roger
dc.contributor.author: Cheng, Li
dc.date.accessioned: 2022-11-03T02:22:03Z
dc.date.available: 2022-11-03T02:22:03Z
dc.date.issued: 2019
dc.identifier.citation: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 2019, v. 2019-June, p. 9996-10004
dc.identifier.issn: 1063-6919
dc.identifier.uri: http://hdl.handle.net/10722/321875
dc.description.abstract: Anticipating the future motions of 3D articulate objects is challenging due to its non-linear and highly stochastic nature. Current approaches typically represent the skeleton of an articulate object as a set of 3D joints, which unfortunately ignores the relationship between joints, and fails to encode fine-grained anatomical constraints. Moreover, conventional recurrent neural networks, such as LSTM and GRU, are employed to model motion contexts, which inherently have difficulties in capturing long-term dependencies. To address these problems, we propose to explicitly encode anatomical constraints by modeling their skeletons with a Lie algebra representation. Importantly, a hierarchical recurrent network structure is developed to simultaneously encodes local contexts of individual frames and global contexts of the sequence. We proceed to explore the applications of our approach to several distinct quantities including human, fish, and mouse. Extensive experiments show that our approach achieves more natural and accurate predictions over state-of-the-art methods.
dc.language: eng
dc.relation.ispartof: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition
dc.subject: Deep Learning
dc.subject: Motion and Tracking
dc.title: Towards natural and accurate future motion prediction of humans and animals
dc.type: Conference_Paper
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.doi: 10.1109/CVPR.2019.01024
dc.identifier.scopus: eid_2-s2.0-85078724907
dc.identifier.volume: 2019-June
dc.identifier.spage: 9996
dc.identifier.epage: 10004
dc.identifier.isi: WOS:000542649303062
