Conference Paper: A hybrid deep architecture for robotic grasp detection

Title: A hybrid deep architecture for robotic grasp detection
Authors: Guo, D; Sun, F; Liu, H; Kong, T; Fang, B; Xi, N
Issue Date: 2017
Publisher: IEEE, Computer Society. The Journal's web site is located at http://ieeexplore.ieee.org/xpl/conhome.jsp?punumber=1000639
Citation: Proceedings of 2017 IEEE International Conference on Robotics and Automation (ICRA), Singapore, 29 May-3 June 2017, p. 1609-1614
Abstract: Robotic grasp detection is a major challenge in robotics. Previous work mainly employs visual approaches to solve this problem. In this paper, a hybrid deep architecture combining visual and tactile sensing for robotic grasp detection is proposed. We demonstrate that visual and tactile sensing are complementary to each other and both important for robotic grasping. A new THU grasp dataset has also been collected, which contains visual, tactile, and grasp configuration information. Experiments conducted on a public grasp dataset and our collected dataset show that the proposed model outperforms state-of-the-art methods. The results also indicate that tactile data can help the network learn better visual features for the robotic grasp detection task.
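The abstract describes combining visual and tactile features in one network. As a purely illustrative sketch (the dimensions, the late-fusion-by-concatenation design, and all parameters here are hypothetical assumptions, not taken from the paper), the core idea of fusing two modality feature vectors before a grasp/no-grasp prediction head could look like:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical feature dimensions (not from the paper).
VIS_DIM, TAC_DIM, N_CLASSES = 128, 32, 2  # classes: grasp / no-grasp

def fuse_and_score(visual_feat, tactile_feat, W, b):
    """Late fusion: concatenate modality features, apply a linear head."""
    fused = np.concatenate([visual_feat, tactile_feat])  # shape (160,)
    logits = W @ fused + b                               # shape (2,)
    # Softmax over the grasp / no-grasp logits.
    e = np.exp(logits - logits.max())
    return e / e.sum()

# Randomly initialised stand-ins for learned parameters.
W = rng.normal(size=(N_CLASSES, VIS_DIM + TAC_DIM)) * 0.01
b = np.zeros(N_CLASSES)

visual_feat = rng.normal(size=VIS_DIM)   # e.g. CNN features of the image
tactile_feat = rng.normal(size=TAC_DIM)  # e.g. processed tactile readings

probs = fuse_and_score(visual_feat, tactile_feat, W, b)
print(probs)  # two probabilities summing to 1
```

In a trained system, `W` and `b` would be learned jointly with the feature extractors, which is how tactile supervision could shape the visual features, as the abstract suggests.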
Persistent Identifier: http://hdl.handle.net/10722/261615
ISSN: 1050-4729
2020 SCImago Journal Rankings: 0.915

DC Field: Value
dc.contributor.author: Guo, D
dc.contributor.author: Sun, F
dc.contributor.author: Liu, H
dc.contributor.author: Kong, T
dc.contributor.author: Fang, B
dc.contributor.author: Xi, N
dc.date.accessioned: 2018-09-28T04:44:43Z
dc.date.available: 2018-09-28T04:44:43Z
dc.date.issued: 2017
dc.identifier.citation: Proceedings of 2017 IEEE International Conference on Robotics and Automation (ICRA), Singapore, 29 May-3 June 2017, p. 1609-1614
dc.identifier.issn: 1050-4729
dc.identifier.uri: http://hdl.handle.net/10722/261615
dc.description.abstract: The robotic grasp detection is a great challenge in the area of robotics. Previous work mainly employs the visual approaches to solve this problem. In this paper, a hybrid deep architecture combining the visual and tactile sensing for robotic grasp detection is proposed. We have demonstrated that the visual sensing and tactile sensing are complementary to each other and important for the robotic grasping. A new THU grasp dataset has also been collected which contains the visual, tactile and grasp configuration information. The experiments conducted on a public grasp dataset and our collected dataset show that the performance of the proposed model is superior to state of the art methods. The results also indicate that the tactile data could help to enable the network to learn better visual features for the robotic grasp detection task.
dc.language: eng
dc.publisher: IEEE, Computer Society. The Journal's web site is located at http://ieeexplore.ieee.org/xpl/conhome.jsp?punumber=1000639
dc.relation.ispartof: IEEE International Conference on Robotics and Automation
dc.rights: IEEE International Conference on Robotics and Automation. Copyright © IEEE, Computer Society.
dc.title: A hybrid deep architecture for robotic grasp detection
dc.type: Conference_Paper
dc.identifier.email: Xi, N: xining@hku.hk
dc.identifier.authority: Xi, N=rp02044
dc.identifier.doi: 10.1109/ICRA.2017.7989191
dc.identifier.scopus: eid_2-s2.0-85024905678
dc.identifier.hkuros: 292796
dc.identifier.spage: 1609
dc.identifier.epage: 1614
dc.publisher.place: United States
dc.identifier.issnl: 1050-4729
