Conference Paper: A hybrid deep architecture for robotic grasp detection
Title | A hybrid deep architecture for robotic grasp detection |
---|---|
Authors | Guo, D; Sun, F; Liu, H; Kong, T; Fang, B; Xi, N |
Issue Date | 2017 |
Publisher | IEEE, Computer Society. The Journal's web site is located at http://ieeexplore.ieee.org/xpl/conhome.jsp?punumber=1000639 |
Citation | Proceedings of 2017 IEEE International Conference on Robotics and Automation (ICRA), Singapore, 29 May-3 June 2017, p. 1609-1614 |
Abstract | Robotic grasp detection is a major challenge in robotics. Previous work mainly employs visual approaches to solve this problem. In this paper, a hybrid deep architecture combining visual and tactile sensing for robotic grasp detection is proposed. We demonstrate that visual and tactile sensing are complementary and both important for robotic grasping. A new THU grasp dataset has also been collected, containing visual, tactile, and grasp-configuration information. Experiments conducted on a public grasp dataset and on our collected dataset show that the proposed model outperforms state-of-the-art methods. The results also indicate that tactile data can help the network learn better visual features for the robotic grasp detection task. |
Persistent Identifier | http://hdl.handle.net/10722/261615 |
ISSN | 1050-4729 |
2023 SCImago Journal Rankings | 1.620 |
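The abstract describes fusing visual and tactile sensing in one deep network for grasp detection. As a hedged illustration only (this record does not give the paper's actual architecture, and all names and dimensions below are assumptions), a generic late-fusion scoring step could be sketched as:

```python
# Hypothetical late-fusion sketch of a visual + tactile grasp scorer.
# This is NOT the paper's architecture; feature sizes and weights are
# made up purely to show the concatenate-then-score pattern.
import numpy as np

rng = np.random.default_rng(0)

def fuse_and_score(visual_feat, tactile_feat, w, b):
    """Concatenate visual and tactile feature vectors, then apply a
    linear layer followed by a sigmoid to score one grasp candidate."""
    fused = np.concatenate([visual_feat, tactile_feat])
    logit = fused @ w + b
    return 1.0 / (1.0 + np.exp(-logit))

# Toy dimensions: 128-d visual features, 32-d tactile features.
visual = rng.standard_normal(128)
tactile = rng.standard_normal(32)
w = rng.standard_normal(128 + 32) * 0.01  # linear-layer weights
b = 0.0                                   # linear-layer bias

score = fuse_and_score(visual, tactile, w, b)
print(f"grasp score in [0, 1]: {score:.3f}")
```

In a trained model the two feature vectors would come from learned visual and tactile encoders, and `w`, `b` would be learned; the sketch only shows why the fused representation can exploit both modalities at once.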
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Guo, D | - |
dc.contributor.author | Sun, F | - |
dc.contributor.author | Liu, H | - |
dc.contributor.author | Kong, T | - |
dc.contributor.author | Fang, B | - |
dc.contributor.author | Xi, N | - |
dc.date.accessioned | 2018-09-28T04:44:43Z | - |
dc.date.available | 2018-09-28T04:44:43Z | - |
dc.date.issued | 2017 | - |
dc.identifier.citation | Proceedings of 2017 IEEE International Conference on Robotics and Automation (ICRA), Singapore, 29 May-3 June 2017, p. 1609-1614 | - |
dc.identifier.issn | 1050-4729 | - |
dc.identifier.uri | http://hdl.handle.net/10722/261615 | - |
dc.description.abstract | Robotic grasp detection is a major challenge in robotics. Previous work mainly employs visual approaches to solve this problem. In this paper, a hybrid deep architecture combining visual and tactile sensing for robotic grasp detection is proposed. We demonstrate that visual and tactile sensing are complementary and both important for robotic grasping. A new THU grasp dataset has also been collected, containing visual, tactile, and grasp-configuration information. Experiments conducted on a public grasp dataset and on our collected dataset show that the proposed model outperforms state-of-the-art methods. The results also indicate that tactile data can help the network learn better visual features for the robotic grasp detection task. | - |
dc.language | eng | - |
dc.publisher | IEEE, Computer Society. The Journal's web site is located at http://ieeexplore.ieee.org/xpl/conhome.jsp?punumber=1000639 | - |
dc.relation.ispartof | IEEE International Conference on Robotics and Automation | - |
dc.rights | IEEE International Conference on Robotics and Automation. Copyright © IEEE, Computer Society. | - |
dc.title | A hybrid deep architecture for robotic grasp detection | - |
dc.type | Conference_Paper | - |
dc.identifier.email | Xi, N: xining@hku.hk | - |
dc.identifier.authority | Xi, N=rp02044 | - |
dc.identifier.doi | 10.1109/ICRA.2017.7989191 | - |
dc.identifier.scopus | eid_2-s2.0-85024905678 | - |
dc.identifier.hkuros | 292796 | - |
dc.identifier.spage | 1609 | - |
dc.identifier.epage | 1614 | - |
dc.publisher.place | United States | - |
dc.identifier.issnl | 1050-4729 | - |