Links for fulltext (may require subscription):
- Publisher Website (DOI): 10.1109/LRA.2020.2969932
- Scopus: eid_2-s2.0-85079797690
- Web of Science: WOS:000526520500007
Article: Rigid-Soft Interactive Learning for Robust Grasping
Field | Value
---|---
Title | Rigid-Soft Interactive Learning for Robust Grasping
Authors | YANG, L; WAN, F; WANG, H; LIU, X; LIU, Y; Pan, J; SONG, C
Keywords | Robots; Grasping; Grippers; Benchmark testing; Learning systems
Issue Date | 2020
Publisher | Institute of Electrical and Electronics Engineers. The Journal's web site is located at https://www.ieee.org/membership-catalog/productdetail/showProductDetailPage.html?product=PER481-ELE
Citation | IEEE Robotics and Automation Letters, 2020, v. 5 n. 2, p. 1720-1727
Abstract | Robot learning is widely accepted by academia and industry for its potential to transform autonomous robot control through machine learning. Inspired by the widespread use of soft fingers in grasping, we propose a method of rigid-soft interactive learning aimed at reducing the time needed for data collection. In this letter, we classify the interaction categories into Rigid-Rigid, Rigid-Soft, and Soft-Rigid according to the interaction surface between grippers and target objects. We find experimental evidence that the interaction types between grippers and target objects play an essential role in the learning methods. We use soft, stuffed toys for training, instead of everyday objects, to reduce the integration complexity and computational burden. Although the stuffed toys are limited in reflecting the physics of finger-object interaction in real-life scenarios, we exploit such rigid-soft interaction by changing the gripper fingers to soft ones when dealing with rigid, daily-life items such as the Yale-CMU-Berkeley (YCB) objects. With a small data collection of 5K picking attempts in total, our results suggest that such Rigid-Soft and Soft-Rigid interactions are transferable. Moreover, the combination of such interactions shows better performance on the grasping test. We also explore the effect of the grasp type on the learning method by changing the gripper configurations. We achieve the best grasping performance of 97.5% for easy YCB objects and 81.3% for difficult YCB objects while using a precise grasp with a two-soft-finger gripper to collect training data and a power grasp with a four-soft-finger gripper to test the grasp policy.
Persistent Identifier | http://hdl.handle.net/10722/285106
ISSN | 2377-3766 (2023 Impact Factor: 4.6; 2023 SCImago Journal Rankings: 2.119)
ISI Accession Number ID | WOS:000526520500007
DC Field | Value | Language |
---|---|---|
dc.contributor.author | YANG, L | - |
dc.contributor.author | WAN, F | - |
dc.contributor.author | WANG, H | - |
dc.contributor.author | LIU, X | - |
dc.contributor.author | LIU, Y | - |
dc.contributor.author | Pan, J | - |
dc.contributor.author | SONG, C | - |
dc.date.accessioned | 2020-08-07T09:06:50Z | - |
dc.date.available | 2020-08-07T09:06:50Z | - |
dc.date.issued | 2020 | - |
dc.identifier.citation | IEEE Robotics and Automation Letters, 2020, v. 5 n. 2, p. 1720-1727 | - |
dc.identifier.issn | 2377-3766 | - |
dc.identifier.uri | http://hdl.handle.net/10722/285106 | - |
dc.description.abstract | Robot learning is widely accepted by academia and industry for its potential to transform autonomous robot control through machine learning. Inspired by the widespread use of soft fingers in grasping, we propose a method of rigid-soft interactive learning aimed at reducing the time needed for data collection. In this letter, we classify the interaction categories into Rigid-Rigid, Rigid-Soft, and Soft-Rigid according to the interaction surface between grippers and target objects. We find experimental evidence that the interaction types between grippers and target objects play an essential role in the learning methods. We use soft, stuffed toys for training, instead of everyday objects, to reduce the integration complexity and computational burden. Although the stuffed toys are limited in reflecting the physics of finger-object interaction in real-life scenarios, we exploit such rigid-soft interaction by changing the gripper fingers to soft ones when dealing with rigid, daily-life items such as the Yale-CMU-Berkeley (YCB) objects. With a small data collection of 5K picking attempts in total, our results suggest that such Rigid-Soft and Soft-Rigid interactions are transferable. Moreover, the combination of such interactions shows better performance on the grasping test. We also explore the effect of the grasp type on the learning method by changing the gripper configurations. We achieve the best grasping performance of 97.5% for easy YCB objects and 81.3% for difficult YCB objects while using a precise grasp with a two-soft-finger gripper to collect training data and a power grasp with a four-soft-finger gripper to test the grasp policy. | -
dc.language | eng | - |
dc.publisher | Institute of Electrical and Electronics Engineers. The Journal's web site is located at https://www.ieee.org/membership-catalog/productdetail/showProductDetailPage.html?product=PER481-ELE | - |
dc.relation.ispartof | IEEE Robotics and Automation Letters | - |
dc.rights | IEEE Robotics and Automation Letters. Copyright © Institute of Electrical and Electronics Engineers. | - |
dc.rights | ©20xx IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. | - |
dc.subject | Robots | - |
dc.subject | Grasping | - |
dc.subject | Grippers | - |
dc.subject | Benchmark testing | - |
dc.subject | Learning systems | - |
dc.title | Rigid-Soft Interactive Learning for Robust Grasping | - |
dc.type | Article | - |
dc.identifier.email | Pan, J: jpan@cs.hku.hk | - |
dc.identifier.authority | Pan, J=rp01984 | - |
dc.description.nature | link_to_subscribed_fulltext | - |
dc.identifier.doi | 10.1109/LRA.2020.2969932 | - |
dc.identifier.scopus | eid_2-s2.0-85079797690 | - |
dc.identifier.hkuros | 312150 | - |
dc.identifier.volume | 5 | - |
dc.identifier.issue | 2 | - |
dc.identifier.spage | 1720 | - |
dc.identifier.epage | 1727 | - |
dc.identifier.isi | WOS:000526520500007 | - |
dc.publisher.place | United States | - |
dc.identifier.issnl | 2377-3766 | - |
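The Dublin Core fields above map directly onto a standard BibTeX `@article` entry. A minimal sketch of that mapping, with all field values copied verbatim from the record (the helper name `record_to_bibtex` and the citation key are illustrative, not part of the record):

```python
# Assemble a BibTeX entry from the Dublin Core fields of this record.
record = {
    "dc.contributor.author": ["YANG, L", "WAN, F", "WANG, H", "LIU, X",
                              "LIU, Y", "Pan, J", "SONG, C"],
    "dc.title": "Rigid-Soft Interactive Learning for Robust Grasping",
    "dc.relation.ispartof": "IEEE Robotics and Automation Letters",
    "dc.date.issued": "2020",
    "dc.identifier.volume": "5",
    "dc.identifier.issue": "2",
    "dc.identifier.spage": "1720",
    "dc.identifier.epage": "1727",
    "dc.identifier.doi": "10.1109/LRA.2020.2969932",
}

def record_to_bibtex(rec, key="yang2020rigidsoft"):
    """Render a Dublin Core record as a BibTeX @article entry."""
    # BibTeX separates multiple authors with " and ".
    authors = " and ".join(rec["dc.contributor.author"])
    return (
        f"@article{{{key},\n"
        f"  author  = {{{authors}}},\n"
        f"  title   = {{{rec['dc.title']}}},\n"
        f"  journal = {{{rec['dc.relation.ispartof']}}},\n"
        f"  year    = {{{rec['dc.date.issued']}}},\n"
        f"  volume  = {{{rec['dc.identifier.volume']}}},\n"
        f"  number  = {{{rec['dc.identifier.issue']}}},\n"
        f"  pages   = {{{rec['dc.identifier.spage']}--{rec['dc.identifier.epage']}}},\n"
        f"  doi     = {{{rec['dc.identifier.doi']}}}\n"
        f"}}"
    )

print(record_to_bibtex(record))
```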