Article: R2LIVE: A Robust, Real-time, LiDAR-Inertial-Visual tightly-coupled state Estimator and mapping

Title: R2LIVE: A Robust, Real-time, LiDAR-Inertial-Visual tightly-coupled state Estimator and mapping
Authors: Lin, J; Zheng, C; Xu, W; Zhang, F
Issue Date: 2021
Publisher: Institute of Electrical and Electronics Engineers. The Journal's web site is located at https://www.ieee.org/membership-catalog/productdetail/showProductDetailPage.html?product=PER481-ELE
Citation: IEEE Robotics and Automation Letters, Forthcoming
Abstract: In this letter, we propose a robust, real-time, tightly-coupled multi-sensor fusion framework, which fuses measurements from LiDAR, an inertial sensor, and a visual camera to achieve robust and accurate state estimation. Our proposed framework is composed of two parts: filter-based odometry and factor graph optimization. To guarantee real-time performance, we estimate the state within the framework of an error-state iterated Kalman filter, and further improve the overall precision with our factor graph optimization. Taking advantage of measurements from all the individual sensors, our algorithm is robust to various visual-failure and LiDAR-degenerated scenarios, and is able to run in real time on an on-board computation platform, as shown by extensive experiments conducted in indoor, outdoor, and mixed environments of different scales. Moreover, the results show that our proposed framework can improve the accuracy of state-of-the-art LiDAR-inertial or visual-inertial odometry. To share our findings and contribute to the community, we open-source our code on GitHub.
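For context on the filtering half of the pipeline, the sketch below shows the generic iterated update step of an error-state iterated Kalman filter, which re-linearises the measurement model around each refreshed estimate before applying the correction. This is a minimal illustration only: the state layout, the scalar range measurement, and all function and variable names are assumptions made for the example, not the formulation or code used in R2LIVE.

```python
# Minimal sketch of an iterated (error-state style) Kalman filter update.
# The state, measurement model, and names below are illustrative assumptions,
# not taken from the R2LIVE paper or its released code.
import numpy as np

def esikf_update(x, P, z, h, H_of, R, iters=5, tol=1e-6):
    """Iterated Kalman update: re-linearise h(.) around the current iterate
    until the correction converges, then update the covariance."""
    x_i = x.copy()
    for _ in range(iters):
        H = H_of(x_i)                           # measurement Jacobian at current iterate
        S = H @ P @ H.T + R                     # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
        # Standard iterated-EKF correction, anchored at the prior estimate x.
        dx = K @ (z - h(x_i)) + (np.eye(len(x)) - K @ H) @ (x - x_i)
        x_i = x_i + dx
        if np.linalg.norm(dx) < tol:            # stop once the update is negligible
            break
    P_new = (np.eye(len(x)) - K @ H) @ P        # posterior covariance
    return x_i, P_new

# Toy usage: a 2-D position observed through a single range measurement,
# included only to make the sketch runnable.
x0 = np.array([1.0, 2.0])
P0 = np.eye(2) * 0.5
h = lambda x: np.array([np.linalg.norm(x)])
H_of = lambda x: (x / np.linalg.norm(x)).reshape(1, 2)
z = np.array([2.0])
R = np.array([[0.01]])
x1, P1 = esikf_update(x0, P0, z, h, H_of, R)
print(x1, P1)
```

The toy example observes only a 2-D position through one range reading; in the paper's setting the same kind of iterated update would instead ingest LiDAR, inertial, and camera measurements.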
Persistent Identifier: http://hdl.handle.net/10722/301546
ISSN: 2377-3766
2021 Impact Factor: 4.321
2020 SCImago Journal Rankings: 1.123

 

DC Field: Value

dc.contributor.author: Lin, J
dc.contributor.author: Zheng, C
dc.contributor.author: Xu, W
dc.contributor.author: Zhang, F
dc.date.accessioned: 2021-08-09T03:40:39Z
dc.date.available: 2021-08-09T03:40:39Z
dc.date.issued: 2021
dc.identifier.citation: IEEE Robotics and Automation Letters, Forthcoming
dc.identifier.issn: 2377-3766
dc.identifier.uri: http://hdl.handle.net/10722/301546
dc.description.abstract: In this letter, we propose a robust, real-time, tightly-coupled multi-sensor fusion framework, which fuses measurements from LiDAR, an inertial sensor, and a visual camera to achieve robust and accurate state estimation. Our proposed framework is composed of two parts: filter-based odometry and factor graph optimization. To guarantee real-time performance, we estimate the state within the framework of an error-state iterated Kalman filter, and further improve the overall precision with our factor graph optimization. Taking advantage of measurements from all the individual sensors, our algorithm is robust to various visual-failure and LiDAR-degenerated scenarios, and is able to run in real time on an on-board computation platform, as shown by extensive experiments conducted in indoor, outdoor, and mixed environments of different scales. Moreover, the results show that our proposed framework can improve the accuracy of state-of-the-art LiDAR-inertial or visual-inertial odometry. To share our findings and contribute to the community, we open-source our code on GitHub.
dc.language: eng
dc.publisher: Institute of Electrical and Electronics Engineers. The Journal's web site is located at https://www.ieee.org/membership-catalog/productdetail/showProductDetailPage.html?product=PER481-ELE
dc.relation.ispartof: IEEE Robotics and Automation Letters
dc.rights: IEEE Robotics and Automation Letters. Copyright © Institute of Electrical and Electronics Engineers.
dc.rights: ©20xx IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
dc.title: R2LIVE: A Robust, Real-time, LiDAR-Inertial-Visual tightly-coupled state Estimator and mapping
dc.type: Article
dc.identifier.email: Zhang, F: fuzhang@hku.hk
dc.identifier.authority: Zhang, F=rp02460
dc.identifier.hkuros: 324129
dc.identifier.volume: Forthcoming
dc.publisher.place: United States