Article: FAST-LIVO2: Fast, Direct LiDAR-Inertial-Visual Odometry

Title: FAST-LIVO2: Fast, Direct LiDAR-Inertial-Visual Odometry
Authors: Zheng, Chunran; Xu, Wei; Zou, Zuhao; Hua, Tong; Yuan, Chongjian; He, Dongjiao; Zhou, Bingyang; Liu, Zheng; Lin, Jiarong; Zhu, Fangcheng; Ren, Yunfan; Wang, Rong; Meng, Fanle; Zhang, Fu
Keywords: 3D reconstruction; aerial navigation; sensor fusion; simultaneous localization and mapping (SLAM)
Issue Date: 19-Nov-2024
Publisher: Institute of Electrical and Electronics Engineers
Citation: IEEE Transactions on Robotics, 2024, v. 41, p. 326-346
Abstract: This paper proposes FAST-LIVO2, a fast, direct LiDAR-inertial-visual odometry framework that achieves accurate and robust state estimation in simultaneous localization and mapping (SLAM) tasks and offers great potential for real-time, onboard robotic applications. FAST-LIVO2 fuses IMU, LiDAR, and image measurements efficiently through an error-state iterated Kalman filter (ESIKF). To address the dimension mismatch between the heterogeneous LiDAR and image measurements, we use a sequential update strategy in the Kalman filter. To enhance efficiency, we use direct methods for both the visual and LiDAR fusion: the LiDAR module registers raw points without extracting edge or plane features, and the visual module minimizes direct photometric errors without extracting ORB or FAST corner features. The fusion of visual and LiDAR measurements is based on a single unified voxel map, where the LiDAR module constructs the geometric structure for registering new LiDAR scans and the visual module attaches image patches to the LiDAR points (i.e., visual map points), enabling new image alignment. To enhance the accuracy of image alignment, we use plane priors from the LiDAR points in the voxel map (and even refine the plane prior during the alignment process) and update the reference patch dynamically after new images are aligned. Furthermore, to enhance the robustness of image alignment, FAST-LIVO2 employs an on-demand raycast operation and estimates the image exposure time in real time. We conduct extensive experiments on both benchmark and private datasets, demonstrating that our proposed system significantly outperforms other state-of-the-art odometry systems in terms of accuracy, robustness, and computational efficiency. The effectiveness of key modules in the system is also validated.
Lastly, we detail three applications of FAST-LIVO2: UAV onboard navigation, demonstrating the system's computational efficiency for real-time onboard navigation; airborne mapping, showcasing the system's mapping accuracy; and 3D model rendering (mesh-based and NeRF-based), underscoring the suitability of our reconstructed dense map for subsequent rendering tasks. We open-source our code, datasets, and applications of this work on GitHub to benefit the robotics community.
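The sequential update strategy described in the abstract can be illustrated with a toy linear Kalman filter sketch (not the authors' ESIKF; the 2-state model, measurement models, and all noise values below are purely illustrative): each sensor's residual is absorbed in its own update pass, so the heterogeneous LiDAR (geometric) and camera (photometric) measurements never need to be stacked into a single residual vector.

```python
import numpy as np

def kalman_update(x, P, z, h, H, R):
    """One Kalman measurement update: state x, covariance P,
    measurement z with model h(x), Jacobian H, and noise covariance R."""
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x = x + K @ (z - h(x))               # state correction
    P = (np.eye(len(x)) - K @ H) @ P     # covariance update
    return x, P

# Toy 2-state system: the "LiDAR" observes state[0], the "camera" state[1].
x = np.zeros(2)
P = np.eye(2)

# LiDAR update first (geometric residual), then visual update
# (photometric residual), each with its own dimension and noise.
H_lidar = np.array([[1.0, 0.0]])
x, P = kalman_update(x, P, np.array([0.5]),
                     lambda s: H_lidar @ s, H_lidar, np.array([[0.01]]))

H_cam = np.array([[0.0, 1.0]])
x, P = kalman_update(x, P, np.array([-0.2]),
                     lambda s: H_cam @ s, H_cam, np.array([[0.04]]))
```

After both passes the posterior reflects each sensor's own noise level; in the real system the same idea is applied within the iterated filter, with the visual update additionally leveraging plane priors from the voxel map.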
Persistent Identifier: http://hdl.handle.net/10722/357442
ISSN: 1552-3098
2023 Impact Factor: 9.4
2023 SCImago Journal Rankings: 3.669
ISI Accession Number ID: WOS:001375722500008

 

DC Field: Value
dc.contributor.author: Zheng, Chunran
dc.contributor.author: Xu, Wei
dc.contributor.author: Zou, Zuhao
dc.contributor.author: Hua, Tong
dc.contributor.author: Yuan, Chongjian
dc.contributor.author: He, Dongjiao
dc.contributor.author: Zhou, Bingyang
dc.contributor.author: Liu, Zheng
dc.contributor.author: Lin, Jiarong
dc.contributor.author: Zhu, Fangcheng
dc.contributor.author: Ren, Yunfan
dc.contributor.author: Wang, Rong
dc.contributor.author: Meng, Fanle
dc.contributor.author: Zhang, Fu
dc.date.accessioned: 2025-06-26T00:30:02Z
dc.date.available: 2025-06-26T00:30:02Z
dc.date.issued: 2024-11-19
dc.identifier.citation: IEEE Transactions on Robotics, 2024, v. 41, p. 326-346
dc.identifier.issn: 1552-3098
dc.identifier.uri: http://hdl.handle.net/10722/357442
dc.description.abstract: (same as Abstract above)
dc.language: eng
dc.publisher: Institute of Electrical and Electronics Engineers
dc.relation.ispartof: IEEE Transactions on Robotics
dc.subject: 3D reconstruction
dc.subject: aerial navigation
dc.subject: sensor fusion
dc.subject: simultaneous localization and mapping (SLAM)
dc.title: FAST-LIVO2: Fast, Direct LiDAR-Inertial-Visual Odometry
dc.type: Article
dc.identifier.doi: 10.1109/TRO.2024.3502198
dc.identifier.scopus: eid_2-s2.0-85210406571
dc.identifier.volume: 41
dc.identifier.spage: 326
dc.identifier.epage: 346
dc.identifier.eissn: 1941-0468
dc.identifier.isi: WOS:001375722500008
dc.identifier.issnl: 1552-3098
