Article: ESVIO: Event-Based Stereo Visual Inertial Odometry

Title: ESVIO: Event-Based Stereo Visual Inertial Odometry
Authors: Chen, PY; Guan, WP; Lu, P
Keywords: aerial systems: perception and autonomy; sensor fusion; Visual-Inertial SLAM
Issue Date: 1-Jun-2023
Publisher: Institute of Electrical and Electronics Engineers
Citation: IEEE Robotics and Automation Letters, 2023, v. 8, n. 6, p. 3661-3668
Abstract:

Event cameras that asynchronously output low-latency event streams provide great opportunities for state estimation in challenging situations. Although event-based visual odometry has been extensively studied in recent years, most existing systems are monocular, and little research has addressed stereo event vision. In this letter, we present ESVIO, the first event-based stereo visual-inertial odometry pipeline, which leverages the complementary advantages of event streams, standard images, and inertial measurements. Our proposed pipeline includes ESIO (purely event-based) and ESVIO (event- and image-aided), both of which establish spatial and temporal associations between consecutive stereo event streams. A well-designed back-end tightly fuses the multi-sensor measurements to obtain robust state estimates. We validate that both ESIO and ESVIO outperform other image-based and event-based baseline methods on public and self-collected datasets. Furthermore, we use our pipeline to perform onboard quadrotor flights in low-light environments. Autonomous-driving data sequences and real-world large-scale experiments are also conducted to demonstrate long-term effectiveness. We highlight that this work is a real-time, accurate system aimed at robust state estimation in challenging environments.
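To make the abstract's pipeline description more concrete, the following is a minimal, hypothetical Python sketch (not the authors' released code): it accumulates an asynchronous event stream into a per-pixel time-surface image and then performs a brute-force descriptor association, standing in for both the spatial (left-right) and temporal (previous-current) associations mentioned above. The function names, decay constant, and patch descriptors are illustrative assumptions.

    import numpy as np

    # Hypothetical sketch (not the authors' code): accumulate an asynchronous
    # event stream into a per-pixel "time surface" so that standard feature
    # matching can be applied to it.
    def time_surface(events, height, width, t_ref, tau=0.03):
        """events: iterable of (x, y, t, polarity); exponential temporal decay."""
        surface = np.zeros((height, width), dtype=np.float32)
        for x, y, t, _ in events:
            surface[y, x] = np.exp(-(t_ref - t) / tau)
        return surface

    def associate(desc_a, desc_b, max_dist=10.0):
        """Brute-force nearest-neighbour matching between two descriptor sets.
        Stands in for both the spatial (left<->right) and the temporal
        (previous<->current) associations described in the abstract."""
        matches = []
        for i, d in enumerate(desc_a):
            dists = np.linalg.norm(desc_b - d, axis=1)
            j = int(np.argmin(dists))
            if dists[j] < max_dist:
                matches.append((i, j))
        return matches

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        # Synthetic stereo event streams: tuples of (x, y, timestamp, polarity).
        ev_left = [(rng.integers(0, 64), rng.integers(0, 48), rng.uniform(0.0, 0.03), 1)
                   for _ in range(500)]
        ev_right = [(rng.integers(0, 64), rng.integers(0, 48), rng.uniform(0.0, 0.03), 1)
                    for _ in range(500)]
        ts_l = time_surface(ev_left, 48, 64, t_ref=0.03)
        ts_r = time_surface(ev_right, 48, 64, t_ref=0.03)
        # Toy descriptors: flattened 8x8 patches around two fixed locations.
        desc_l = np.stack([ts_l[0:8, 0:8].ravel(), ts_l[8:16, 8:16].ravel()])
        desc_r = np.stack([ts_r[0:8, 0:8].ravel(), ts_r[8:16, 8:16].ravel()])
        print(associate(desc_l, desc_r))

In ESVIO itself, the associated features would then feed the tightly-coupled back-end that fuses event, image, and inertial measurements; that optimization stage is beyond this sketch.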


Persistent Identifier: http://hdl.handle.net/10722/337631
ISSN: 2377-3766
2021 Impact Factor: 4.321
2020 SCImago Journal Rankings: 1.123
ISI Accession Number: WOS:000981889200019

 

DC Field / Value
dc.contributor.author: Chen, PY
dc.contributor.author: Guan, WP
dc.contributor.author: Lu, P
dc.date.accessioned: 2024-03-11T10:22:40Z
dc.date.available: 2024-03-11T10:22:40Z
dc.date.issued: 2023-06-01
dc.identifier.citation: IEEE Robotics and Automation Letters, 2023, v. 8, n. 6, p. 3661-3668
dc.identifier.issn: 2377-3766
dc.identifier.uri: http://hdl.handle.net/10722/337631
dc.language: eng
dc.publisher: Institute of Electrical and Electronics Engineers
dc.relation.ispartof: IEEE Robotics and Automation Letters
dc.rights: This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
dc.subject: aerial systems: perception and autonomy
dc.subject: sensor fusion
dc.subject: Visual-Inertial SLAM
dc.title: ESVIO: Event-Based Stereo Visual Inertial Odometry
dc.type: Article
dc.identifier.doi: 10.1109/LRA.2023.3269950
dc.identifier.scopus: eid_2-s2.0-85158876310
dc.identifier.volume: 8
dc.identifier.issue: 6
dc.identifier.spage: 3661
dc.identifier.epage: 3668
dc.identifier.eissn: 2377-3766
dc.identifier.isi: WOS:000981889200019
dc.publisher.place: PISCATAWAY
dc.identifier.issnl: 2377-3766
