Links for fulltext
(May Require Subscription)
- Publisher Website: 10.1109/LRA.2023.3269950
- Scopus: eid_2-s2.0-85158876310
- WOS: WOS:000981889200019
Article: ESVIO: Event-Based Stereo Visual Inertial Odometry
Title | ESVIO: Event-Based Stereo Visual Inertial Odometry |
---|---|
Authors | Chen, PY; Guan, WP; Lu, P |
Keywords | aerial systems: perception and autonomy; sensor fusion; Visual-Inertial SLAM |
Issue Date | 1-Jun-2023 |
Publisher | Institute of Electrical and Electronics Engineers |
Citation | IEEE Robotics and Automation Letters, 2023, v. 8, n. 6, p. 3661-3668 How to Cite? |
Abstract | Event cameras that asynchronously output low-latency event streams provide great opportunities for state estimation under challenging situations. Although event-based visual odometry has been extensively studied in recent years, most existing work is monocular, while little research addresses stereo event vision. In this letter, we present ESVIO, the first event-based stereo visual-inertial odometry, which leverages the complementary advantages of event streams, standard images, and inertial measurements. Our proposed pipeline includes ESIO (purely event-based) and ESVIO (image-aided), which achieve spatial and temporal associations between consecutive stereo event streams. A well-designed back-end tightly fuses the multi-sensor measurements to obtain robust state estimation. We validate that both ESIO and ESVIO outperform other image-based and event-based baseline methods on public and self-collected datasets. Furthermore, we use our pipeline to perform onboard quadrotor flights in low-light environments. Autonomous driving data sequences and real-world large-scale experiments are also conducted to demonstrate long-term effectiveness. We highlight that this work is a real-time, accurate system aimed at robust state estimation in challenging environments. |
Persistent Identifier | http://hdl.handle.net/10722/337631 |
ISSN | 2377-3766 (2023 Impact Factor: 4.6; 2023 SCImago Journal Rankings: 2.119) |
ISI Accession Number ID | WOS:000981889200019 |
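The abstract describes associating features across consecutive stereo event streams. A common event representation used for such association in event-based visual odometry is the exponentially decayed time surface, in which each pixel encodes how recently an event fired there. The sketch below is a generic illustration of that technique under assumed inputs (tuples of x, y, timestamp, polarity); it is not taken from the ESVIO paper and may differ from the authors' actual front-end.

```python
import numpy as np

def time_surface(events, width, height, t_ref, tau=0.03):
    """Build an exponentially decayed time surface from an event stream.

    Each pixel stores exp((t_last - t_ref) / tau), where t_last is the
    timestamp of the most recent event at that pixel up to t_ref, so
    fresh events appear bright and stale ones fade toward zero.

    events: iterable of (x, y, t, polarity) tuples (t in seconds)
    t_ref:  reference time of the slice (seconds)
    tau:    decay constant (seconds)
    """
    last_t = np.full((height, width), -np.inf)
    for x, y, t, _pol in events:
        # Keep only the latest event per pixel up to the reference time.
        if t <= t_ref and t > last_t[y, x]:
            last_t[y, x] = t
    # exp(-inf) evaluates to 0, so untouched pixels are naturally dark.
    return np.exp((last_t - t_ref) / tau)

# Hypothetical usage: a few synthetic events on a 5x5 sensor.
events = [(1, 2, 0.010, 1), (1, 2, 0.020, 1), (3, 4, 0.015, -1)]
surface = time_surface(events, width=5, height=5, t_ref=0.020)
```

Standard image-like trackers (e.g. corner detection and optical flow) can then run on such surfaces from the left and right event cameras to establish the stereo and temporal correspondences the abstract refers to.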
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Chen, PY | - |
dc.contributor.author | Guan, WP | - |
dc.contributor.author | Lu, P | - |
dc.date.accessioned | 2024-03-11T10:22:40Z | - |
dc.date.available | 2024-03-11T10:22:40Z | - |
dc.date.issued | 2023-06-01 | - |
dc.identifier.citation | IEEE Robotics and Automation Letters, 2023, v. 8, n. 6, p. 3661-3668 | - |
dc.identifier.issn | 2377-3766 | - |
dc.identifier.uri | http://hdl.handle.net/10722/337631 | - |
dc.description.abstract | <p>Event cameras that asynchronously output low-latency event streams provide great opportunities for state estimation under challenging situations. Although event-based visual odometry has been extensively studied in recent years, most existing work is monocular, while little research addresses stereo event vision. In this letter, we present ESVIO, the first event-based stereo visual-inertial odometry, which leverages the complementary advantages of event streams, standard images, and inertial measurements. Our proposed pipeline includes ESIO (purely event-based) and ESVIO (image-aided), which achieve spatial and temporal associations between consecutive stereo event streams. A well-designed back-end tightly fuses the multi-sensor measurements to obtain robust state estimation. We validate that both ESIO and ESVIO outperform other image-based and event-based baseline methods on public and self-collected datasets. Furthermore, we use our pipeline to perform onboard quadrotor flights in low-light environments. Autonomous driving data sequences and real-world large-scale experiments are also conducted to demonstrate long-term effectiveness. We highlight that this work is a real-time, accurate system aimed at robust state estimation in challenging environments.</p> | - |
dc.language | eng | - |
dc.publisher | Institute of Electrical and Electronics Engineers | - |
dc.relation.ispartof | IEEE Robotics and Automation Letters | - |
dc.rights | This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License. | - |
dc.subject | aerial systems: perception and autonomy | - |
dc.subject | sensor fusion | - |
dc.subject | Visual-Inertial SLAM | - |
dc.title | ESVIO: Event-Based Stereo Visual Inertial Odometry | - |
dc.type | Article | - |
dc.identifier.doi | 10.1109/LRA.2023.3269950 | - |
dc.identifier.scopus | eid_2-s2.0-85158876310 | - |
dc.identifier.volume | 8 | - |
dc.identifier.issue | 6 | - |
dc.identifier.spage | 3661 | - |
dc.identifier.epage | 3668 | - |
dc.identifier.eissn | 2377-3766 | - |
dc.identifier.isi | WOS:000981889200019 | - |
dc.publisher.place | PISCATAWAY | - |
dc.identifier.issnl | 2377-3766 | - |