Links for fulltext (may require subscription):
- Publisher Website: 10.1109/IROS47612.2022.9981970
- Scopus: eid_2-s2.0-85139074653
- WOS: WOS:000908368202008
Conference Paper: Monocular Event Visual Inertial Odometry based on Event-corner using Sliding Windows Graph-based Optimization
Field | Value |
---|---|
Title | Monocular Event Visual Inertial Odometry based on Event-corner using Sliding Windows Graph-based Optimization |
Authors | Guan, Weipeng; Lu, Peng |
Issue Date | 23-Oct-2022 |
Publisher | IEEE |
Abstract | Event cameras are biologically inspired vision sensors that capture pixel-level illumination changes instead of intensity images at a fixed frame rate. They offer many advantages over standard cameras, such as high dynamic range, high temporal resolution (low latency), and no motion blur. Therefore, developing state estimation algorithms based on event cameras offers exciting opportunities for autonomous systems and robots. In this paper, we propose a monocular visual-inertial odometry for event cameras based on event-corner feature detection and matching with well-designed feature management. More specifically, two different kinds of event representations based on the time surface are designed to realize event-corner feature tracking (for front-end incremental estimation) and matching (for loop closure detection). Furthermore, the proposed event representations are used to set a mask for detecting event-corner features on the raw event stream, which ensures the uniform distribution and spatial consistency of the event-corner features. Finally, a tightly coupled, graph-based optimization framework is designed to obtain highly accurate state estimation by fusing pre-integrated IMU measurements and event-corner observations. We quantitatively validate the performance of our system on event cameras of different resolutions: DAVIS240C (240x180, public dataset, achieving state-of-the-art results), DAVIS346 (346x240, real-world tests), and DVXplorer (640x480, real-world tests). Furthermore, we qualitatively demonstrate the accuracy, robustness, loop closure, and re-localization performance of our framework on several large-scale datasets, as well as an autonomous quadrotor flight using our Event Visual-Inertial Odometry (EVIO) framework. Videos of all the evaluations are presented on the project website. |
Persistent Identifier | http://hdl.handle.net/10722/339568 |
ISI Accession Number ID | WOS:000908368202008 |
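The time-surface representation named in the abstract can be sketched as follows. This is a minimal illustration of the standard exponential-decay time surface, not the paper's implementation (the paper designs two more elaborate variants); the function name, event tuple format, and decay constant `tau` are assumptions for the sketch:

```python
import numpy as np

def time_surface(events, width, height, t_ref, tau=0.03):
    """Exponential-decay time surface at reference time t_ref.

    events: iterable of (x, y, t) tuples (polarity ignored here).
    Each pixel stores the timestamp of its most recent event; the
    surface value decays toward 0 with the time elapsed since then.
    """
    last_t = np.full((height, width), -np.inf)  # no event yet -> value 0
    for x, y, t in events:
        if t <= t_ref:
            last_t[y, x] = max(last_t[y, x], t)
    # TS(x, y) = exp(-(t_ref - t_last(x, y)) / tau), in [0, 1]
    return np.exp(-(t_ref - last_t) / tau)
```

Recent events map to values near 1 and stale pixels fade to 0, which is why such surfaces are convenient for corner detection and patch-based feature tracking on event streams.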
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Guan, Weipeng | - |
dc.contributor.author | Lu, Peng | - |
dc.date.accessioned | 2024-03-11T10:37:42Z | - |
dc.date.available | 2024-03-11T10:37:42Z | - |
dc.date.issued | 2022-10-23 | - |
dc.identifier.uri | http://hdl.handle.net/10722/339568 | - |
dc.description.abstract | Event cameras are biologically inspired vision sensors that capture pixel-level illumination changes instead of intensity images at a fixed frame rate. They offer many advantages over standard cameras, such as high dynamic range, high temporal resolution (low latency), and no motion blur. Therefore, developing state estimation algorithms based on event cameras offers exciting opportunities for autonomous systems and robots. In this paper, we propose a monocular visual-inertial odometry for event cameras based on event-corner feature detection and matching with well-designed feature management. More specifically, two different kinds of event representations based on the time surface are designed to realize event-corner feature tracking (for front-end incremental estimation) and matching (for loop closure detection). Furthermore, the proposed event representations are used to set a mask for detecting event-corner features on the raw event stream, which ensures the uniform distribution and spatial consistency of the event-corner features. Finally, a tightly coupled, graph-based optimization framework is designed to obtain highly accurate state estimation by fusing pre-integrated IMU measurements and event-corner observations. We quantitatively validate the performance of our system on event cameras of different resolutions: DAVIS240C (240x180, public dataset, achieving state-of-the-art results), DAVIS346 (346x240, real-world tests), and DVXplorer (640x480, real-world tests). Furthermore, we qualitatively demonstrate the accuracy, robustness, loop closure, and re-localization performance of our framework on several large-scale datasets, as well as an autonomous quadrotor flight using our Event Visual-Inertial Odometry (EVIO) framework. Videos of all the evaluations are presented on the project website. | - |
dc.language | eng | - |
dc.publisher | IEEE | - |
dc.relation.ispartof | 2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 23/10/2022-27/10/2022, Kyoto, Japan | - |
dc.title | Monocular Event Visual Inertial Odometry based on Event-corner using Sliding Windows Graph-based Optimization | - |
dc.type | Conference_Paper | - |
dc.identifier.doi | 10.1109/IROS47612.2022.9981970 | - |
dc.identifier.scopus | eid_2-s2.0-85139074653 | - |
dc.identifier.volume | 2022-October | - |
dc.identifier.spage | 2438 | - |
dc.identifier.epage | 2445 | - |
dc.identifier.isi | WOS:000908368202008 | - |